Mar 19 16:39:46 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 19 16:39:46 crc restorecon[4680]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 16:39:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 19 16:39:47 crc restorecon[4680]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc 
restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 16:39:47 crc 
restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 
16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 16:39:47 crc 
restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 16:39:47 crc 
restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 
16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 16:39:47 crc 
restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc 
restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 16:39:47 crc restorecon[4680]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 
crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc 
restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc 
restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc 
restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 16:39:47 crc 
restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 16:39:47 crc restorecon[4680]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 16:39:47 crc restorecon[4680]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 16:39:47 crc restorecon[4680]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 19 16:39:48 crc kubenswrapper[4918]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 16:39:48 crc kubenswrapper[4918]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 19 16:39:48 crc kubenswrapper[4918]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 16:39:48 crc kubenswrapper[4918]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 19 16:39:48 crc kubenswrapper[4918]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 19 16:39:48 crc kubenswrapper[4918]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.300447 4918 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306369 4918 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306391 4918 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306399 4918 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306404 4918 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306409 4918 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306414 4918 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306420 4918 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306425 4918 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306430 4918 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 
16:39:48.306434 4918 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306440 4918 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306444 4918 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306449 4918 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306454 4918 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306459 4918 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306464 4918 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306477 4918 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306482 4918 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306487 4918 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306492 4918 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306497 4918 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306502 4918 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306507 4918 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306511 4918 feature_gate.go:330] unrecognized feature gate: Example Mar 19 16:39:48 crc 
kubenswrapper[4918]: W0319 16:39:48.306539 4918 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306545 4918 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306550 4918 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306555 4918 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306559 4918 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306564 4918 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306569 4918 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306573 4918 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306578 4918 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306583 4918 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306588 4918 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306593 4918 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306597 4918 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306602 4918 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306607 4918 
feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306611 4918 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306616 4918 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306621 4918 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306636 4918 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306642 4918 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306646 4918 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306651 4918 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306656 4918 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306660 4918 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306665 4918 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306670 4918 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306674 4918 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306679 4918 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306684 4918 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 16:39:48 crc 
kubenswrapper[4918]: W0319 16:39:48.306688 4918 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306693 4918 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306698 4918 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306703 4918 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306708 4918 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306713 4918 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306718 4918 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306725 4918 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306732 4918 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306739 4918 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306745 4918 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306750 4918 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306756 4918 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306762 4918 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306769 4918 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306775 4918 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306781 4918 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.306787 4918 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.306890 4918 flags.go:64] FLAG: --address="0.0.0.0" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.306903 4918 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.306915 4918 flags.go:64] FLAG: --anonymous-auth="true" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.306924 4918 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.306931 4918 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.306937 4918 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.306945 4918 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.306959 4918 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.306966 4918 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.306972 4918 flags.go:64] FLAG: 
--boot-id-file="/proc/sys/kernel/random/boot_id" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.306978 4918 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.306984 4918 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.306990 4918 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.306996 4918 flags.go:64] FLAG: --cgroup-root="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307002 4918 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307009 4918 flags.go:64] FLAG: --client-ca-file="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307015 4918 flags.go:64] FLAG: --cloud-config="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307020 4918 flags.go:64] FLAG: --cloud-provider="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307026 4918 flags.go:64] FLAG: --cluster-dns="[]" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307037 4918 flags.go:64] FLAG: --cluster-domain="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307043 4918 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307048 4918 flags.go:64] FLAG: --config-dir="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307054 4918 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307061 4918 flags.go:64] FLAG: --container-log-max-files="5" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307069 4918 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307075 4918 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307081 4918 
flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307088 4918 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307094 4918 flags.go:64] FLAG: --contention-profiling="false" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307100 4918 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307105 4918 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307111 4918 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307117 4918 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307125 4918 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307134 4918 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307140 4918 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307146 4918 flags.go:64] FLAG: --enable-load-reader="false" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307151 4918 flags.go:64] FLAG: --enable-server="true" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307156 4918 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307168 4918 flags.go:64] FLAG: --event-burst="100" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307174 4918 flags.go:64] FLAG: --event-qps="50" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307180 4918 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307186 4918 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 
16:39:48.307200 4918 flags.go:64] FLAG: --eviction-hard="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307207 4918 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307213 4918 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307219 4918 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307225 4918 flags.go:64] FLAG: --eviction-soft="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307230 4918 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307236 4918 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307242 4918 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307248 4918 flags.go:64] FLAG: --experimental-mounter-path="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307254 4918 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307260 4918 flags.go:64] FLAG: --fail-swap-on="true" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307266 4918 flags.go:64] FLAG: --feature-gates="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307273 4918 flags.go:64] FLAG: --file-check-frequency="20s" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307279 4918 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307285 4918 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307290 4918 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307296 4918 flags.go:64] FLAG: --healthz-port="10248" Mar 19 16:39:48 crc kubenswrapper[4918]: 
I0319 16:39:48.307302 4918 flags.go:64] FLAG: --help="false" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307308 4918 flags.go:64] FLAG: --hostname-override="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307313 4918 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307319 4918 flags.go:64] FLAG: --http-check-frequency="20s" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307325 4918 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307331 4918 flags.go:64] FLAG: --image-credential-provider-config="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307337 4918 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307343 4918 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307349 4918 flags.go:64] FLAG: --image-service-endpoint="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307355 4918 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307360 4918 flags.go:64] FLAG: --kube-api-burst="100" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307366 4918 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307372 4918 flags.go:64] FLAG: --kube-api-qps="50" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307378 4918 flags.go:64] FLAG: --kube-reserved="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307383 4918 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307389 4918 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307395 4918 flags.go:64] FLAG: --kubelet-cgroups="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 
16:39:48.307401 4918 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307406 4918 flags.go:64] FLAG: --lock-file="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307419 4918 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307425 4918 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307431 4918 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307440 4918 flags.go:64] FLAG: --log-json-split-stream="false" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307446 4918 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307452 4918 flags.go:64] FLAG: --log-text-split-stream="false" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307458 4918 flags.go:64] FLAG: --logging-format="text" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307463 4918 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307469 4918 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307474 4918 flags.go:64] FLAG: --manifest-url="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307480 4918 flags.go:64] FLAG: --manifest-url-header="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307487 4918 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307493 4918 flags.go:64] FLAG: --max-open-files="1000000" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307500 4918 flags.go:64] FLAG: --max-pods="110" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307505 4918 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 
16:39:48.307511 4918 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307543 4918 flags.go:64] FLAG: --memory-manager-policy="None" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307549 4918 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307555 4918 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307561 4918 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307566 4918 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307579 4918 flags.go:64] FLAG: --node-status-max-images="50" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307585 4918 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307591 4918 flags.go:64] FLAG: --oom-score-adj="-999" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307598 4918 flags.go:64] FLAG: --pod-cidr="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307603 4918 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307614 4918 flags.go:64] FLAG: --pod-manifest-path="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307619 4918 flags.go:64] FLAG: --pod-max-pids="-1" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307626 4918 flags.go:64] FLAG: --pods-per-core="0" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307632 4918 flags.go:64] FLAG: --port="10250" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307637 4918 flags.go:64] FLAG: 
--protect-kernel-defaults="false" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307643 4918 flags.go:64] FLAG: --provider-id="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307649 4918 flags.go:64] FLAG: --qos-reserved="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307654 4918 flags.go:64] FLAG: --read-only-port="10255" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307660 4918 flags.go:64] FLAG: --register-node="true" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307665 4918 flags.go:64] FLAG: --register-schedulable="true" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307680 4918 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307695 4918 flags.go:64] FLAG: --registry-burst="10" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307700 4918 flags.go:64] FLAG: --registry-qps="5" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307706 4918 flags.go:64] FLAG: --reserved-cpus="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307711 4918 flags.go:64] FLAG: --reserved-memory="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307718 4918 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307724 4918 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307729 4918 flags.go:64] FLAG: --rotate-certificates="false" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307735 4918 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307740 4918 flags.go:64] FLAG: --runonce="false" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307746 4918 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307752 4918 flags.go:64] FLAG: 
--runtime-request-timeout="2m0s" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307757 4918 flags.go:64] FLAG: --seccomp-default="false" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307763 4918 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307768 4918 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307774 4918 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307780 4918 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307786 4918 flags.go:64] FLAG: --storage-driver-password="root" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307792 4918 flags.go:64] FLAG: --storage-driver-secure="false" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307798 4918 flags.go:64] FLAG: --storage-driver-table="stats" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307803 4918 flags.go:64] FLAG: --storage-driver-user="root" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307809 4918 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307814 4918 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307820 4918 flags.go:64] FLAG: --system-cgroups="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307826 4918 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307834 4918 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307840 4918 flags.go:64] FLAG: --tls-cert-file="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307845 4918 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 
16:39:48.307858 4918 flags.go:64] FLAG: --tls-min-version="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307864 4918 flags.go:64] FLAG: --tls-private-key-file="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307869 4918 flags.go:64] FLAG: --topology-manager-policy="none" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307875 4918 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307881 4918 flags.go:64] FLAG: --topology-manager-scope="container" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307888 4918 flags.go:64] FLAG: --v="2" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307896 4918 flags.go:64] FLAG: --version="false" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307903 4918 flags.go:64] FLAG: --vmodule="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307921 4918 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.307928 4918 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308101 4918 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308109 4918 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308114 4918 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308120 4918 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308125 4918 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308129 4918 feature_gate.go:330] unrecognized feature gate: Example Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308134 4918 
feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308139 4918 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308144 4918 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308149 4918 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308156 4918 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308162 4918 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308168 4918 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308173 4918 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308178 4918 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308183 4918 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308188 4918 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308192 4918 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308197 4918 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308202 4918 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 
16:39:48.308207 4918 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308211 4918 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308216 4918 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308221 4918 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308226 4918 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308231 4918 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308235 4918 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308241 4918 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308248 4918 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308255 4918 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308260 4918 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308265 4918 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308279 4918 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308285 4918 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308292 4918 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308299 4918 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308305 4918 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308310 4918 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308316 4918 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308322 4918 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308327 4918 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308332 4918 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308338 4918 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308343 4918 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308348 4918 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308353 4918 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308359 4918 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308364 4918 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308370 4918 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308375 4918 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308381 4918 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308386 4918 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308394 4918 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308399 4918 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308405 4918 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308410 4918 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308415 4918 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308421 4918 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308427 4918 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308432 4918 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308437 4918 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308442 4918 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308447 4918 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308452 4918 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308456 4918 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308461 4918 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308466 4918 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308470 4918 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308476 4918 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308480 4918 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.308487 4918 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.309297 4918 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.321858 4918 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.321927 4918 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322104 4918 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322124 4918 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322134 4918 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322143 4918 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322155 4918 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322165 4918 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322173 4918 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322184 4918 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322192 4918 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322201 4918 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322210 4918 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322221 4918 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322231 4918 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322240 4918 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322249 4918 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322258 4918 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322268 4918 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322279 4918 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322290 4918 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322300 4918 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322309 4918 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322318 4918 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322327 4918 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322335 4918 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322344 4918 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322356 4918 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322364 4918 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322372 4918 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322381 4918 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322389 4918 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322398 4918 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322406 4918 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322414 4918 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322423 4918 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322431 4918 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322439 4918 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322447 4918 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322456 4918 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322464 4918 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322472 4918 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322480 4918 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322489 4918 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322497 4918 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322508 4918 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322516 4918 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322556 4918 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322565 4918 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322574 4918 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322584 4918 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322593 4918 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322603 4918 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322612 4918 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322620 4918 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322629 4918 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322638 4918 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322647 4918 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322655 4918 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322665 4918 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322673 4918 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322681 4918 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322689 4918 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322699 4918 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322708 4918 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322716 4918 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322724 4918 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322732 4918 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322741 4918 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322752 4918 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322762 4918 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322772 4918 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.322781 4918 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.322801 4918 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323081 4918 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323098 4918 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323107 4918 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323119 4918 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323129 4918 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323137 4918 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323146 4918 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323154 4918 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323163 4918 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323174 4918 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323182 4918 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323191 4918 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323199 4918 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323207 4918 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323216 4918 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323224 4918 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323233 4918 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323242 4918 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323250 4918 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323258 4918 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323269 4918 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323281 4918 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323290 4918 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323300 4918 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323310 4918 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323320 4918 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323329 4918 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323338 4918 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323347 4918 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323356 4918 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323368 4918 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323377 4918 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323388 4918 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323398 4918 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323407 4918 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323419 4918 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323430 4918 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323441 4918 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323453 4918 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323464 4918 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323474 4918 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323484 4918 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323494 4918 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323503 4918 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323512 4918 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323550 4918 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323560 4918 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323569 4918 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323577 4918 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323585 4918 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323593 4918 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323602 4918 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323610 4918 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323618 4918 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323627 4918 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323635 4918 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323643 4918 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323651 4918 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323660 4918 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323668 4918 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323676 4918 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323686 4918 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323694 4918 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323703 4918 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323711 4918 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323719 4918 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323728 4918 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323737 4918 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323745 4918 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323754 4918 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.323762 4918 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.323776 4918 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.324453 4918 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 19 16:39:48 crc kubenswrapper[4918]: E0319 16:39:48.335417 4918 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.340783 4918 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.341044 4918 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.343251 4918 server.go:997] "Starting client certificate rotation"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.343320 4918 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.343760 4918 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.373499 4918 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 19 16:39:48 crc kubenswrapper[4918]: E0319 16:39:48.375851 4918 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.376904 4918 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.394694 4918 log.go:25] "Validated CRI v1 runtime API"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.438628 4918 log.go:25] "Validated CRI v1 image API"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.441424 4918 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.447901 4918 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-19-16-35-28-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.448017 4918 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.480976 4918 manager.go:217] Machine: {Timestamp:2026-03-19 16:39:48.478438409 +0000 UTC m=+0.600637717 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:bb6fd883-4ea6-4b3c-be0c-dda5543e1953 BootID:d23eb629-4a93-4855-a806-6c791cece8cb Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:d0:e3:b1 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:d0:e3:b1 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:aa:7c:c7 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:ec:75:81 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:4f:1b:68 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:7f:ac:ef Speed:-1 Mtu:1496} {Name:eth10 MacAddress:8a:99:a9:6c:c8:16 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:da:c0:f1:bf:ec:6f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.481393 4918 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.481765 4918 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.483747 4918 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.484062 4918 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.484117 4918 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.484488 4918 topology_manager.go:138] "Creating topology manager with none policy"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.484509 4918 container_manager_linux.go:303] "Creating device plugin manager"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.485058 4918 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.485093 4918 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.485502 4918 state_mem.go:36] "Initialized new in-memory state store"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.485666 4918 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.489786 4918 kubelet.go:418] "Attempting to sync node with API server"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.489821 4918 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.489867 4918 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.489890 4918 kubelet.go:324] "Adding apiserver pod source"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.490004 4918 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.496560 4918 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused
Mar 19 16:39:48 crc kubenswrapper[4918]: E0319 16:39:48.496718 4918 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError"
Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.496754 4918 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused
Mar 19 16:39:48 crc kubenswrapper[4918]: E0319 16:39:48.496957 4918 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.497821 4918 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.499006 4918 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.501857 4918 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.503658 4918 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.503700 4918 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.503715 4918 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.503728 4918 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.503749 4918 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.503762 4918 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.503776 4918 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.503799 4918 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.503815 4918 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.503829 4918 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.503847 4918 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.503861 4918 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.503904 4918 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.504517 4918 server.go:1280] "Started kubelet" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.504762 4918 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.504851 4918 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.505098 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.505893 4918 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 19 16:39:48 crc systemd[1]: Started Kubernetes Kubelet. Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.508392 4918 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.508797 4918 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 19 16:39:48 crc kubenswrapper[4918]: E0319 16:39:48.509423 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.509462 4918 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.509500 4918 server.go:460] "Adding debug handlers to kubelet server" Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.510088 4918 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 
38.102.83.142:6443: connect: connection refused Mar 19 16:39:48 crc kubenswrapper[4918]: E0319 16:39:48.510183 4918 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.509445 4918 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.510333 4918 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.511554 4918 factory.go:153] Registering CRI-O factory Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.511598 4918 factory.go:221] Registration of the crio container factory successfully Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.511706 4918 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.511723 4918 factory.go:55] Registering systemd factory Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.511737 4918 factory.go:221] Registration of the systemd container factory successfully Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.511767 4918 factory.go:103] Registering Raw factory Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.511791 4918 manager.go:1196] Started watching for new ooms in manager Mar 19 16:39:48 crc kubenswrapper[4918]: E0319 16:39:48.512413 4918 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" interval="200ms" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.512922 4918 manager.go:319] Starting recovery of all containers Mar 19 16:39:48 crc kubenswrapper[4918]: E0319 16:39:48.513677 4918 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.142:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e4b90cc3d203e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.50448595 +0000 UTC m=+0.626685228,LastTimestamp:2026-03-19 16:39:48.50448595 +0000 UTC m=+0.626685228,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.530787 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.530865 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.530889 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.530924 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.530946 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.530965 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.530985 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531007 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531030 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531048 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531067 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531084 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531104 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531125 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531145 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531164 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531182 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531203 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531225 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531244 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531264 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531281 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531300 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531426 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531446 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531465 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531514 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" 
seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531561 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531579 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531598 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531621 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531641 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531661 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531680 4918 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531728 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531748 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531769 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531788 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531806 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531827 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531846 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531874 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531894 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531913 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531933 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531952 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531973 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.531994 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532015 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532040 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532059 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532081 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532108 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532129 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532149 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532169 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532188 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532207 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532225 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532244 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532263 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532282 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532305 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532324 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532343 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532364 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532384 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532405 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532423 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532441 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532461 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532479 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532497 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532517 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532562 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532583 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532602 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532622 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532643 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532662 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532681 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532702 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532720 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532740 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532761 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532780 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532800 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532818 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" 
seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532837 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532856 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532878 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532898 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532917 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.532936 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 
16:39:48.532986 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.533006 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.533024 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.533043 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.533062 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.533080 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.533100 4918 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.533119 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.533141 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.533175 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.533204 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.533227 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.533317 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.533339 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.533362 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.533384 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.533405 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.533428 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.533449 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.533468 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.533489 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.533508 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.533608 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.533630 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.533649 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.533672 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.535864 4918 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.535950 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.535988 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.536022 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.536054 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.536086 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.536118 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.536144 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.536233 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.536291 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.536322 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.536351 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.536385 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.536418 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.536446 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.536472 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.536499 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" 
volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.536558 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.536588 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.536616 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.536642 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.536674 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.536702 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.536730 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.536757 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.536784 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.536813 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.536853 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.536883 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.536911 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537089 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537131 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537162 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537194 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537224 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537252 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537283 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537311 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537345 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537374 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537406 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537434 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537461 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537490 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537520 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537591 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537622 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537651 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537679 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537709 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537737 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537767 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537793 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537822 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537853 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537884 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537912 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537940 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537967 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.537997 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538027 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538053 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538083 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538112 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538139 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" 
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538166 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538194 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538223 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538254 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538282 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538314 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538341 4918 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538369 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538396 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538424 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538454 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538483 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538513 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538585 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538617 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538645 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538677 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538705 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538739 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538768 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538799 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538830 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538856 4918 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538881 4918 reconstruct.go:97] "Volume reconstruction finished" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.538898 4918 reconciler.go:26] "Reconciler: start to sync state" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.559518 4918 manager.go:324] Recovery completed Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.575029 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.578283 4918 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.578351 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.578364 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.579442 4918 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.579675 4918 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.579815 4918 state_mem.go:36] "Initialized new in-memory state store" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.581926 4918 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.585100 4918 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.585150 4918 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.585180 4918 kubelet.go:2335] "Starting kubelet main sync loop" Mar 19 16:39:48 crc kubenswrapper[4918]: E0319 16:39:48.585234 4918 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 19 16:39:48 crc kubenswrapper[4918]: W0319 16:39:48.587676 4918 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused Mar 19 16:39:48 crc kubenswrapper[4918]: E0319 16:39:48.587759 4918 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.599051 4918 policy_none.go:49] "None policy: Start" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.600441 4918 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.600663 4918 state_mem.go:35] "Initializing new in-memory state store" Mar 19 16:39:48 crc kubenswrapper[4918]: E0319 16:39:48.610476 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.671007 4918 manager.go:334] "Starting Device Plugin manager" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.671206 4918 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.671224 4918 server.go:79] "Starting device plugin registration server" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.671802 4918 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.671828 4918 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.672014 4918 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.672132 4918 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.672143 4918 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 19 16:39:48 crc kubenswrapper[4918]: E0319 16:39:48.684175 4918 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.685358 4918 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.685499 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.687054 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.687112 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.687132 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.687349 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.687681 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.687755 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.688916 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.688946 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.688967 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.689230 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.689252 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.689284 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.689493 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.689690 4918 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.689748 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.692152 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.692211 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.692232 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.692634 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.693129 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.693213 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.695947 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.696037 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.696069 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.696085 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.696135 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.696163 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.696264 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.696407 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.696367 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.696461 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:39:48 crc kubenswrapper[4918]: 
I0319 16:39:48.696577 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.696653 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.697681 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.697737 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.697764 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.698051 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.698113 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.698846 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.698886 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.698900 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.699355 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.699389 4918 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.699399 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:39:48 crc kubenswrapper[4918]: E0319 16:39:48.714355 4918 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" interval="400ms" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.740490 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.740729 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.740874 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.740908 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.740926 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.740945 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.740964 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.740980 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.740997 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.741015 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.741035 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.741051 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.741089 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.741109 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:39:48 crc 
kubenswrapper[4918]: I0319 16:39:48.741127 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.772878 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.774088 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.774127 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.774140 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.774177 4918 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: E0319 16:39:48.774609 4918 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.142:6443: connect: connection refused" node="crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842188 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842250 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842280 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842318 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842341 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842364 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842403 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842424 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842446 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842472 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842583 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842476 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842489 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842640 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842616 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842680 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842678 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842708 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842724 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842745 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842783 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842786 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842783 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842852 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842864 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842905 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842958 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842859 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842919 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.842919 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.975889 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.977563 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.977642 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.977663 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 16:39:48 crc kubenswrapper[4918]: I0319 16:39:48.977796 4918 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 19 16:39:48 crc kubenswrapper[4918]: E0319 16:39:48.978585 4918 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.142:6443: connect: connection refused" node="crc"
Mar 19 16:39:49 crc kubenswrapper[4918]: I0319 16:39:49.023889 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 19 16:39:49 crc kubenswrapper[4918]: I0319 16:39:49.033300 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 16:39:49 crc kubenswrapper[4918]: I0319 16:39:49.048957 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 19 16:39:49 crc kubenswrapper[4918]: I0319 16:39:49.066273 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 16:39:49 crc kubenswrapper[4918]: I0319 16:39:49.071283 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 19 16:39:49 crc kubenswrapper[4918]: W0319 16:39:49.074502 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-b1f0b4d298823680f57376383a633a276460d3c1f191a47b2d98da88ac54782c WatchSource:0}: Error finding container b1f0b4d298823680f57376383a633a276460d3c1f191a47b2d98da88ac54782c: Status 404 returned error can't find the container with id b1f0b4d298823680f57376383a633a276460d3c1f191a47b2d98da88ac54782c
Mar 19 16:39:49 crc kubenswrapper[4918]: W0319 16:39:49.077518 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-51b238ca8e5c191c34b2032b07a3fec47045bb58b948cc24b65dacc605ddf4a7 WatchSource:0}: Error finding container 51b238ca8e5c191c34b2032b07a3fec47045bb58b948cc24b65dacc605ddf4a7: Status 404 returned error can't find the container with id 51b238ca8e5c191c34b2032b07a3fec47045bb58b948cc24b65dacc605ddf4a7
Mar 19 16:39:49 crc kubenswrapper[4918]: W0319 16:39:49.084136 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-05281d724beace0c47924c606ff4b7c71b285ee3d753e79cb045d3fa5592109d WatchSource:0}: Error finding container 05281d724beace0c47924c606ff4b7c71b285ee3d753e79cb045d3fa5592109d: Status 404 returned error can't find the container with id 05281d724beace0c47924c606ff4b7c71b285ee3d753e79cb045d3fa5592109d
Mar 19 16:39:49 crc kubenswrapper[4918]: E0319 16:39:49.115755 4918 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" interval="800ms"
Mar 19 16:39:49 crc kubenswrapper[4918]: I0319 16:39:49.379718 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 16:39:49 crc kubenswrapper[4918]: I0319 16:39:49.381499 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 16:39:49 crc kubenswrapper[4918]: I0319 16:39:49.381574 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 16:39:49 crc kubenswrapper[4918]: I0319 16:39:49.381600 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 16:39:49 crc kubenswrapper[4918]: I0319 16:39:49.381634 4918 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 19 16:39:49 crc kubenswrapper[4918]: E0319 16:39:49.382213 4918 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.142:6443: connect: connection refused" node="crc"
Mar 19 16:39:49 crc kubenswrapper[4918]: I0319 16:39:49.506272 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused
Mar 19 16:39:49 crc kubenswrapper[4918]: W0319 16:39:49.531275 4918 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused
Mar 19 16:39:49 crc kubenswrapper[4918]: E0319 16:39:49.531394 4918 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError"
Mar 19 16:39:49 crc kubenswrapper[4918]: W0319 16:39:49.562849 4918 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused
Mar 19 16:39:49 crc kubenswrapper[4918]: E0319 16:39:49.562982 4918 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError"
Mar 19 16:39:49 crc kubenswrapper[4918]: I0319 16:39:49.591310 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"05281d724beace0c47924c606ff4b7c71b285ee3d753e79cb045d3fa5592109d"}
Mar 19 16:39:49 crc kubenswrapper[4918]: I0319 16:39:49.592588 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"51b238ca8e5c191c34b2032b07a3fec47045bb58b948cc24b65dacc605ddf4a7"}
Mar 19 16:39:49 crc kubenswrapper[4918]: I0319 16:39:49.593571 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"b1f0b4d298823680f57376383a633a276460d3c1f191a47b2d98da88ac54782c"}
Mar 19 16:39:49 crc kubenswrapper[4918]: I0319 16:39:49.595079 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3fc3fdd3a22791fcdee81b39c66c0191103d28b285d6e77bdd9171ee6950efc6"}
Mar 19 16:39:49 crc kubenswrapper[4918]: I0319 16:39:49.596010 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e2377c4f7c3f343c5441cf73911e3afd24f31afde94848fdc2f56c7c020beae0"}
Mar 19 16:39:49 crc kubenswrapper[4918]: W0319 16:39:49.664831 4918 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused
Mar 19 16:39:49 crc kubenswrapper[4918]: E0319 16:39:49.664905 4918 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError"
Mar 19 16:39:49 crc kubenswrapper[4918]: W0319 16:39:49.714657 4918 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused
Mar 19 16:39:49 crc kubenswrapper[4918]: E0319 16:39:49.714765 4918 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError"
Mar 19 16:39:49 crc kubenswrapper[4918]: E0319 16:39:49.917485 4918 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" interval="1.6s"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.182941 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.185057 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.185107 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.185123 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.185161 4918 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 19 16:39:50 crc kubenswrapper[4918]: E0319 16:39:50.185765 4918 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.142:6443: connect: connection refused" node="crc"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.444255 4918 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 19 16:39:50 crc kubenswrapper[4918]: E0319 16:39:50.445276 4918 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.506477 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.603410 4918 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="e08c81dafb4f6fd484adfecb797de6fea29a9221096367160131599d49ef645d" exitCode=0
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.603555 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"e08c81dafb4f6fd484adfecb797de6fea29a9221096367160131599d49ef645d"}
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.603699 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.605256 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.605322 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.605348 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.609069 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"413c8f16e756c0b58fb92c037dc4fff63ab4a5a0ecaff2196b154a805ca845e5"}
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.609129 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"105ce792c8b8b438473cea6cfc9963dd4146f531d2859ab5450be268a4579861"}
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.609145 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c6437dfc5300d53cfb94e20836c084662dd1c61262baadd3d3b68a579df4716d"}
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.609161 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2abd1b66985995e1a44fc90e80d1a0acca10e7d483e7aca71531747026fb6a2e"}
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.609229 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.610900 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.610975 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.610995 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.612281 4918 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="1e2f2a64e0a28939dcbaf1f7b37560e2b063c803597656de25ba5298c6ecdcf8" exitCode=0
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.612398 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"1e2f2a64e0a28939dcbaf1f7b37560e2b063c803597656de25ba5298c6ecdcf8"}
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.612977 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.616873 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.616914 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.616933 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.618726 4918 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ae959ea338a301713af4344c7eef48f5d2562204a83bce2ffee8e51c6bcda4cd" exitCode=0
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.618957 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.618957 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ae959ea338a301713af4344c7eef48f5d2562204a83bce2ffee8e51c6bcda4cd"}
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.620619 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.620785 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.620912 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.622501 4918 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="5d6a05ce01d809e56555524e2fb826f60775b74420f54baece0f3fdb247d2c0f" exitCode=0
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.622594 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"5d6a05ce01d809e56555524e2fb826f60775b74420f54baece0f3fdb247d2c0f"}
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.622727 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.624013 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.624074 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.624099 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.625320 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.627105 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.627143 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 16:39:50 crc kubenswrapper[4918]: I0319 16:39:50.627164 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.232500 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.506286 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused
Mar 19 16:39:51 crc kubenswrapper[4918]: E0319 16:39:51.518496 4918 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" interval="3.2s"
Mar 19 16:39:51 crc kubenswrapper[4918]: W0319 16:39:51.535005 4918 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused
Mar 19 16:39:51 crc kubenswrapper[4918]: E0319 16:39:51.535109 4918 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError"
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.629199 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"d6d6ac1d34cd8b3ba7714409500ddbe798ade472614a192ad83b3f69208e33bd"}
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.629222 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.637181 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.637239 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.637251 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.641422 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"21e6deba6fa7f7eb1a99f282b89cd27741749d1f873466b70b61efcd3c8de134"}
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.641465 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6870b34d41485a7e986f1d814b1fdd0665ef579127add183891b9b06b2b43de7"}
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.641477 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0f721fa0390f3d5e43091976e13183fd17c6a14552b73172dfa4a3aef6aec8c7"}
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.641599 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.642669 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.642722 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.642734 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.645433 4918 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="5a7d5b234378aa9673c950442d166da44a2983d419980887f3792719cf2e1074" exitCode=0
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.645497 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"5a7d5b234378aa9673c950442d166da44a2983d419980887f3792719cf2e1074"}
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.645636 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.646565 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.646595 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.646606 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.655375 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f77c346ac954c08fd292e1815f840e0630ee7300fac9018fea8edaed5958ba82"}
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.655386 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.655428 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dc8bbba866da83ed86b3064f76103ffe9e2795e86a4bfbed3d9fbde9f644418d"}
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.655445 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"68930b0ae7ef86c98b102ad42bd61b117d8d7ca5126b07f860d59bccf76959a0"}
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.655458 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ae177b68caa2c1b58058321f0c704e0fb9b1effa2a9a21e9e95eadbbe7f94c94"}
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.656800 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.656854 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.656867 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.786147 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.787883 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.787929 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.787960 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.787991 4918 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 16:39:51 crc kubenswrapper[4918]: E0319 16:39:51.788716 4918 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.142:6443: connect: connection refused" node="crc" Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.859826 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 16:39:51 crc kubenswrapper[4918]: I0319 16:39:51.867133 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 16:39:52 crc kubenswrapper[4918]: I0319 16:39:52.662761 4918 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0792b3273db8c5686ccf628459e25c13c1e31cb2da9e258c33697c1491ad02f6" exitCode=0 Mar 19 16:39:52 crc kubenswrapper[4918]: I0319 16:39:52.662848 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0792b3273db8c5686ccf628459e25c13c1e31cb2da9e258c33697c1491ad02f6"} Mar 19 16:39:52 crc kubenswrapper[4918]: I0319 16:39:52.663005 4918 kubelet_node_status.go:401] "Setting node annotation 
to enable volume controller attach/detach" Mar 19 16:39:52 crc kubenswrapper[4918]: I0319 16:39:52.664657 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:52 crc kubenswrapper[4918]: I0319 16:39:52.664720 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:39:52 crc kubenswrapper[4918]: I0319 16:39:52.664740 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:39:52 crc kubenswrapper[4918]: I0319 16:39:52.667583 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dca2ab07b4ed822dc6ce0e4bfeb8b38dfd90c7c6f003a31d77a4d21651812792"} Mar 19 16:39:52 crc kubenswrapper[4918]: I0319 16:39:52.667663 4918 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 16:39:52 crc kubenswrapper[4918]: I0319 16:39:52.667735 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:39:52 crc kubenswrapper[4918]: I0319 16:39:52.667754 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:39:52 crc kubenswrapper[4918]: I0319 16:39:52.667803 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:39:52 crc kubenswrapper[4918]: I0319 16:39:52.667741 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:39:52 crc kubenswrapper[4918]: I0319 16:39:52.670221 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:52 crc kubenswrapper[4918]: I0319 16:39:52.670288 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 19 16:39:52 crc kubenswrapper[4918]: I0319 16:39:52.670320 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:39:52 crc kubenswrapper[4918]: I0319 16:39:52.670334 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:52 crc kubenswrapper[4918]: I0319 16:39:52.670376 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:39:52 crc kubenswrapper[4918]: I0319 16:39:52.670395 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:39:52 crc kubenswrapper[4918]: I0319 16:39:52.670443 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:52 crc kubenswrapper[4918]: I0319 16:39:52.670516 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:39:52 crc kubenswrapper[4918]: I0319 16:39:52.670597 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:39:52 crc kubenswrapper[4918]: I0319 16:39:52.671878 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:52 crc kubenswrapper[4918]: I0319 16:39:52.671934 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:39:52 crc kubenswrapper[4918]: I0319 16:39:52.671953 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:39:53 crc kubenswrapper[4918]: I0319 16:39:53.678220 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f2c64c2b7e8c260da6f446062fb517a524572b4408676602c04fcb015c30f6df"} Mar 19 16:39:53 crc kubenswrapper[4918]: I0319 16:39:53.678285 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d008e44b5fabd637449b1ef1d21a95fe8e444b51c7f0726c57544d1c8ff77a1b"} Mar 19 16:39:53 crc kubenswrapper[4918]: I0319 16:39:53.678303 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"72f3b38cb252e90a8d4f755c0f4765fd13cfb2c3fc6522543ca5da02b0bd13e3"} Mar 19 16:39:53 crc kubenswrapper[4918]: I0319 16:39:53.678365 4918 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 16:39:53 crc kubenswrapper[4918]: I0319 16:39:53.678372 4918 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 16:39:53 crc kubenswrapper[4918]: I0319 16:39:53.678429 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:39:53 crc kubenswrapper[4918]: I0319 16:39:53.678451 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:39:53 crc kubenswrapper[4918]: I0319 16:39:53.679882 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:53 crc kubenswrapper[4918]: I0319 16:39:53.679936 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:39:53 crc kubenswrapper[4918]: I0319 16:39:53.679953 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:39:53 crc kubenswrapper[4918]: I0319 16:39:53.680585 4918 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:53 crc kubenswrapper[4918]: I0319 16:39:53.680630 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:39:53 crc kubenswrapper[4918]: I0319 16:39:53.680649 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:39:54 crc kubenswrapper[4918]: I0319 16:39:54.233220 4918 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 16:39:54 crc kubenswrapper[4918]: I0319 16:39:54.233318 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 16:39:54 crc kubenswrapper[4918]: I0319 16:39:54.688178 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f5e7ea3b6d2de3aab9816ebf99aaa95c959f2a9f6fc89e09b586376e4faf9190"} Mar 19 16:39:54 crc kubenswrapper[4918]: I0319 16:39:54.688290 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5050af2e89e6a209dbf593766d887d8c58a5d4f5c5346a0cbf8663bee5741fae"} Mar 19 16:39:54 crc kubenswrapper[4918]: I0319 16:39:54.688256 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:39:54 crc kubenswrapper[4918]: I0319 
16:39:54.689788 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:54 crc kubenswrapper[4918]: I0319 16:39:54.689844 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:39:54 crc kubenswrapper[4918]: I0319 16:39:54.689863 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:39:54 crc kubenswrapper[4918]: I0319 16:39:54.719886 4918 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 16:39:54 crc kubenswrapper[4918]: I0319 16:39:54.747157 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:39:54 crc kubenswrapper[4918]: I0319 16:39:54.747307 4918 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 16:39:54 crc kubenswrapper[4918]: I0319 16:39:54.747356 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:39:54 crc kubenswrapper[4918]: I0319 16:39:54.748649 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:54 crc kubenswrapper[4918]: I0319 16:39:54.748681 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:39:54 crc kubenswrapper[4918]: I0319 16:39:54.748689 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:39:54 crc kubenswrapper[4918]: I0319 16:39:54.988950 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:39:54 crc kubenswrapper[4918]: I0319 16:39:54.990939 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:54 
crc kubenswrapper[4918]: I0319 16:39:54.990989 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:39:54 crc kubenswrapper[4918]: I0319 16:39:54.991002 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:39:54 crc kubenswrapper[4918]: I0319 16:39:54.991034 4918 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 16:39:55 crc kubenswrapper[4918]: I0319 16:39:55.690867 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:39:55 crc kubenswrapper[4918]: I0319 16:39:55.691763 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:55 crc kubenswrapper[4918]: I0319 16:39:55.691792 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:39:55 crc kubenswrapper[4918]: I0319 16:39:55.691802 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:39:56 crc kubenswrapper[4918]: I0319 16:39:56.259778 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 16:39:56 crc kubenswrapper[4918]: I0319 16:39:56.260021 4918 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 16:39:56 crc kubenswrapper[4918]: I0319 16:39:56.260140 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:39:56 crc kubenswrapper[4918]: I0319 16:39:56.262071 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:56 crc kubenswrapper[4918]: I0319 16:39:56.262153 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 
19 16:39:56 crc kubenswrapper[4918]: I0319 16:39:56.262175 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:39:56 crc kubenswrapper[4918]: I0319 16:39:56.897630 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:39:56 crc kubenswrapper[4918]: I0319 16:39:56.897887 4918 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 16:39:56 crc kubenswrapper[4918]: I0319 16:39:56.897940 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:39:56 crc kubenswrapper[4918]: I0319 16:39:56.899535 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:56 crc kubenswrapper[4918]: I0319 16:39:56.899585 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:39:56 crc kubenswrapper[4918]: I0319 16:39:56.899598 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:39:57 crc kubenswrapper[4918]: I0319 16:39:57.382618 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:39:57 crc kubenswrapper[4918]: I0319 16:39:57.697423 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:39:57 crc kubenswrapper[4918]: I0319 16:39:57.699202 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:57 crc kubenswrapper[4918]: I0319 16:39:57.699424 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:39:57 crc kubenswrapper[4918]: I0319 16:39:57.699615 4918 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 19 16:39:58 crc kubenswrapper[4918]: I0319 16:39:58.080657 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 16:39:58 crc kubenswrapper[4918]: I0319 16:39:58.080917 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:39:58 crc kubenswrapper[4918]: I0319 16:39:58.082283 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:58 crc kubenswrapper[4918]: I0319 16:39:58.082328 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:39:58 crc kubenswrapper[4918]: I0319 16:39:58.082344 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:39:58 crc kubenswrapper[4918]: E0319 16:39:58.684305 4918 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 16:39:58 crc kubenswrapper[4918]: I0319 16:39:58.944765 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 16:39:58 crc kubenswrapper[4918]: I0319 16:39:58.945090 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:39:58 crc kubenswrapper[4918]: I0319 16:39:58.946952 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:58 crc kubenswrapper[4918]: I0319 16:39:58.947015 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:39:58 crc kubenswrapper[4918]: I0319 16:39:58.947038 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 
16:39:59 crc kubenswrapper[4918]: I0319 16:39:59.021638 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 19 16:39:59 crc kubenswrapper[4918]: I0319 16:39:59.021928 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:39:59 crc kubenswrapper[4918]: I0319 16:39:59.023369 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:59 crc kubenswrapper[4918]: I0319 16:39:59.023399 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:39:59 crc kubenswrapper[4918]: I0319 16:39:59.023409 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:39:59 crc kubenswrapper[4918]: I0319 16:39:59.782927 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 19 16:39:59 crc kubenswrapper[4918]: I0319 16:39:59.783386 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:39:59 crc kubenswrapper[4918]: I0319 16:39:59.785389 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:39:59 crc kubenswrapper[4918]: I0319 16:39:59.785451 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:39:59 crc kubenswrapper[4918]: I0319 16:39:59.785464 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:02 crc kubenswrapper[4918]: I0319 16:40:02.508085 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 19 16:40:02 crc 
kubenswrapper[4918]: W0319 16:40:02.530885 4918 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 19 16:40:02 crc kubenswrapper[4918]: I0319 16:40:02.530974 4918 trace.go:236] Trace[1378377848]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Mar-2026 16:39:52.529) (total time: 10001ms): Mar 19 16:40:02 crc kubenswrapper[4918]: Trace[1378377848]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (16:40:02.530) Mar 19 16:40:02 crc kubenswrapper[4918]: Trace[1378377848]: [10.001377152s] [10.001377152s] END Mar 19 16:40:02 crc kubenswrapper[4918]: E0319 16:40:02.530995 4918 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 19 16:40:02 crc kubenswrapper[4918]: W0319 16:40:02.545695 4918 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 19 16:40:02 crc kubenswrapper[4918]: I0319 16:40:02.545827 4918 trace.go:236] Trace[1949260155]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Mar-2026 16:39:52.543) (total time: 10002ms): Mar 19 16:40:02 crc kubenswrapper[4918]: Trace[1949260155]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 
10002ms (16:40:02.545) Mar 19 16:40:02 crc kubenswrapper[4918]: Trace[1949260155]: [10.002283571s] [10.002283571s] END Mar 19 16:40:02 crc kubenswrapper[4918]: E0319 16:40:02.545860 4918 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 19 16:40:02 crc kubenswrapper[4918]: W0319 16:40:02.790877 4918 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 19 16:40:02 crc kubenswrapper[4918]: I0319 16:40:02.790996 4918 trace.go:236] Trace[1460711353]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Mar-2026 16:39:52.789) (total time: 10001ms): Mar 19 16:40:02 crc kubenswrapper[4918]: Trace[1460711353]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (16:40:02.790) Mar 19 16:40:02 crc kubenswrapper[4918]: Trace[1460711353]: [10.001718068s] [10.001718068s] END Mar 19 16:40:02 crc kubenswrapper[4918]: E0319 16:40:02.791023 4918 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 19 16:40:03 crc kubenswrapper[4918]: E0319 16:40:03.549409 4918 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-19T16:40:03Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e4b90cc3d203e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.50448595 +0000 UTC m=+0.626685228,LastTimestamp:2026-03-19 16:39:48.50448595 +0000 UTC m=+0.626685228,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:03 crc kubenswrapper[4918]: W0319 16:40:03.551168 4918 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:03Z is after 2026-02-23T05:33:13Z Mar 19 16:40:03 crc kubenswrapper[4918]: I0319 16:40:03.551176 4918 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 19 16:40:03 crc kubenswrapper[4918]: E0319 16:40:03.551260 4918 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-19T16:40:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 16:40:03 crc kubenswrapper[4918]: E0319 16:40:03.551185 4918 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:03Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 19 16:40:03 crc kubenswrapper[4918]: I0319 16:40:03.551571 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 19 16:40:03 crc kubenswrapper[4918]: E0319 16:40:03.552743 4918 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 16:40:03 crc kubenswrapper[4918]: E0319 16:40:03.552940 4918 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:03Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 16:40:03 crc kubenswrapper[4918]: I0319 16:40:03.555561 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:03Z is after 2026-02-23T05:33:13Z Mar 19 16:40:03 crc kubenswrapper[4918]: I0319 16:40:03.559398 4918 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 19 16:40:03 crc kubenswrapper[4918]: I0319 16:40:03.559511 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 19 16:40:03 crc kubenswrapper[4918]: I0319 16:40:03.716376 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 19 16:40:03 crc kubenswrapper[4918]: I0319 16:40:03.717972 4918 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dca2ab07b4ed822dc6ce0e4bfeb8b38dfd90c7c6f003a31d77a4d21651812792" exitCode=255 Mar 19 16:40:03 crc kubenswrapper[4918]: I0319 16:40:03.718012 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"dca2ab07b4ed822dc6ce0e4bfeb8b38dfd90c7c6f003a31d77a4d21651812792"} Mar 19 16:40:03 crc kubenswrapper[4918]: I0319 16:40:03.718157 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:03 crc kubenswrapper[4918]: I0319 16:40:03.719097 4918 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:03 crc kubenswrapper[4918]: I0319 16:40:03.719127 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:03 crc kubenswrapper[4918]: I0319 16:40:03.719138 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:03 crc kubenswrapper[4918]: I0319 16:40:03.719615 4918 scope.go:117] "RemoveContainer" containerID="dca2ab07b4ed822dc6ce0e4bfeb8b38dfd90c7c6f003a31d77a4d21651812792" Mar 19 16:40:04 crc kubenswrapper[4918]: I0319 16:40:04.233167 4918 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 16:40:04 crc kubenswrapper[4918]: I0319 16:40:04.233276 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 16:40:04 crc kubenswrapper[4918]: I0319 16:40:04.510149 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:04Z is after 2026-02-23T05:33:13Z Mar 19 16:40:04 crc kubenswrapper[4918]: I0319 16:40:04.723258 4918 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 19 16:40:04 crc kubenswrapper[4918]: I0319 16:40:04.730153 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6e58d0e4d9b7a2d71f23a7211f4740fbc7291cf68471b8d0df17bb2671178bd9"} Mar 19 16:40:04 crc kubenswrapper[4918]: I0319 16:40:04.730380 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:04 crc kubenswrapper[4918]: I0319 16:40:04.731505 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:04 crc kubenswrapper[4918]: I0319 16:40:04.731561 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:04 crc kubenswrapper[4918]: I0319 16:40:04.731574 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:04 crc kubenswrapper[4918]: I0319 16:40:04.753469 4918 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]log ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]etcd ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 19 16:40:04 crc kubenswrapper[4918]: 
[+]poststarthook/openshift.io-api-request-count-filter ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/generic-apiserver-start-informers ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/priority-and-fairness-filter ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/start-apiextensions-informers ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/start-apiextensions-controllers ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/crd-informer-synced ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/start-system-namespaces-controller ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 19 16:40:04 crc kubenswrapper[4918]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/bootstrap-controller ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/start-kube-aggregator-informers ok 
Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/apiservice-registration-controller ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/apiservice-discovery-controller ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]autoregister-completion ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/apiservice-openapi-controller ok Mar 19 16:40:04 crc kubenswrapper[4918]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 19 16:40:04 crc kubenswrapper[4918]: livez check failed Mar 19 16:40:04 crc kubenswrapper[4918]: I0319 16:40:04.753601 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 16:40:05 crc kubenswrapper[4918]: I0319 16:40:05.510590 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:05Z is after 2026-02-23T05:33:13Z Mar 19 16:40:05 crc kubenswrapper[4918]: I0319 16:40:05.735376 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 19 16:40:05 crc kubenswrapper[4918]: I0319 16:40:05.736432 4918 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 19 16:40:05 crc kubenswrapper[4918]: I0319 16:40:05.738594 4918 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6e58d0e4d9b7a2d71f23a7211f4740fbc7291cf68471b8d0df17bb2671178bd9" exitCode=255 Mar 19 16:40:05 crc kubenswrapper[4918]: I0319 16:40:05.738638 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6e58d0e4d9b7a2d71f23a7211f4740fbc7291cf68471b8d0df17bb2671178bd9"} Mar 19 16:40:05 crc kubenswrapper[4918]: I0319 16:40:05.738675 4918 scope.go:117] "RemoveContainer" containerID="dca2ab07b4ed822dc6ce0e4bfeb8b38dfd90c7c6f003a31d77a4d21651812792" Mar 19 16:40:05 crc kubenswrapper[4918]: I0319 16:40:05.738976 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:05 crc kubenswrapper[4918]: I0319 16:40:05.740202 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:05 crc kubenswrapper[4918]: I0319 16:40:05.740266 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:05 crc kubenswrapper[4918]: I0319 16:40:05.740292 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:05 crc kubenswrapper[4918]: I0319 16:40:05.741603 4918 scope.go:117] "RemoveContainer" containerID="6e58d0e4d9b7a2d71f23a7211f4740fbc7291cf68471b8d0df17bb2671178bd9" Mar 19 16:40:05 crc kubenswrapper[4918]: E0319 16:40:05.742003 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 16:40:06 crc kubenswrapper[4918]: I0319 16:40:06.511702 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:06Z is after 2026-02-23T05:33:13Z Mar 19 16:40:06 crc kubenswrapper[4918]: I0319 16:40:06.696482 4918 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:40:06 crc kubenswrapper[4918]: I0319 16:40:06.743669 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 19 16:40:06 crc kubenswrapper[4918]: I0319 16:40:06.745870 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:06 crc kubenswrapper[4918]: I0319 16:40:06.746915 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:06 crc kubenswrapper[4918]: I0319 16:40:06.746953 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:06 crc kubenswrapper[4918]: I0319 16:40:06.746968 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:06 crc kubenswrapper[4918]: I0319 16:40:06.747569 4918 scope.go:117] "RemoveContainer" containerID="6e58d0e4d9b7a2d71f23a7211f4740fbc7291cf68471b8d0df17bb2671178bd9" Mar 19 16:40:06 crc kubenswrapper[4918]: E0319 16:40:06.748254 4918 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 16:40:07 crc kubenswrapper[4918]: I0319 16:40:07.382867 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:40:07 crc kubenswrapper[4918]: W0319 16:40:07.498126 4918 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:07Z is after 2026-02-23T05:33:13Z Mar 19 16:40:07 crc kubenswrapper[4918]: E0319 16:40:07.498249 4918 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:07Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 16:40:07 crc kubenswrapper[4918]: I0319 16:40:07.510599 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:07Z is after 2026-02-23T05:33:13Z Mar 19 16:40:07 crc kubenswrapper[4918]: W0319 16:40:07.586856 4918 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to 
list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:07Z is after 2026-02-23T05:33:13Z Mar 19 16:40:07 crc kubenswrapper[4918]: E0319 16:40:07.586979 4918 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:07Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 16:40:07 crc kubenswrapper[4918]: W0319 16:40:07.632420 4918 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:07Z is after 2026-02-23T05:33:13Z Mar 19 16:40:07 crc kubenswrapper[4918]: E0319 16:40:07.632508 4918 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:07Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 16:40:07 crc kubenswrapper[4918]: I0319 16:40:07.749802 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:07 crc kubenswrapper[4918]: I0319 16:40:07.751041 4918 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:07 crc kubenswrapper[4918]: I0319 16:40:07.751073 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:07 crc kubenswrapper[4918]: I0319 16:40:07.751086 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:07 crc kubenswrapper[4918]: I0319 16:40:07.751808 4918 scope.go:117] "RemoveContainer" containerID="6e58d0e4d9b7a2d71f23a7211f4740fbc7291cf68471b8d0df17bb2671178bd9" Mar 19 16:40:07 crc kubenswrapper[4918]: E0319 16:40:07.752014 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 16:40:08 crc kubenswrapper[4918]: I0319 16:40:08.085123 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 16:40:08 crc kubenswrapper[4918]: I0319 16:40:08.085294 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:08 crc kubenswrapper[4918]: I0319 16:40:08.086785 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:08 crc kubenswrapper[4918]: I0319 16:40:08.086829 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:08 crc kubenswrapper[4918]: I0319 16:40:08.086840 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:08 crc kubenswrapper[4918]: I0319 16:40:08.509680 4918 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:08Z is after 2026-02-23T05:33:13Z Mar 19 16:40:08 crc kubenswrapper[4918]: E0319 16:40:08.684578 4918 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 16:40:09 crc kubenswrapper[4918]: I0319 16:40:09.511328 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:09Z is after 2026-02-23T05:33:13Z Mar 19 16:40:09 crc kubenswrapper[4918]: I0319 16:40:09.753678 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:40:09 crc kubenswrapper[4918]: I0319 16:40:09.754344 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:09 crc kubenswrapper[4918]: I0319 16:40:09.755735 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:09 crc kubenswrapper[4918]: I0319 16:40:09.755794 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:09 crc kubenswrapper[4918]: I0319 16:40:09.755817 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:09 crc kubenswrapper[4918]: I0319 16:40:09.756772 4918 scope.go:117] "RemoveContainer" containerID="6e58d0e4d9b7a2d71f23a7211f4740fbc7291cf68471b8d0df17bb2671178bd9" Mar 19 16:40:09 crc 
kubenswrapper[4918]: E0319 16:40:09.757116 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 16:40:09 crc kubenswrapper[4918]: I0319 16:40:09.761960 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:40:09 crc kubenswrapper[4918]: I0319 16:40:09.820318 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 19 16:40:09 crc kubenswrapper[4918]: I0319 16:40:09.820773 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:09 crc kubenswrapper[4918]: I0319 16:40:09.823030 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:09 crc kubenswrapper[4918]: I0319 16:40:09.823100 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:09 crc kubenswrapper[4918]: I0319 16:40:09.823125 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:09 crc kubenswrapper[4918]: I0319 16:40:09.842785 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 19 16:40:09 crc kubenswrapper[4918]: I0319 16:40:09.953272 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:09 crc kubenswrapper[4918]: I0319 16:40:09.955459 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 
16:40:09 crc kubenswrapper[4918]: I0319 16:40:09.955572 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:09 crc kubenswrapper[4918]: I0319 16:40:09.955594 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:09 crc kubenswrapper[4918]: I0319 16:40:09.955649 4918 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 16:40:09 crc kubenswrapper[4918]: E0319 16:40:09.956156 4918 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:09Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 16:40:09 crc kubenswrapper[4918]: E0319 16:40:09.958617 4918 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:09Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 16:40:10 crc kubenswrapper[4918]: I0319 16:40:10.511624 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:10Z is after 2026-02-23T05:33:13Z Mar 19 16:40:10 crc kubenswrapper[4918]: I0319 16:40:10.759006 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:10 crc kubenswrapper[4918]: I0319 16:40:10.759041 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:10 crc 
kubenswrapper[4918]: I0319 16:40:10.760685 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:10 crc kubenswrapper[4918]: I0319 16:40:10.760719 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:10 crc kubenswrapper[4918]: I0319 16:40:10.760731 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:10 crc kubenswrapper[4918]: I0319 16:40:10.760864 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:10 crc kubenswrapper[4918]: I0319 16:40:10.760938 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:10 crc kubenswrapper[4918]: I0319 16:40:10.760969 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:10 crc kubenswrapper[4918]: I0319 16:40:10.761402 4918 scope.go:117] "RemoveContainer" containerID="6e58d0e4d9b7a2d71f23a7211f4740fbc7291cf68471b8d0df17bb2671178bd9" Mar 19 16:40:10 crc kubenswrapper[4918]: E0319 16:40:10.761696 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 16:40:11 crc kubenswrapper[4918]: I0319 16:40:11.509762 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-19T16:40:11Z is after 2026-02-23T05:33:13Z Mar 19 16:40:12 crc kubenswrapper[4918]: I0319 16:40:12.176734 4918 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 16:40:12 crc kubenswrapper[4918]: E0319 16:40:12.183071 4918 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:12Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 16:40:12 crc kubenswrapper[4918]: I0319 16:40:12.511924 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:12Z is after 2026-02-23T05:33:13Z Mar 19 16:40:13 crc kubenswrapper[4918]: I0319 16:40:13.509570 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:13Z is after 2026-02-23T05:33:13Z Mar 19 16:40:13 crc kubenswrapper[4918]: E0319 16:40:13.556087 4918 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:13Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e4b90cc3d203e default 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.50448595 +0000 UTC m=+0.626685228,LastTimestamp:2026-03-19 16:39:48.50448595 +0000 UTC m=+0.626685228,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:14 crc kubenswrapper[4918]: I0319 16:40:14.232996 4918 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 16:40:14 crc kubenswrapper[4918]: I0319 16:40:14.233098 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 16:40:14 crc kubenswrapper[4918]: I0319 16:40:14.233169 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 16:40:14 crc kubenswrapper[4918]: I0319 16:40:14.233339 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:14 crc kubenswrapper[4918]: I0319 16:40:14.235688 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:14 crc kubenswrapper[4918]: I0319 16:40:14.235765 
4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:14 crc kubenswrapper[4918]: I0319 16:40:14.235784 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:14 crc kubenswrapper[4918]: I0319 16:40:14.237106 4918 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"c6437dfc5300d53cfb94e20836c084662dd1c61262baadd3d3b68a579df4716d"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 19 16:40:14 crc kubenswrapper[4918]: I0319 16:40:14.238601 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://c6437dfc5300d53cfb94e20836c084662dd1c61262baadd3d3b68a579df4716d" gracePeriod=30 Mar 19 16:40:14 crc kubenswrapper[4918]: W0319 16:40:14.325555 4918 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:14Z is after 2026-02-23T05:33:13Z Mar 19 16:40:14 crc kubenswrapper[4918]: E0319 16:40:14.325695 4918 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:14Z is 
after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 16:40:14 crc kubenswrapper[4918]: I0319 16:40:14.511361 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:14Z is after 2026-02-23T05:33:13Z Mar 19 16:40:14 crc kubenswrapper[4918]: I0319 16:40:14.776220 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 19 16:40:14 crc kubenswrapper[4918]: I0319 16:40:14.776846 4918 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c6437dfc5300d53cfb94e20836c084662dd1c61262baadd3d3b68a579df4716d" exitCode=255 Mar 19 16:40:14 crc kubenswrapper[4918]: I0319 16:40:14.776918 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c6437dfc5300d53cfb94e20836c084662dd1c61262baadd3d3b68a579df4716d"} Mar 19 16:40:14 crc kubenswrapper[4918]: I0319 16:40:14.776978 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c7a59de25910b581971d00381825fdbb84aba3dde406295f7416297c1a1e30d9"} Mar 19 16:40:14 crc kubenswrapper[4918]: I0319 16:40:14.777143 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:14 crc kubenswrapper[4918]: I0319 16:40:14.778680 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:14 crc 
kubenswrapper[4918]: I0319 16:40:14.778747 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:14 crc kubenswrapper[4918]: I0319 16:40:14.778772 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:15 crc kubenswrapper[4918]: W0319 16:40:15.439958 4918 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:15Z is after 2026-02-23T05:33:13Z Mar 19 16:40:15 crc kubenswrapper[4918]: E0319 16:40:15.440074 4918 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:15Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 16:40:15 crc kubenswrapper[4918]: I0319 16:40:15.510583 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:15Z is after 2026-02-23T05:33:13Z Mar 19 16:40:15 crc kubenswrapper[4918]: W0319 16:40:15.566253 4918 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-19T16:40:15Z is after 2026-02-23T05:33:13Z Mar 19 16:40:15 crc kubenswrapper[4918]: E0319 16:40:15.566333 4918 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:15Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 16:40:16 crc kubenswrapper[4918]: I0319 16:40:16.260505 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 16:40:16 crc kubenswrapper[4918]: I0319 16:40:16.260909 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:16 crc kubenswrapper[4918]: I0319 16:40:16.262939 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:16 crc kubenswrapper[4918]: I0319 16:40:16.263006 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:16 crc kubenswrapper[4918]: I0319 16:40:16.263033 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:16 crc kubenswrapper[4918]: I0319 16:40:16.511407 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:16Z is after 2026-02-23T05:33:13Z Mar 19 16:40:16 crc kubenswrapper[4918]: I0319 16:40:16.959713 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 
16:40:16 crc kubenswrapper[4918]: I0319 16:40:16.961687 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:16 crc kubenswrapper[4918]: I0319 16:40:16.961752 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:16 crc kubenswrapper[4918]: I0319 16:40:16.961772 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:16 crc kubenswrapper[4918]: I0319 16:40:16.961807 4918 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 16:40:16 crc kubenswrapper[4918]: E0319 16:40:16.964626 4918 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:16Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 16:40:16 crc kubenswrapper[4918]: E0319 16:40:16.968201 4918 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:16Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 16:40:17 crc kubenswrapper[4918]: I0319 16:40:17.509495 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:17Z is after 2026-02-23T05:33:13Z Mar 19 16:40:18 crc kubenswrapper[4918]: I0319 16:40:18.511583 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:18Z is after 2026-02-23T05:33:13Z Mar 19 16:40:18 crc kubenswrapper[4918]: E0319 16:40:18.684828 4918 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 16:40:18 crc kubenswrapper[4918]: W0319 16:40:18.983410 4918 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:18Z is after 2026-02-23T05:33:13Z Mar 19 16:40:18 crc kubenswrapper[4918]: E0319 16:40:18.983604 4918 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:18Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 16:40:19 crc kubenswrapper[4918]: I0319 16:40:19.511716 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:19Z is after 2026-02-23T05:33:13Z Mar 19 16:40:20 crc kubenswrapper[4918]: I0319 16:40:20.511820 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:20Z is after 2026-02-23T05:33:13Z Mar 19 16:40:21 crc kubenswrapper[4918]: I0319 16:40:21.233614 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 16:40:21 crc kubenswrapper[4918]: I0319 16:40:21.233852 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:21 crc kubenswrapper[4918]: I0319 16:40:21.235609 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:21 crc kubenswrapper[4918]: I0319 16:40:21.235676 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:21 crc kubenswrapper[4918]: I0319 16:40:21.235700 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:21 crc kubenswrapper[4918]: I0319 16:40:21.510848 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:21Z is after 2026-02-23T05:33:13Z Mar 19 16:40:22 crc kubenswrapper[4918]: I0319 16:40:22.510776 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:22Z is after 2026-02-23T05:33:13Z Mar 19 16:40:23 crc kubenswrapper[4918]: I0319 16:40:23.511799 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:23Z is after 2026-02-23T05:33:13Z Mar 19 16:40:23 crc kubenswrapper[4918]: E0319 16:40:23.563716 4918 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:23Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e4b90cc3d203e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.50448595 +0000 UTC m=+0.626685228,LastTimestamp:2026-03-19 16:39:48.50448595 +0000 UTC m=+0.626685228,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:23 crc kubenswrapper[4918]: I0319 16:40:23.968334 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:23 crc kubenswrapper[4918]: E0319 16:40:23.969553 4918 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:23Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 16:40:23 crc kubenswrapper[4918]: I0319 16:40:23.969944 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:23 crc kubenswrapper[4918]: 
I0319 16:40:23.970070 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:23 crc kubenswrapper[4918]: I0319 16:40:23.970180 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:23 crc kubenswrapper[4918]: I0319 16:40:23.970296 4918 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 16:40:23 crc kubenswrapper[4918]: E0319 16:40:23.973076 4918 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:23Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 16:40:24 crc kubenswrapper[4918]: I0319 16:40:24.233792 4918 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 16:40:24 crc kubenswrapper[4918]: I0319 16:40:24.234275 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 16:40:24 crc kubenswrapper[4918]: I0319 16:40:24.510185 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-19T16:40:24Z is after 2026-02-23T05:33:13Z Mar 19 16:40:25 crc kubenswrapper[4918]: I0319 16:40:25.511694 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:25Z is after 2026-02-23T05:33:13Z Mar 19 16:40:25 crc kubenswrapper[4918]: I0319 16:40:25.585623 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:25 crc kubenswrapper[4918]: I0319 16:40:25.587421 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:25 crc kubenswrapper[4918]: I0319 16:40:25.587501 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:25 crc kubenswrapper[4918]: I0319 16:40:25.587567 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:25 crc kubenswrapper[4918]: I0319 16:40:25.590238 4918 scope.go:117] "RemoveContainer" containerID="6e58d0e4d9b7a2d71f23a7211f4740fbc7291cf68471b8d0df17bb2671178bd9" Mar 19 16:40:26 crc kubenswrapper[4918]: I0319 16:40:26.510607 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:26Z is after 2026-02-23T05:33:13Z Mar 19 16:40:26 crc kubenswrapper[4918]: I0319 16:40:26.816462 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 16:40:26 crc 
kubenswrapper[4918]: I0319 16:40:26.816902 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 19 16:40:26 crc kubenswrapper[4918]: I0319 16:40:26.818439 4918 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="40b46a712b6dfddd02cfe82759e607cf2661f5fb6e28f4588b227e82ba4b14b2" exitCode=255 Mar 19 16:40:26 crc kubenswrapper[4918]: I0319 16:40:26.818490 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"40b46a712b6dfddd02cfe82759e607cf2661f5fb6e28f4588b227e82ba4b14b2"} Mar 19 16:40:26 crc kubenswrapper[4918]: I0319 16:40:26.818563 4918 scope.go:117] "RemoveContainer" containerID="6e58d0e4d9b7a2d71f23a7211f4740fbc7291cf68471b8d0df17bb2671178bd9" Mar 19 16:40:26 crc kubenswrapper[4918]: I0319 16:40:26.818797 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:26 crc kubenswrapper[4918]: I0319 16:40:26.820383 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:26 crc kubenswrapper[4918]: I0319 16:40:26.820426 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:26 crc kubenswrapper[4918]: I0319 16:40:26.820435 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:26 crc kubenswrapper[4918]: I0319 16:40:26.821072 4918 scope.go:117] "RemoveContainer" containerID="40b46a712b6dfddd02cfe82759e607cf2661f5fb6e28f4588b227e82ba4b14b2" Mar 19 16:40:26 crc kubenswrapper[4918]: E0319 16:40:26.821254 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 16:40:27 crc kubenswrapper[4918]: I0319 16:40:27.383097 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:40:27 crc kubenswrapper[4918]: I0319 16:40:27.508438 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:27Z is after 2026-02-23T05:33:13Z Mar 19 16:40:27 crc kubenswrapper[4918]: I0319 16:40:27.825304 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 16:40:27 crc kubenswrapper[4918]: I0319 16:40:27.829027 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:27 crc kubenswrapper[4918]: I0319 16:40:27.831176 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:27 crc kubenswrapper[4918]: I0319 16:40:27.831207 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:27 crc kubenswrapper[4918]: I0319 16:40:27.831220 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:27 crc kubenswrapper[4918]: I0319 16:40:27.831879 4918 scope.go:117] "RemoveContainer" containerID="40b46a712b6dfddd02cfe82759e607cf2661f5fb6e28f4588b227e82ba4b14b2" Mar 19 
16:40:27 crc kubenswrapper[4918]: E0319 16:40:27.832156 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 16:40:28 crc kubenswrapper[4918]: I0319 16:40:28.509429 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:28Z is after 2026-02-23T05:33:13Z Mar 19 16:40:28 crc kubenswrapper[4918]: I0319 16:40:28.510564 4918 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 16:40:28 crc kubenswrapper[4918]: E0319 16:40:28.514945 4918 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 16:40:28 crc kubenswrapper[4918]: E0319 16:40:28.516190 4918 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 19 16:40:28 crc kubenswrapper[4918]: E0319 16:40:28.685078 4918 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed 
to get node info: node \"crc\" not found" Mar 19 16:40:29 crc kubenswrapper[4918]: I0319 16:40:29.509702 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:29Z is after 2026-02-23T05:33:13Z Mar 19 16:40:30 crc kubenswrapper[4918]: I0319 16:40:30.511786 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:30Z is after 2026-02-23T05:33:13Z Mar 19 16:40:30 crc kubenswrapper[4918]: E0319 16:40:30.972975 4918 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:30Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 16:40:30 crc kubenswrapper[4918]: I0319 16:40:30.974120 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:30 crc kubenswrapper[4918]: I0319 16:40:30.975380 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:30 crc kubenswrapper[4918]: I0319 16:40:30.975422 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:30 crc kubenswrapper[4918]: I0319 16:40:30.975434 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:30 crc kubenswrapper[4918]: I0319 
16:40:30.975464 4918 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 16:40:30 crc kubenswrapper[4918]: E0319 16:40:30.979920 4918 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:30Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 16:40:31 crc kubenswrapper[4918]: I0319 16:40:31.512279 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:31Z is after 2026-02-23T05:33:13Z Mar 19 16:40:32 crc kubenswrapper[4918]: I0319 16:40:32.510475 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:32Z is after 2026-02-23T05:33:13Z Mar 19 16:40:33 crc kubenswrapper[4918]: W0319 16:40:33.337560 4918 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:33Z is after 2026-02-23T05:33:13Z Mar 19 16:40:33 crc kubenswrapper[4918]: E0319 16:40:33.337688 4918 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:33Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 16:40:33 crc kubenswrapper[4918]: I0319 16:40:33.510219 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:33Z is after 2026-02-23T05:33:13Z Mar 19 16:40:33 crc kubenswrapper[4918]: E0319 16:40:33.570038 4918 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:33Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e4b90cc3d203e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.50448595 +0000 UTC m=+0.626685228,LastTimestamp:2026-03-19 16:39:48.50448595 +0000 UTC m=+0.626685228,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:34 crc kubenswrapper[4918]: I0319 16:40:34.233615 4918 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for 
connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 16:40:34 crc kubenswrapper[4918]: I0319 16:40:34.233735 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 16:40:34 crc kubenswrapper[4918]: I0319 16:40:34.511599 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:34Z is after 2026-02-23T05:33:13Z Mar 19 16:40:35 crc kubenswrapper[4918]: I0319 16:40:35.511166 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:35Z is after 2026-02-23T05:33:13Z Mar 19 16:40:36 crc kubenswrapper[4918]: I0319 16:40:36.511980 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:36Z is after 2026-02-23T05:33:13Z Mar 19 16:40:36 crc kubenswrapper[4918]: I0319 16:40:36.693985 4918 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:40:36 crc kubenswrapper[4918]: I0319 16:40:36.694240 4918 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:36 crc kubenswrapper[4918]: I0319 16:40:36.698164 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:36 crc kubenswrapper[4918]: I0319 16:40:36.698248 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:36 crc kubenswrapper[4918]: I0319 16:40:36.698273 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:36 crc kubenswrapper[4918]: I0319 16:40:36.699978 4918 scope.go:117] "RemoveContainer" containerID="40b46a712b6dfddd02cfe82759e607cf2661f5fb6e28f4588b227e82ba4b14b2" Mar 19 16:40:36 crc kubenswrapper[4918]: E0319 16:40:36.700446 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 16:40:37 crc kubenswrapper[4918]: W0319 16:40:37.471181 4918 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:37Z is after 2026-02-23T05:33:13Z Mar 19 16:40:37 crc kubenswrapper[4918]: E0319 16:40:37.471331 4918 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 16:40:37 crc kubenswrapper[4918]: I0319 16:40:37.511889 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:37Z is after 2026-02-23T05:33:13Z Mar 19 16:40:37 crc kubenswrapper[4918]: E0319 16:40:37.978134 4918 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:37Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 16:40:37 crc kubenswrapper[4918]: I0319 16:40:37.980241 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:37 crc kubenswrapper[4918]: I0319 16:40:37.981778 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:37 crc kubenswrapper[4918]: I0319 16:40:37.981830 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:37 crc kubenswrapper[4918]: I0319 16:40:37.981843 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:37 crc kubenswrapper[4918]: I0319 16:40:37.981882 4918 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 16:40:37 crc kubenswrapper[4918]: E0319 16:40:37.985990 4918 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:37Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 16:40:38 crc kubenswrapper[4918]: I0319 16:40:38.509266 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:38Z is after 2026-02-23T05:33:13Z Mar 19 16:40:38 crc kubenswrapper[4918]: E0319 16:40:38.685438 4918 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 16:40:38 crc kubenswrapper[4918]: I0319 16:40:38.949046 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 16:40:38 crc kubenswrapper[4918]: I0319 16:40:38.949278 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:38 crc kubenswrapper[4918]: I0319 16:40:38.950751 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:38 crc kubenswrapper[4918]: I0319 16:40:38.950802 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:38 crc kubenswrapper[4918]: I0319 16:40:38.950817 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:39 crc kubenswrapper[4918]: W0319 16:40:39.225387 4918 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-19T16:40:39Z is after 2026-02-23T05:33:13Z Mar 19 16:40:39 crc kubenswrapper[4918]: E0319 16:40:39.225497 4918 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:39Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 16:40:39 crc kubenswrapper[4918]: I0319 16:40:39.508828 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:39Z is after 2026-02-23T05:33:13Z Mar 19 16:40:40 crc kubenswrapper[4918]: I0319 16:40:40.511196 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:40Z is after 2026-02-23T05:33:13Z Mar 19 16:40:40 crc kubenswrapper[4918]: W0319 16:40:40.713574 4918 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:40Z is after 2026-02-23T05:33:13Z Mar 19 16:40:40 crc kubenswrapper[4918]: E0319 16:40:40.713691 4918 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: 
failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 16:40:41 crc kubenswrapper[4918]: I0319 16:40:41.511076 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:41Z is after 2026-02-23T05:33:13Z Mar 19 16:40:42 crc kubenswrapper[4918]: I0319 16:40:42.511043 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:42Z is after 2026-02-23T05:33:13Z Mar 19 16:40:43 crc kubenswrapper[4918]: I0319 16:40:43.511346 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:43Z is after 2026-02-23T05:33:13Z Mar 19 16:40:43 crc kubenswrapper[4918]: E0319 16:40:43.575760 4918 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:40:43Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e4b90cc3d203e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.50448595 +0000 UTC m=+0.626685228,LastTimestamp:2026-03-19 16:39:48.50448595 +0000 UTC m=+0.626685228,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:44 crc kubenswrapper[4918]: I0319 16:40:44.234209 4918 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 16:40:44 crc kubenswrapper[4918]: I0319 16:40:44.234364 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 16:40:44 crc kubenswrapper[4918]: I0319 16:40:44.234460 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 16:40:44 crc kubenswrapper[4918]: I0319 16:40:44.234709 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:44 crc kubenswrapper[4918]: I0319 16:40:44.236440 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:44 crc kubenswrapper[4918]: I0319 16:40:44.236508 4918 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:44 crc kubenswrapper[4918]: I0319 16:40:44.236563 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:44 crc kubenswrapper[4918]: I0319 16:40:44.237470 4918 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"c7a59de25910b581971d00381825fdbb84aba3dde406295f7416297c1a1e30d9"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 19 16:40:44 crc kubenswrapper[4918]: I0319 16:40:44.237677 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://c7a59de25910b581971d00381825fdbb84aba3dde406295f7416297c1a1e30d9" gracePeriod=30 Mar 19 16:40:44 crc kubenswrapper[4918]: I0319 16:40:44.510084 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 16:40:44 crc kubenswrapper[4918]: I0319 16:40:44.882984 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 19 16:40:44 crc kubenswrapper[4918]: I0319 16:40:44.885405 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 19 16:40:44 crc kubenswrapper[4918]: I0319 16:40:44.885937 4918 generic.go:334] "Generic (PLEG): 
container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c7a59de25910b581971d00381825fdbb84aba3dde406295f7416297c1a1e30d9" exitCode=255 Mar 19 16:40:44 crc kubenswrapper[4918]: I0319 16:40:44.885989 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c7a59de25910b581971d00381825fdbb84aba3dde406295f7416297c1a1e30d9"} Mar 19 16:40:44 crc kubenswrapper[4918]: I0319 16:40:44.886061 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8c3945fc8bb3e5ae9ff01e5ba7a52f19bcc30a4d5b79641e48895bc8ea1acc0c"} Mar 19 16:40:44 crc kubenswrapper[4918]: I0319 16:40:44.886094 4918 scope.go:117] "RemoveContainer" containerID="c6437dfc5300d53cfb94e20836c084662dd1c61262baadd3d3b68a579df4716d" Mar 19 16:40:44 crc kubenswrapper[4918]: I0319 16:40:44.886285 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:44 crc kubenswrapper[4918]: I0319 16:40:44.887313 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:44 crc kubenswrapper[4918]: I0319 16:40:44.887358 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:44 crc kubenswrapper[4918]: I0319 16:40:44.887367 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:44 crc kubenswrapper[4918]: I0319 16:40:44.986195 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:44 crc kubenswrapper[4918]: E0319 16:40:44.986355 4918 controller.go:145] "Failed to ensure lease exists, will retry" 
err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 16:40:44 crc kubenswrapper[4918]: I0319 16:40:44.988269 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:44 crc kubenswrapper[4918]: I0319 16:40:44.988350 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:44 crc kubenswrapper[4918]: I0319 16:40:44.988377 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:44 crc kubenswrapper[4918]: I0319 16:40:44.988424 4918 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 16:40:44 crc kubenswrapper[4918]: E0319 16:40:44.996407 4918 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 16:40:45 crc kubenswrapper[4918]: I0319 16:40:45.513488 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 16:40:45 crc kubenswrapper[4918]: I0319 16:40:45.892692 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 19 16:40:46 crc kubenswrapper[4918]: I0319 16:40:46.260382 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 16:40:46 crc kubenswrapper[4918]: I0319 16:40:46.260563 4918 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 19 16:40:46 crc kubenswrapper[4918]: I0319 16:40:46.262274 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:46 crc kubenswrapper[4918]: I0319 16:40:46.262314 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:46 crc kubenswrapper[4918]: I0319 16:40:46.262327 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:46 crc kubenswrapper[4918]: I0319 16:40:46.511691 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 16:40:47 crc kubenswrapper[4918]: I0319 16:40:47.513933 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 16:40:48 crc kubenswrapper[4918]: I0319 16:40:48.514571 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 16:40:48 crc kubenswrapper[4918]: E0319 16:40:48.686620 4918 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 16:40:49 crc kubenswrapper[4918]: I0319 16:40:49.513766 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 16:40:49 crc 
kubenswrapper[4918]: I0319 16:40:49.585848 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:49 crc kubenswrapper[4918]: I0319 16:40:49.588705 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:49 crc kubenswrapper[4918]: I0319 16:40:49.588788 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:49 crc kubenswrapper[4918]: I0319 16:40:49.588815 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:49 crc kubenswrapper[4918]: I0319 16:40:49.590158 4918 scope.go:117] "RemoveContainer" containerID="40b46a712b6dfddd02cfe82759e607cf2661f5fb6e28f4588b227e82ba4b14b2" Mar 19 16:40:49 crc kubenswrapper[4918]: I0319 16:40:49.909003 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 16:40:49 crc kubenswrapper[4918]: I0319 16:40:49.911630 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f52a359a1ac292a614b20c79a490412ec1b7e37ecf7dfc7576babdc09dfe0ea2"} Mar 19 16:40:49 crc kubenswrapper[4918]: I0319 16:40:49.912053 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:49 crc kubenswrapper[4918]: I0319 16:40:49.913278 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:49 crc kubenswrapper[4918]: I0319 16:40:49.913424 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:49 crc kubenswrapper[4918]: I0319 16:40:49.913603 4918 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:50 crc kubenswrapper[4918]: I0319 16:40:50.513397 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 16:40:50 crc kubenswrapper[4918]: I0319 16:40:50.918065 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 16:40:50 crc kubenswrapper[4918]: I0319 16:40:50.919619 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 16:40:50 crc kubenswrapper[4918]: I0319 16:40:50.923641 4918 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f52a359a1ac292a614b20c79a490412ec1b7e37ecf7dfc7576babdc09dfe0ea2" exitCode=255 Mar 19 16:40:50 crc kubenswrapper[4918]: I0319 16:40:50.923767 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f52a359a1ac292a614b20c79a490412ec1b7e37ecf7dfc7576babdc09dfe0ea2"} Mar 19 16:40:50 crc kubenswrapper[4918]: I0319 16:40:50.924097 4918 scope.go:117] "RemoveContainer" containerID="40b46a712b6dfddd02cfe82759e607cf2661f5fb6e28f4588b227e82ba4b14b2" Mar 19 16:40:50 crc kubenswrapper[4918]: I0319 16:40:50.924326 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:50 crc kubenswrapper[4918]: I0319 16:40:50.926139 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:50 crc 
kubenswrapper[4918]: I0319 16:40:50.926416 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:50 crc kubenswrapper[4918]: I0319 16:40:50.926922 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:50 crc kubenswrapper[4918]: I0319 16:40:50.934239 4918 scope.go:117] "RemoveContainer" containerID="f52a359a1ac292a614b20c79a490412ec1b7e37ecf7dfc7576babdc09dfe0ea2" Mar 19 16:40:50 crc kubenswrapper[4918]: E0319 16:40:50.935090 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 16:40:51 crc kubenswrapper[4918]: I0319 16:40:51.233184 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 16:40:51 crc kubenswrapper[4918]: I0319 16:40:51.233497 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:51 crc kubenswrapper[4918]: I0319 16:40:51.234614 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:51 crc kubenswrapper[4918]: I0319 16:40:51.234650 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:51 crc kubenswrapper[4918]: I0319 16:40:51.234659 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:51 crc kubenswrapper[4918]: I0319 16:40:51.512755 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode 
publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 16:40:51 crc kubenswrapper[4918]: I0319 16:40:51.929363 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 16:40:51 crc kubenswrapper[4918]: E0319 16:40:51.994250 4918 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 16:40:51 crc kubenswrapper[4918]: I0319 16:40:51.996546 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:51 crc kubenswrapper[4918]: I0319 16:40:51.998152 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:51 crc kubenswrapper[4918]: I0319 16:40:51.998222 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:51 crc kubenswrapper[4918]: I0319 16:40:51.998241 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:51 crc kubenswrapper[4918]: I0319 16:40:51.998303 4918 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 16:40:52 crc kubenswrapper[4918]: E0319 16:40:52.004589 4918 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 16:40:52 crc kubenswrapper[4918]: I0319 16:40:52.513258 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 16:40:53 crc kubenswrapper[4918]: I0319 16:40:53.511330 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.583982 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e4b90cc3d203e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.50448595 +0000 UTC m=+0.626685228,LastTimestamp:2026-03-19 16:39:48.50448595 +0000 UTC m=+0.626685228,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.590828 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e4b90d0a3aab0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.578314928 +0000 UTC 
m=+0.700514186,LastTimestamp:2026-03-19 16:39:48.578314928 +0000 UTC m=+0.700514186,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.596750 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e4b90d0a4597f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.578359679 +0000 UTC m=+0.700558937,LastTimestamp:2026-03-19 16:39:48.578359679 +0000 UTC m=+0.700558937,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.605879 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e4b90d0a48342 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.57837037 +0000 UTC m=+0.700569628,LastTimestamp:2026-03-19 16:39:48.57837037 +0000 UTC m=+0.700569628,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.613911 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e4b90d664c30c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.674855692 +0000 UTC m=+0.797054970,LastTimestamp:2026-03-19 16:39:48.674855692 +0000 UTC m=+0.797054970,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.620913 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e4b90d0a3aab0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e4b90d0a3aab0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.578314928 +0000 UTC m=+0.700514186,LastTimestamp:2026-03-19 16:39:48.687089709 +0000 UTC m=+0.809288997,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.627851 4918 event.go:359] "Server rejected 
event (will not retry!)" err="events \"crc.189e4b90d0a4597f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e4b90d0a4597f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.578359679 +0000 UTC m=+0.700558937,LastTimestamp:2026-03-19 16:39:48.68712464 +0000 UTC m=+0.809323928,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.632659 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e4b90d0a48342\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e4b90d0a48342 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.57837037 +0000 UTC m=+0.700569628,LastTimestamp:2026-03-19 16:39:48.687143501 +0000 UTC m=+0.809342789,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.638290 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e4b90d0a3aab0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"default\"" event="&Event{ObjectMeta:{crc.189e4b90d0a3aab0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.578314928 +0000 UTC m=+0.700514186,LastTimestamp:2026-03-19 16:39:48.688936752 +0000 UTC m=+0.811136040,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.642438 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e4b90d0a4597f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e4b90d0a4597f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.578359679 +0000 UTC m=+0.700558937,LastTimestamp:2026-03-19 16:39:48.688958813 +0000 UTC m=+0.811158101,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.648065 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e4b90d0a48342\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e4b90d0a48342 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.57837037 +0000 UTC m=+0.700569628,LastTimestamp:2026-03-19 16:39:48.688979774 +0000 UTC m=+0.811179062,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.653233 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e4b90d0a3aab0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e4b90d0a3aab0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.578314928 +0000 UTC m=+0.700514186,LastTimestamp:2026-03-19 16:39:48.689245761 +0000 UTC m=+0.811445019,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.658950 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e4b90d0a4597f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e4b90d0a4597f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.578359679 +0000 UTC m=+0.700558937,LastTimestamp:2026-03-19 16:39:48.689259731 +0000 UTC m=+0.811458989,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.664260 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e4b90d0a48342\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e4b90d0a48342 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.57837037 +0000 UTC m=+0.700569628,LastTimestamp:2026-03-19 16:39:48.689291182 +0000 UTC m=+0.811490440,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.669901 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e4b90d0a3aab0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e4b90d0a3aab0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.578314928 +0000 UTC 
m=+0.700514186,LastTimestamp:2026-03-19 16:39:48.692197025 +0000 UTC m=+0.814396303,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.672118 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e4b90d0a4597f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e4b90d0a4597f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.578359679 +0000 UTC m=+0.700558937,LastTimestamp:2026-03-19 16:39:48.692224015 +0000 UTC m=+0.814423293,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.679208 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e4b90d0a48342\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e4b90d0a48342 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.57837037 +0000 UTC m=+0.700569628,LastTimestamp:2026-03-19 16:39:48.692241506 +0000 UTC m=+0.814440794,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.683969 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e4b90d0a3aab0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e4b90d0a3aab0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.578314928 +0000 UTC m=+0.700514186,LastTimestamp:2026-03-19 16:39:48.695986253 +0000 UTC m=+0.818185541,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.688487 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e4b90d0a4597f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e4b90d0a4597f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.578359679 +0000 UTC m=+0.700558937,LastTimestamp:2026-03-19 16:39:48.696057845 +0000 UTC m=+0.818257143,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.693222 4918 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e4b90d0a48342\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e4b90d0a48342 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.57837037 +0000 UTC m=+0.700569628,LastTimestamp:2026-03-19 16:39:48.696083855 +0000 UTC m=+0.818283153,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.698724 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e4b90d0a3aab0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e4b90d0a3aab0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.578314928 +0000 UTC m=+0.700514186,LastTimestamp:2026-03-19 16:39:48.696120086 +0000 UTC m=+0.818319384,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.704403 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e4b90d0a4597f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e4b90d0a4597f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.578359679 +0000 UTC m=+0.700558937,LastTimestamp:2026-03-19 16:39:48.696151097 +0000 UTC m=+0.818350395,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.709597 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e4b90d0a48342\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e4b90d0a48342 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.57837037 +0000 UTC m=+0.700569628,LastTimestamp:2026-03-19 16:39:48.696176418 +0000 UTC m=+0.818375706,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.716919 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e4b90d0a3aab0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e4b90d0a3aab0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.578314928 +0000 UTC m=+0.700514186,LastTimestamp:2026-03-19 16:39:48.696391484 +0000 UTC m=+0.818590762,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.722360 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e4b90d0a4597f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e4b90d0a4597f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:48.578359679 +0000 UTC m=+0.700558937,LastTimestamp:2026-03-19 16:39:48.696420835 +0000 UTC m=+0.818620123,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.729188 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e4b90eeef126c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:49.086573164 +0000 UTC m=+1.208772422,LastTimestamp:2026-03-19 16:39:49.086573164 +0000 UTC m=+1.208772422,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.734253 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e4b90eeef4a48 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:49.086587464 +0000 UTC m=+1.208786752,LastTimestamp:2026-03-19 16:39:49.086587464 +0000 UTC m=+1.208786752,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.740347 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e4b90eef20d41 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:49.086768449 +0000 UTC m=+1.208967727,LastTimestamp:2026-03-19 16:39:49.086768449 +0000 UTC m=+1.208967727,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.748722 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e4b90efc78e29 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:49.100760617 +0000 
UTC m=+1.222959905,LastTimestamp:2026-03-19 16:39:49.100760617 +0000 UTC m=+1.222959905,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.756871 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e4b90effe0dc4 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:49.104332228 +0000 UTC m=+1.226531476,LastTimestamp:2026-03-19 16:39:49.104332228 +0000 UTC m=+1.226531476,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.761380 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e4b911322ef70 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:49.693951856 +0000 UTC m=+1.816151114,LastTimestamp:2026-03-19 16:39:49.693951856 +0000 UTC m=+1.816151114,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.766338 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e4b911329412f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:49.694365999 +0000 UTC m=+1.816565257,LastTimestamp:2026-03-19 16:39:49.694365999 +0000 UTC m=+1.816565257,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.771130 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e4b911329e626 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:49.69440823 +0000 UTC m=+1.816607488,LastTimestamp:2026-03-19 16:39:49.69440823 +0000 UTC m=+1.816607488,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.776047 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e4b911331ccb4 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:49.694926004 +0000 UTC m=+1.817125272,LastTimestamp:2026-03-19 16:39:49.694926004 +0000 UTC m=+1.817125272,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.784708 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e4b9113389db8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:49.695372728 +0000 UTC m=+1.817571966,LastTimestamp:2026-03-19 16:39:49.695372728 +0000 UTC m=+1.817571966,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.792019 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e4b91149c4485 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:49.718680709 +0000 UTC m=+1.840879967,LastTimestamp:2026-03-19 16:39:49.718680709 +0000 UTC m=+1.840879967,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.797781 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e4b9114ecafdd openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:49.723951069 +0000 UTC m=+1.846150327,LastTimestamp:2026-03-19 16:39:49.723951069 +0000 UTC m=+1.846150327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.801632 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e4b9114ee3536 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:49.724050742 +0000 UTC m=+1.846250040,LastTimestamp:2026-03-19 16:39:49.724050742 +0000 UTC m=+1.846250040,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.806478 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e4b9114f0e8d5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:49.724227797 +0000 UTC m=+1.846427045,LastTimestamp:2026-03-19 16:39:49.724227797 +0000 UTC m=+1.846427045,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.810700 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e4b9114f496bc openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:49.724468924 +0000 UTC m=+1.846668182,LastTimestamp:2026-03-19 16:39:49.724468924 +0000 UTC m=+1.846668182,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.816251 4918 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e4b911507e574 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:49.72573426 +0000 UTC m=+1.847933518,LastTimestamp:2026-03-19 16:39:49.72573426 +0000 UTC m=+1.847933518,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.821047 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e4b91271af589 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:50.028973449 +0000 UTC m=+2.151172697,LastTimestamp:2026-03-19 
16:39:50.028973449 +0000 UTC m=+2.151172697,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.825253 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e4b9127d3a1b8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:50.041076152 +0000 UTC m=+2.163275440,LastTimestamp:2026-03-19 16:39:50.041076152 +0000 UTC m=+2.163275440,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.830295 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e4b9127ea667b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:50.042568315 +0000 UTC m=+2.164767573,LastTimestamp:2026-03-19 16:39:50.042568315 +0000 UTC m=+2.164767573,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.835072 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e4b91341baf3f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:50.247124799 +0000 UTC m=+2.369324087,LastTimestamp:2026-03-19 16:39:50.247124799 +0000 UTC m=+2.369324087,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.839272 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e4b91350c3697 openshift-kube-controller-manager 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:50.262888087 +0000 UTC m=+2.385087335,LastTimestamp:2026-03-19 16:39:50.262888087 +0000 UTC m=+2.385087335,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.844925 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e4b91351a66eb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:50.263817963 +0000 UTC m=+2.386017201,LastTimestamp:2026-03-19 16:39:50.263817963 +0000 UTC m=+2.386017201,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 
16:40:53.849349 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e4b9142d23b5c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:50.493969244 +0000 UTC m=+2.616168512,LastTimestamp:2026-03-19 16:39:50.493969244 +0000 UTC m=+2.616168512,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.853919 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e4b914398b96a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:50.506977642 +0000 UTC m=+2.629176910,LastTimestamp:2026-03-19 
16:39:50.506977642 +0000 UTC m=+2.629176910,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.859940 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e4b91499f364d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:50.608066125 +0000 UTC m=+2.730265423,LastTimestamp:2026-03-19 16:39:50.608066125 +0000 UTC m=+2.730265423,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.865188 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e4b914a41d4ac openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:50.6187235 +0000 UTC m=+2.740922778,LastTimestamp:2026-03-19 16:39:50.6187235 +0000 UTC m=+2.740922778,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.870823 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e4b914aa11955 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:50.624966997 +0000 UTC m=+2.747166295,LastTimestamp:2026-03-19 16:39:50.624966997 +0000 UTC m=+2.747166295,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.875130 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e4b914aac7ff7 
openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:50.625714167 +0000 UTC m=+2.747913445,LastTimestamp:2026-03-19 16:39:50.625714167 +0000 UTC m=+2.747913445,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.876710 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e4b91592c3d39 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:50.868966713 +0000 UTC m=+2.991165961,LastTimestamp:2026-03-19 16:39:50.868966713 +0000 UTC m=+2.991165961,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.879748 4918 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e4b915988878e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:50.875015054 +0000 UTC m=+2.997214302,LastTimestamp:2026-03-19 16:39:50.875015054 +0000 UTC m=+2.997214302,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.883401 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e4b9159a87f55 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:50.877110101 +0000 UTC m=+2.999309339,LastTimestamp:2026-03-19 16:39:50.877110101 +0000 UTC m=+2.999309339,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.888409 4918 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e4b9159fb4243 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:50.882533955 +0000 UTC m=+3.004733203,LastTimestamp:2026-03-19 16:39:50.882533955 +0000 UTC m=+3.004733203,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.892437 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e4b915a129b10 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:50.884064016 +0000 UTC m=+3.006263264,LastTimestamp:2026-03-19 16:39:50.884064016 +0000 UTC m=+3.006263264,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc 
kubenswrapper[4918]: E0319 16:40:53.896476 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e4b915a2f2258 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:50.885933656 +0000 UTC m=+3.008132904,LastTimestamp:2026-03-19 16:39:50.885933656 +0000 UTC m=+3.008132904,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.900988 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e4b915a631393 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:50.889337747 +0000 UTC m=+3.011536995,LastTimestamp:2026-03-19 
16:39:50.889337747 +0000 UTC m=+3.011536995,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.906616 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e4b915a789d3d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:50.890749245 +0000 UTC m=+3.012948493,LastTimestamp:2026-03-19 16:39:50.890749245 +0000 UTC m=+3.012948493,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.912957 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e4b915aa7e081 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present 
on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:50.893846657 +0000 UTC m=+3.016045905,LastTimestamp:2026-03-19 16:39:50.893846657 +0000 UTC m=+3.016045905,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.918478 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e4b915b41d502 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:50.903936258 +0000 UTC m=+3.026135506,LastTimestamp:2026-03-19 16:39:50.903936258 +0000 UTC m=+3.026135506,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.924746 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e4b91657b0134 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:51.075455284 +0000 UTC m=+3.197654532,LastTimestamp:2026-03-19 16:39:51.075455284 +0000 UTC m=+3.197654532,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.931302 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e4b91661e22e8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:51.08614628 +0000 UTC m=+3.208345538,LastTimestamp:2026-03-19 16:39:51.08614628 +0000 UTC m=+3.208345538,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.939101 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e4b9166562db2 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:51.089819058 +0000 UTC m=+3.212018306,LastTimestamp:2026-03-19 16:39:51.089819058 +0000 UTC m=+3.212018306,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.948301 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e4b91666f2213 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:51.091454483 +0000 UTC m=+3.213653731,LastTimestamp:2026-03-19 16:39:51.091454483 +0000 UTC m=+3.213653731,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc 
kubenswrapper[4918]: E0319 16:40:53.955031 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e4b9167702c39 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:51.108299833 +0000 UTC m=+3.230499081,LastTimestamp:2026-03-19 16:39:51.108299833 +0000 UTC m=+3.230499081,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.962010 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e4b9167a8a8ff openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:51.112001791 +0000 UTC 
m=+3.234201039,LastTimestamp:2026-03-19 16:39:51.112001791 +0000 UTC m=+3.234201039,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.967613 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e4b9173002eca openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:51.30228705 +0000 UTC m=+3.424486298,LastTimestamp:2026-03-19 16:39:51.30228705 +0000 UTC m=+3.424486298,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.971810 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e4b9173017192 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created 
container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:51.302369682 +0000 UTC m=+3.424568930,LastTimestamp:2026-03-19 16:39:51.302369682 +0000 UTC m=+3.424568930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.976628 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e4b9173f156ee openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:51.318091502 +0000 UTC m=+3.440290750,LastTimestamp:2026-03-19 16:39:51.318091502 +0000 UTC m=+3.440290750,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.980224 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e4b9174008f35 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:51.319088949 +0000 UTC m=+3.441288197,LastTimestamp:2026-03-19 16:39:51.319088949 +0000 UTC m=+3.441288197,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.983937 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e4b91741066fb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:51.320127227 +0000 UTC m=+3.442326475,LastTimestamp:2026-03-19 16:39:51.320127227 +0000 UTC m=+3.442326475,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.993041 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e4b917f209fbf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:51.505739711 +0000 UTC m=+3.627938959,LastTimestamp:2026-03-19 16:39:51.505739711 +0000 UTC m=+3.627938959,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:53 crc kubenswrapper[4918]: E0319 16:40:53.999335 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e4b918019327f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:51.522030207 +0000 UTC m=+3.644229455,LastTimestamp:2026-03-19 16:39:51.522030207 +0000 UTC m=+3.644229455,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.006153 
4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e4b9180328003 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:51.523688451 +0000 UTC m=+3.645887709,LastTimestamp:2026-03-19 16:39:51.523688451 +0000 UTC m=+3.645887709,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.012922 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e4b9187a0aa0c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:51.648348684 +0000 UTC 
m=+3.770547942,LastTimestamp:2026-03-19 16:39:51.648348684 +0000 UTC m=+3.770547942,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.018926 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e4b918ceb476f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:51.737124719 +0000 UTC m=+3.859323967,LastTimestamp:2026-03-19 16:39:51.737124719 +0000 UTC m=+3.859323967,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.024497 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e4b918da632f2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container 
kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:51.749374706 +0000 UTC m=+3.871573954,LastTimestamp:2026-03-19 16:39:51.749374706 +0000 UTC m=+3.871573954,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.029316 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e4b91934a0a1d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:51.843998237 +0000 UTC m=+3.966197495,LastTimestamp:2026-03-19 16:39:51.843998237 +0000 UTC m=+3.966197495,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.034220 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e4b91944dfd8e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container 
etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:51.861034382 +0000 UTC m=+3.983233630,LastTimestamp:2026-03-19 16:39:51.861034382 +0000 UTC m=+3.983233630,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.038959 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e4b91c456e2b8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:52.666923704 +0000 UTC m=+4.789122962,LastTimestamp:2026-03-19 16:39:52.666923704 +0000 UTC m=+4.789122962,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.044808 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e4b91d2c6c31f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:52.909136671 +0000 UTC m=+5.031335929,LastTimestamp:2026-03-19 16:39:52.909136671 +0000 UTC m=+5.031335929,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: I0319 16:40:54.050900 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 16:40:54 crc kubenswrapper[4918]: I0319 16:40:54.051130 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.052142 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e4b91d38df4d4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:52.92219106 +0000 UTC m=+5.044390308,LastTimestamp:2026-03-19 16:39:52.92219106 +0000 UTC m=+5.044390308,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: I0319 16:40:54.053104 4918 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:54 crc kubenswrapper[4918]: I0319 16:40:54.053178 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:54 crc kubenswrapper[4918]: I0319 16:40:54.053459 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.057832 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e4b91d3a47ccc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:52.92366766 +0000 UTC m=+5.045866908,LastTimestamp:2026-03-19 16:39:52.92366766 +0000 UTC m=+5.045866908,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: I0319 16:40:54.062854 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.064419 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189e4b91e14caac7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:53.152793287 +0000 UTC m=+5.274992555,LastTimestamp:2026-03-19 16:39:53.152793287 +0000 UTC m=+5.274992555,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.068834 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e4b91e21dc3ca openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:53.166496714 +0000 UTC m=+5.288695972,LastTimestamp:2026-03-19 16:39:53.166496714 +0000 UTC m=+5.288695972,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.073166 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e4b91e239c3d5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:53.168331733 +0000 UTC m=+5.290531021,LastTimestamp:2026-03-19 16:39:53.168331733 +0000 UTC m=+5.290531021,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.078002 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e4b91f0b52e10 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:53.41130088 +0000 UTC m=+5.533500138,LastTimestamp:2026-03-19 16:39:53.41130088 +0000 UTC m=+5.533500138,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.083210 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e4b91f18d403d 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:53.425461309 +0000 UTC m=+5.547660577,LastTimestamp:2026-03-19 16:39:53.425461309 +0000 UTC m=+5.547660577,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.087483 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e4b91f1a06eea openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:53.426718442 +0000 UTC m=+5.548917700,LastTimestamp:2026-03-19 16:39:53.426718442 +0000 UTC m=+5.548917700,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.090015 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e4b92017e76f4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:53.692927732 +0000 UTC m=+5.815126990,LastTimestamp:2026-03-19 16:39:53.692927732 +0000 UTC m=+5.815126990,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.096219 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e4b92022c0804 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:53.704302596 +0000 UTC m=+5.826501854,LastTimestamp:2026-03-19 16:39:53.704302596 +0000 UTC m=+5.826501854,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.102964 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e4b9202422dd1 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:53.705754065 +0000 UTC m=+5.827953323,LastTimestamp:2026-03-19 16:39:53.705754065 +0000 UTC m=+5.827953323,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.107901 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e4b920eb4e689 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:53.914599049 +0000 UTC m=+6.036798327,LastTimestamp:2026-03-19 16:39:53.914599049 +0000 UTC m=+6.036798327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.111888 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189e4b920f659e9d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:53.926180509 +0000 UTC m=+6.048379757,LastTimestamp:2026-03-19 16:39:53.926180509 +0000 UTC m=+6.048379757,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.116539 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 16:40:54 crc kubenswrapper[4918]: &Event{ObjectMeta:{kube-controller-manager-crc.189e4b9221b3d130 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 19 16:40:54 crc kubenswrapper[4918]: body: Mar 19 16:40:54 crc kubenswrapper[4918]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:54.233295152 +0000 UTC m=+6.355494400,LastTimestamp:2026-03-19 16:39:54.233295152 +0000 UTC m=+6.355494400,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} 
Mar 19 16:40:54 crc kubenswrapper[4918]: > Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.122236 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e4b9221b4ace2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:54.233351394 +0000 UTC m=+6.355550642,LastTimestamp:2026-03-19 16:39:54.233351394 +0000 UTC m=+6.355550642,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.130188 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 19 16:40:54 crc kubenswrapper[4918]: &Event{ObjectMeta:{kube-apiserver-crc.189e4b944d1a4ce8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 19 16:40:54 
crc kubenswrapper[4918]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 19 16:40:54 crc kubenswrapper[4918]: Mar 19 16:40:54 crc kubenswrapper[4918]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:40:03.551366376 +0000 UTC m=+15.673565674,LastTimestamp:2026-03-19 16:40:03.551366376 +0000 UTC m=+15.673565674,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 16:40:54 crc kubenswrapper[4918]: > Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.135134 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e4b944d1ecf5a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:40:03.551661914 +0000 UTC m=+15.673861202,LastTimestamp:2026-03-19 16:40:03.551661914 +0000 UTC m=+15.673861202,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.141387 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e4b944d1a4ce8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in 
the namespace \"openshift-kube-apiserver\"" event=< Mar 19 16:40:54 crc kubenswrapper[4918]: &Event{ObjectMeta:{kube-apiserver-crc.189e4b944d1a4ce8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 19 16:40:54 crc kubenswrapper[4918]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 19 16:40:54 crc kubenswrapper[4918]: Mar 19 16:40:54 crc kubenswrapper[4918]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:40:03.551366376 +0000 UTC m=+15.673565674,LastTimestamp:2026-03-19 16:40:03.559488871 +0000 UTC m=+15.681688149,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 16:40:54 crc kubenswrapper[4918]: > Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.147571 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e4b944d1ecf5a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e4b944d1ecf5a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 
403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:40:03.551661914 +0000 UTC m=+15.673861202,LastTimestamp:2026-03-19 16:40:03.559613635 +0000 UTC m=+15.681812923,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.152067 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e4b9180328003\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e4b9180328003 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:51.523688451 +0000 UTC m=+3.645887709,LastTimestamp:2026-03-19 16:40:03.720647683 +0000 UTC m=+15.842846941,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.156853 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e4b918ceb476f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e4b918ceb476f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:51.737124719 +0000 UTC m=+3.859323967,LastTimestamp:2026-03-19 16:40:03.874212082 +0000 UTC m=+15.996411340,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.163897 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e4b918da632f2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e4b918da632f2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:51.749374706 +0000 UTC m=+3.871573954,LastTimestamp:2026-03-19 16:40:03.889165098 +0000 UTC m=+16.011364346,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.167918 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event=< Mar 19 16:40:54 crc kubenswrapper[4918]: &Event{ObjectMeta:{kube-controller-manager-crc.189e4b9475bf0a4e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 19 16:40:54 crc kubenswrapper[4918]: body: Mar 19 16:40:54 crc kubenswrapper[4918]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:40:04.233251406 +0000 UTC m=+16.355450694,LastTimestamp:2026-03-19 16:40:04.233251406 +0000 UTC m=+16.355450694,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 16:40:54 crc kubenswrapper[4918]: > Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.171993 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e4b9475c01075 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded 
while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:40:04.233318517 +0000 UTC m=+16.355517805,LastTimestamp:2026-03-19 16:40:04.233318517 +0000 UTC m=+16.355517805,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.179006 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e4b9475bf0a4e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 16:40:54 crc kubenswrapper[4918]: &Event{ObjectMeta:{kube-controller-manager-crc.189e4b9475bf0a4e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 19 16:40:54 crc kubenswrapper[4918]: body: Mar 19 16:40:54 crc kubenswrapper[4918]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:40:04.233251406 +0000 UTC m=+16.355450694,LastTimestamp:2026-03-19 16:40:14.233073273 +0000 UTC m=+26.355272531,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 16:40:54 crc kubenswrapper[4918]: > Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.182572 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e4b9475c01075\" is forbidden: User 
\"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e4b9475c01075 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:40:04.233318517 +0000 UTC m=+16.355517805,LastTimestamp:2026-03-19 16:40:14.233133175 +0000 UTC m=+26.355332433,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.186675 4918 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e4b96ca1b4113 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:40:14.238515475 +0000 UTC m=+26.360714763,LastTimestamp:2026-03-19 
16:40:14.238515475 +0000 UTC m=+26.360714763,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.188808 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e4b911507e574\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e4b911507e574 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:49.72573426 +0000 UTC m=+1.847933518,LastTimestamp:2026-03-19 16:40:14.384817322 +0000 UTC m=+26.507016580,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.192710 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e4b91271af589\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e4b91271af589 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:50.028973449 +0000 UTC m=+2.151172697,LastTimestamp:2026-03-19 16:40:14.614475088 +0000 UTC m=+26.736674376,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.195825 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e4b9127d3a1b8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e4b9127d3a1b8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:39:50.041076152 +0000 UTC m=+2.163275440,LastTimestamp:2026-03-19 16:40:14.62570575 +0000 UTC m=+26.747905038,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.200876 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e4b9475bf0a4e\" is forbidden: User \"system:anonymous\" 
cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 16:40:54 crc kubenswrapper[4918]: &Event{ObjectMeta:{kube-controller-manager-crc.189e4b9475bf0a4e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 19 16:40:54 crc kubenswrapper[4918]: body: Mar 19 16:40:54 crc kubenswrapper[4918]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:40:04.233251406 +0000 UTC m=+16.355450694,LastTimestamp:2026-03-19 16:40:24.234223674 +0000 UTC m=+36.356422972,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 16:40:54 crc kubenswrapper[4918]: > Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.205677 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e4b9475c01075\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e4b9475c01075 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:40:04.233318517 +0000 UTC m=+16.355517805,LastTimestamp:2026-03-19 16:40:24.23442803 +0000 UTC m=+36.356627308,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:40:54 crc kubenswrapper[4918]: E0319 16:40:54.210993 4918 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e4b9475bf0a4e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 16:40:54 crc kubenswrapper[4918]: &Event{ObjectMeta:{kube-controller-manager-crc.189e4b9475bf0a4e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 19 16:40:54 crc kubenswrapper[4918]: body: Mar 19 16:40:54 crc kubenswrapper[4918]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:40:04.233251406 +0000 UTC m=+16.355450694,LastTimestamp:2026-03-19 16:40:34.233702857 +0000 UTC m=+46.355902145,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 16:40:54 crc kubenswrapper[4918]: > Mar 19 16:40:54 crc kubenswrapper[4918]: I0319 16:40:54.508079 4918 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 16:40:54 crc kubenswrapper[4918]: I0319 16:40:54.941498 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:54 crc kubenswrapper[4918]: I0319 16:40:54.942407 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:54 crc kubenswrapper[4918]: I0319 16:40:54.942454 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:54 crc kubenswrapper[4918]: I0319 16:40:54.942466 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:55 crc kubenswrapper[4918]: I0319 16:40:55.509970 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 16:40:56 crc kubenswrapper[4918]: I0319 16:40:56.510955 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 16:40:56 crc kubenswrapper[4918]: I0319 16:40:56.693488 4918 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:40:56 crc kubenswrapper[4918]: I0319 16:40:56.693755 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:56 crc kubenswrapper[4918]: I0319 16:40:56.695076 4918 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:56 crc kubenswrapper[4918]: I0319 16:40:56.695134 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:56 crc kubenswrapper[4918]: I0319 16:40:56.695149 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:56 crc kubenswrapper[4918]: I0319 16:40:56.696109 4918 scope.go:117] "RemoveContainer" containerID="f52a359a1ac292a614b20c79a490412ec1b7e37ecf7dfc7576babdc09dfe0ea2" Mar 19 16:40:56 crc kubenswrapper[4918]: E0319 16:40:56.696282 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 16:40:57 crc kubenswrapper[4918]: I0319 16:40:57.383540 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:40:57 crc kubenswrapper[4918]: I0319 16:40:57.383735 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:57 crc kubenswrapper[4918]: I0319 16:40:57.384953 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:57 crc kubenswrapper[4918]: I0319 16:40:57.385040 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:57 crc kubenswrapper[4918]: I0319 16:40:57.385056 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:57 crc kubenswrapper[4918]: I0319 16:40:57.385929 4918 scope.go:117] 
"RemoveContainer" containerID="f52a359a1ac292a614b20c79a490412ec1b7e37ecf7dfc7576babdc09dfe0ea2" Mar 19 16:40:57 crc kubenswrapper[4918]: E0319 16:40:57.386204 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 16:40:57 crc kubenswrapper[4918]: I0319 16:40:57.511070 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 16:40:58 crc kubenswrapper[4918]: I0319 16:40:58.512415 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 16:40:58 crc kubenswrapper[4918]: E0319 16:40:58.687107 4918 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 16:40:58 crc kubenswrapper[4918]: E0319 16:40:58.999633 4918 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 16:40:59 crc kubenswrapper[4918]: I0319 16:40:59.004862 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:40:59 crc kubenswrapper[4918]: I0319 16:40:59.006562 4918 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 19 16:40:59 crc kubenswrapper[4918]: I0319 16:40:59.006697 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:40:59 crc kubenswrapper[4918]: I0319 16:40:59.006808 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:40:59 crc kubenswrapper[4918]: I0319 16:40:59.006916 4918 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 16:40:59 crc kubenswrapper[4918]: E0319 16:40:59.013302 4918 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 16:40:59 crc kubenswrapper[4918]: I0319 16:40:59.509028 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 16:41:00 crc kubenswrapper[4918]: I0319 16:41:00.511317 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 16:41:00 crc kubenswrapper[4918]: I0319 16:41:00.518499 4918 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 16:41:00 crc kubenswrapper[4918]: I0319 16:41:00.537226 4918 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 19 16:41:01 crc kubenswrapper[4918]: I0319 16:41:01.513481 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource 
"csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 16:41:02 crc kubenswrapper[4918]: I0319 16:41:02.514094 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 16:41:03 crc kubenswrapper[4918]: I0319 16:41:03.510495 4918 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 16:41:03 crc kubenswrapper[4918]: I0319 16:41:03.678600 4918 csr.go:261] certificate signing request csr-vf97g is approved, waiting to be issued Mar 19 16:41:03 crc kubenswrapper[4918]: I0319 16:41:03.691349 4918 csr.go:257] certificate signing request csr-vf97g is issued Mar 19 16:41:03 crc kubenswrapper[4918]: I0319 16:41:03.807718 4918 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 19 16:41:04 crc kubenswrapper[4918]: I0319 16:41:04.343679 4918 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 19 16:41:04 crc kubenswrapper[4918]: I0319 16:41:04.693375 4918 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-11 18:51:02.412408008 +0000 UTC Mar 19 16:41:04 crc kubenswrapper[4918]: I0319 16:41:04.693464 4918 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7154h9m57.718949499s for next certificate rotation Mar 19 16:41:05 crc kubenswrapper[4918]: I0319 16:41:05.203137 4918 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 19 16:41:06 crc kubenswrapper[4918]: I0319 16:41:06.014134 4918 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 19 16:41:06 crc kubenswrapper[4918]: I0319 16:41:06.017623 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:06 crc kubenswrapper[4918]: I0319 16:41:06.017700 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:06 crc kubenswrapper[4918]: I0319 16:41:06.017717 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:06 crc kubenswrapper[4918]: I0319 16:41:06.017847 4918 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 16:41:06 crc kubenswrapper[4918]: I0319 16:41:06.028382 4918 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 19 16:41:06 crc kubenswrapper[4918]: I0319 16:41:06.028864 4918 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 19 16:41:06 crc kubenswrapper[4918]: E0319 16:41:06.028906 4918 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 19 16:41:06 crc kubenswrapper[4918]: I0319 16:41:06.033214 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:06 crc kubenswrapper[4918]: I0319 16:41:06.033460 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:06 crc kubenswrapper[4918]: I0319 16:41:06.033679 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:06 crc kubenswrapper[4918]: I0319 16:41:06.033885 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:06 crc kubenswrapper[4918]: I0319 16:41:06.034041 4918 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:06Z","lastTransitionTime":"2026-03-19T16:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:06 crc kubenswrapper[4918]: E0319 16:41:06.054735 4918 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d23eb629-4a93-4855-a806-6c791cece8cb\\\",\\\"systemUUID\\\":\\\"bb6fd883-4ea6-4b3c-be0c-dda5543e1953\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 16:41:06 crc kubenswrapper[4918]: I0319 16:41:06.059814 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:06 crc kubenswrapper[4918]: I0319 16:41:06.060108 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:06 crc kubenswrapper[4918]: I0319 16:41:06.060293 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:06 crc kubenswrapper[4918]: I0319 16:41:06.060486 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:06 crc kubenswrapper[4918]: I0319 16:41:06.060754 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:06Z","lastTransitionTime":"2026-03-19T16:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:06 crc kubenswrapper[4918]: E0319 16:41:06.074665 4918 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d23eb629-4a93-4855-a806-6c791cece8cb\\\",\\\"systemUUID\\\":\\\"bb6fd883-4ea6-4b3c-be0c-dda5543e1953\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 16:41:06 crc kubenswrapper[4918]: I0319 16:41:06.080622 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:06 crc kubenswrapper[4918]: I0319 16:41:06.080677 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:06 crc kubenswrapper[4918]: I0319 16:41:06.080693 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:06 crc kubenswrapper[4918]: I0319 16:41:06.080715 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:06 crc kubenswrapper[4918]: I0319 16:41:06.080731 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:06Z","lastTransitionTime":"2026-03-19T16:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:06 crc kubenswrapper[4918]: E0319 16:41:06.096682 4918 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d23eb629-4a93-4855-a806-6c791cece8cb\\\",\\\"systemUUID\\\":\\\"bb6fd883-4ea6-4b3c-be0c-dda5543e1953\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 16:41:06 crc kubenswrapper[4918]: I0319 16:41:06.102405 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:06 crc kubenswrapper[4918]: I0319 16:41:06.102445 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:06 crc kubenswrapper[4918]: I0319 16:41:06.102458 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:06 crc kubenswrapper[4918]: I0319 16:41:06.102476 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:06 crc kubenswrapper[4918]: I0319 16:41:06.102489 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:06Z","lastTransitionTime":"2026-03-19T16:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:06 crc kubenswrapper[4918]: E0319 16:41:06.118404 4918 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d23eb629-4a93-4855-a806-6c791cece8cb\\\",\\\"systemUUID\\\":\\\"bb6fd883-4ea6-4b3c-be0c-dda5543e1953\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 16:41:06 crc kubenswrapper[4918]: E0319 16:41:06.118593 4918 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 16:41:06 crc kubenswrapper[4918]: E0319 16:41:06.118632 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:06 crc kubenswrapper[4918]: E0319 16:41:06.218964 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:06 crc kubenswrapper[4918]: E0319 16:41:06.319919 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:06 crc kubenswrapper[4918]: E0319 16:41:06.420769 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:06 crc kubenswrapper[4918]: E0319 16:41:06.521446 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:06 crc kubenswrapper[4918]: E0319 16:41:06.621839 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:06 crc kubenswrapper[4918]: E0319 16:41:06.723289 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:06 crc kubenswrapper[4918]: E0319 16:41:06.824326 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:06 crc kubenswrapper[4918]: E0319 16:41:06.925824 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:07 crc kubenswrapper[4918]: E0319 16:41:07.026741 4918 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:07 crc kubenswrapper[4918]: E0319 16:41:07.127736 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:07 crc kubenswrapper[4918]: E0319 16:41:07.228802 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:07 crc kubenswrapper[4918]: E0319 16:41:07.330046 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:07 crc kubenswrapper[4918]: E0319 16:41:07.430727 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:07 crc kubenswrapper[4918]: E0319 16:41:07.531963 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:07 crc kubenswrapper[4918]: E0319 16:41:07.632491 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:07 crc kubenswrapper[4918]: E0319 16:41:07.733329 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:07 crc kubenswrapper[4918]: E0319 16:41:07.833990 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:07 crc kubenswrapper[4918]: E0319 16:41:07.934569 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:08 crc kubenswrapper[4918]: E0319 16:41:08.035100 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:08 crc kubenswrapper[4918]: E0319 16:41:08.135583 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:08 crc 
kubenswrapper[4918]: E0319 16:41:08.236987 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:08 crc kubenswrapper[4918]: E0319 16:41:08.337581 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:08 crc kubenswrapper[4918]: E0319 16:41:08.437900 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:08 crc kubenswrapper[4918]: E0319 16:41:08.538934 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:08 crc kubenswrapper[4918]: I0319 16:41:08.586146 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:41:08 crc kubenswrapper[4918]: I0319 16:41:08.588034 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:08 crc kubenswrapper[4918]: I0319 16:41:08.588125 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:08 crc kubenswrapper[4918]: I0319 16:41:08.588141 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:08 crc kubenswrapper[4918]: I0319 16:41:08.589042 4918 scope.go:117] "RemoveContainer" containerID="f52a359a1ac292a614b20c79a490412ec1b7e37ecf7dfc7576babdc09dfe0ea2" Mar 19 16:41:08 crc kubenswrapper[4918]: E0319 16:41:08.589255 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 
16:41:08 crc kubenswrapper[4918]: E0319 16:41:08.640137 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:08 crc kubenswrapper[4918]: E0319 16:41:08.687326 4918 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 16:41:08 crc kubenswrapper[4918]: E0319 16:41:08.740360 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:08 crc kubenswrapper[4918]: E0319 16:41:08.840746 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:08 crc kubenswrapper[4918]: E0319 16:41:08.941936 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:09 crc kubenswrapper[4918]: E0319 16:41:09.043071 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:09 crc kubenswrapper[4918]: E0319 16:41:09.144220 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:09 crc kubenswrapper[4918]: E0319 16:41:09.245096 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:09 crc kubenswrapper[4918]: E0319 16:41:09.346287 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:09 crc kubenswrapper[4918]: E0319 16:41:09.447088 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:09 crc kubenswrapper[4918]: E0319 16:41:09.547582 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:09 crc kubenswrapper[4918]: E0319 16:41:09.648261 4918 kubelet_node_status.go:503] "Error getting 
the current node from lister" err="node \"crc\" not found" Mar 19 16:41:09 crc kubenswrapper[4918]: E0319 16:41:09.748870 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:09 crc kubenswrapper[4918]: E0319 16:41:09.849419 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:09 crc kubenswrapper[4918]: E0319 16:41:09.950139 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:10 crc kubenswrapper[4918]: E0319 16:41:10.050898 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:10 crc kubenswrapper[4918]: E0319 16:41:10.151979 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:10 crc kubenswrapper[4918]: E0319 16:41:10.252774 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:10 crc kubenswrapper[4918]: I0319 16:41:10.336309 4918 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 19 16:41:10 crc kubenswrapper[4918]: E0319 16:41:10.353154 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:10 crc kubenswrapper[4918]: E0319 16:41:10.453763 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:10 crc kubenswrapper[4918]: E0319 16:41:10.554216 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:10 crc kubenswrapper[4918]: I0319 16:41:10.585817 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:41:10 crc kubenswrapper[4918]: I0319 16:41:10.587930 4918 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:10 crc kubenswrapper[4918]: I0319 16:41:10.587997 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:10 crc kubenswrapper[4918]: I0319 16:41:10.588025 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:10 crc kubenswrapper[4918]: E0319 16:41:10.654914 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:10 crc kubenswrapper[4918]: E0319 16:41:10.755599 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:10 crc kubenswrapper[4918]: E0319 16:41:10.856239 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:10 crc kubenswrapper[4918]: E0319 16:41:10.957389 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:11 crc kubenswrapper[4918]: E0319 16:41:11.058545 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:11 crc kubenswrapper[4918]: E0319 16:41:11.159027 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:11 crc kubenswrapper[4918]: E0319 16:41:11.259152 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:11 crc kubenswrapper[4918]: E0319 16:41:11.359737 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:11 crc kubenswrapper[4918]: E0319 16:41:11.460873 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:11 crc 
kubenswrapper[4918]: E0319 16:41:11.562050 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:11 crc kubenswrapper[4918]: E0319 16:41:11.663170 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:11 crc kubenswrapper[4918]: E0319 16:41:11.764227 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:11 crc kubenswrapper[4918]: E0319 16:41:11.865433 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:11 crc kubenswrapper[4918]: E0319 16:41:11.966682 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:12 crc kubenswrapper[4918]: E0319 16:41:12.066892 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:12 crc kubenswrapper[4918]: E0319 16:41:12.167648 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:12 crc kubenswrapper[4918]: E0319 16:41:12.267764 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:12 crc kubenswrapper[4918]: E0319 16:41:12.368019 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:12 crc kubenswrapper[4918]: E0319 16:41:12.468185 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:12 crc kubenswrapper[4918]: E0319 16:41:12.569135 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:12 crc kubenswrapper[4918]: E0319 16:41:12.669354 4918 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 19 16:41:12 crc kubenswrapper[4918]: E0319 16:41:12.769905 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:12 crc kubenswrapper[4918]: E0319 16:41:12.871060 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:12 crc kubenswrapper[4918]: I0319 16:41:12.881392 4918 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 19 16:41:12 crc kubenswrapper[4918]: E0319 16:41:12.971389 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:13 crc kubenswrapper[4918]: E0319 16:41:13.072487 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:13 crc kubenswrapper[4918]: E0319 16:41:13.173295 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:13 crc kubenswrapper[4918]: E0319 16:41:13.274241 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:13 crc kubenswrapper[4918]: E0319 16:41:13.375258 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:13 crc kubenswrapper[4918]: E0319 16:41:13.476197 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:13 crc kubenswrapper[4918]: E0319 16:41:13.577060 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:13 crc kubenswrapper[4918]: E0319 16:41:13.677200 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:13 crc kubenswrapper[4918]: E0319 16:41:13.778292 4918 kubelet_node_status.go:503] "Error 
getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:13 crc kubenswrapper[4918]: E0319 16:41:13.878702 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:13 crc kubenswrapper[4918]: E0319 16:41:13.979477 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:14 crc kubenswrapper[4918]: E0319 16:41:14.080292 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:14 crc kubenswrapper[4918]: E0319 16:41:14.180876 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:14 crc kubenswrapper[4918]: E0319 16:41:14.282106 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:14 crc kubenswrapper[4918]: E0319 16:41:14.382400 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:14 crc kubenswrapper[4918]: E0319 16:41:14.483488 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:14 crc kubenswrapper[4918]: E0319 16:41:14.584092 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:14 crc kubenswrapper[4918]: E0319 16:41:14.684493 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:14 crc kubenswrapper[4918]: E0319 16:41:14.785729 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:14 crc kubenswrapper[4918]: E0319 16:41:14.886086 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:14 crc kubenswrapper[4918]: E0319 16:41:14.987237 
4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:15 crc kubenswrapper[4918]: E0319 16:41:15.087780 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:15 crc kubenswrapper[4918]: E0319 16:41:15.188348 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:15 crc kubenswrapper[4918]: E0319 16:41:15.289112 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:15 crc kubenswrapper[4918]: E0319 16:41:15.389953 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:15 crc kubenswrapper[4918]: E0319 16:41:15.490872 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:15 crc kubenswrapper[4918]: E0319 16:41:15.591492 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:15 crc kubenswrapper[4918]: E0319 16:41:15.692224 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:15 crc kubenswrapper[4918]: E0319 16:41:15.792961 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:15 crc kubenswrapper[4918]: E0319 16:41:15.894151 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:15 crc kubenswrapper[4918]: E0319 16:41:15.995261 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:16 crc kubenswrapper[4918]: E0319 16:41:16.095816 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:16 crc 
kubenswrapper[4918]: E0319 16:41:16.135425 4918 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 19 16:41:16 crc kubenswrapper[4918]: I0319 16:41:16.142514 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:16 crc kubenswrapper[4918]: I0319 16:41:16.142583 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:16 crc kubenswrapper[4918]: I0319 16:41:16.142641 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:16 crc kubenswrapper[4918]: I0319 16:41:16.142667 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:16 crc kubenswrapper[4918]: I0319 16:41:16.142684 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:16Z","lastTransitionTime":"2026-03-19T16:41:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:16 crc kubenswrapper[4918]: E0319 16:41:16.160212 4918 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d23eb629-4a93-4855-a806-6c791cece8cb\\\",\\\"systemUUID\\\":\\\"bb6fd883-4ea6-4b3c-be0c-dda5543e1953\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 16:41:16 crc kubenswrapper[4918]: I0319 16:41:16.166926 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:16 crc kubenswrapper[4918]: I0319 16:41:16.166992 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:16 crc kubenswrapper[4918]: I0319 16:41:16.167009 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:16 crc kubenswrapper[4918]: I0319 16:41:16.167031 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:16 crc kubenswrapper[4918]: I0319 16:41:16.167049 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:16Z","lastTransitionTime":"2026-03-19T16:41:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:16 crc kubenswrapper[4918]: E0319 16:41:16.184643 4918 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d23eb629-4a93-4855-a806-6c791cece8cb\\\",\\\"systemUUID\\\":\\\"bb6fd883-4ea6-4b3c-be0c-dda5543e1953\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 16:41:16 crc kubenswrapper[4918]: I0319 16:41:16.191413 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:16 crc kubenswrapper[4918]: I0319 16:41:16.191473 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:16 crc kubenswrapper[4918]: I0319 16:41:16.191489 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:16 crc kubenswrapper[4918]: I0319 16:41:16.191510 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:16 crc kubenswrapper[4918]: I0319 16:41:16.191548 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:16Z","lastTransitionTime":"2026-03-19T16:41:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:16 crc kubenswrapper[4918]: E0319 16:41:16.210450 4918 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d23eb629-4a93-4855-a806-6c791cece8cb\\\",\\\"systemUUID\\\":\\\"bb6fd883-4ea6-4b3c-be0c-dda5543e1953\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 16:41:16 crc kubenswrapper[4918]: I0319 16:41:16.216005 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:16 crc kubenswrapper[4918]: I0319 16:41:16.216264 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:16 crc kubenswrapper[4918]: I0319 16:41:16.216405 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:16 crc kubenswrapper[4918]: I0319 16:41:16.216637 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:16 crc kubenswrapper[4918]: I0319 16:41:16.216832 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:16Z","lastTransitionTime":"2026-03-19T16:41:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:16 crc kubenswrapper[4918]: E0319 16:41:16.235155 4918 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d23eb629-4a93-4855-a806-6c791cece8cb\\\",\\\"systemUUID\\\":\\\"bb6fd883-4ea6-4b3c-be0c-dda5543e1953\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 16:41:16 crc kubenswrapper[4918]: E0319 16:41:16.235369 4918 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 16:41:16 crc kubenswrapper[4918]: E0319 16:41:16.235412 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:16 crc kubenswrapper[4918]: E0319 16:41:16.336324 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:16 crc kubenswrapper[4918]: E0319 16:41:16.437060 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:16 crc kubenswrapper[4918]: E0319 16:41:16.537629 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:16 crc kubenswrapper[4918]: E0319 16:41:16.638603 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:16 crc kubenswrapper[4918]: E0319 16:41:16.739134 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:16 crc kubenswrapper[4918]: E0319 16:41:16.839636 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:16 crc kubenswrapper[4918]: E0319 16:41:16.940424 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:17 crc kubenswrapper[4918]: E0319 16:41:17.040967 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:17 crc kubenswrapper[4918]: E0319 16:41:17.141174 4918 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:17 crc kubenswrapper[4918]: E0319 16:41:17.243244 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:17 crc kubenswrapper[4918]: E0319 16:41:17.343736 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:17 crc kubenswrapper[4918]: E0319 16:41:17.444790 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:17 crc kubenswrapper[4918]: E0319 16:41:17.545569 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:17 crc kubenswrapper[4918]: E0319 16:41:17.646310 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:17 crc kubenswrapper[4918]: E0319 16:41:17.747889 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:17 crc kubenswrapper[4918]: E0319 16:41:17.848649 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:17 crc kubenswrapper[4918]: E0319 16:41:17.949669 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:18 crc kubenswrapper[4918]: E0319 16:41:18.050439 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:18 crc kubenswrapper[4918]: E0319 16:41:18.151020 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:18 crc kubenswrapper[4918]: E0319 16:41:18.252271 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:18 crc 
kubenswrapper[4918]: E0319 16:41:18.352630 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:18 crc kubenswrapper[4918]: E0319 16:41:18.453500 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:18 crc kubenswrapper[4918]: E0319 16:41:18.554117 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:18 crc kubenswrapper[4918]: E0319 16:41:18.654916 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:18 crc kubenswrapper[4918]: E0319 16:41:18.688824 4918 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 16:41:18 crc kubenswrapper[4918]: E0319 16:41:18.756056 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:18 crc kubenswrapper[4918]: E0319 16:41:18.856571 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:18 crc kubenswrapper[4918]: E0319 16:41:18.957991 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:19 crc kubenswrapper[4918]: E0319 16:41:19.058325 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:19 crc kubenswrapper[4918]: E0319 16:41:19.158914 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:19 crc kubenswrapper[4918]: E0319 16:41:19.259758 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:19 crc kubenswrapper[4918]: E0319 16:41:19.360949 4918 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 19 16:41:19 crc kubenswrapper[4918]: E0319 16:41:19.461828 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:19 crc kubenswrapper[4918]: E0319 16:41:19.562272 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:19 crc kubenswrapper[4918]: E0319 16:41:19.663385 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:19 crc kubenswrapper[4918]: E0319 16:41:19.763976 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:19 crc kubenswrapper[4918]: E0319 16:41:19.864750 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:19 crc kubenswrapper[4918]: E0319 16:41:19.965831 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:20 crc kubenswrapper[4918]: E0319 16:41:20.066762 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:20 crc kubenswrapper[4918]: E0319 16:41:20.168301 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:20 crc kubenswrapper[4918]: E0319 16:41:20.270440 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:20 crc kubenswrapper[4918]: E0319 16:41:20.371141 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:20 crc kubenswrapper[4918]: E0319 16:41:20.472162 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:20 crc kubenswrapper[4918]: E0319 16:41:20.572335 4918 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:20 crc kubenswrapper[4918]: I0319 16:41:20.585882 4918 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 16:41:20 crc kubenswrapper[4918]: I0319 16:41:20.587332 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:20 crc kubenswrapper[4918]: I0319 16:41:20.587367 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:20 crc kubenswrapper[4918]: I0319 16:41:20.587378 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:20 crc kubenswrapper[4918]: I0319 16:41:20.588092 4918 scope.go:117] "RemoveContainer" containerID="f52a359a1ac292a614b20c79a490412ec1b7e37ecf7dfc7576babdc09dfe0ea2" Mar 19 16:41:20 crc kubenswrapper[4918]: E0319 16:41:20.588277 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 16:41:20 crc kubenswrapper[4918]: E0319 16:41:20.673417 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:20 crc kubenswrapper[4918]: E0319 16:41:20.774586 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:20 crc kubenswrapper[4918]: E0319 16:41:20.874845 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:20 crc kubenswrapper[4918]: E0319 
16:41:20.975343 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:21 crc kubenswrapper[4918]: E0319 16:41:21.076254 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:21 crc kubenswrapper[4918]: E0319 16:41:21.176618 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:21 crc kubenswrapper[4918]: E0319 16:41:21.277106 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:21 crc kubenswrapper[4918]: E0319 16:41:21.377451 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:21 crc kubenswrapper[4918]: E0319 16:41:21.478335 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:21 crc kubenswrapper[4918]: E0319 16:41:21.579280 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:21 crc kubenswrapper[4918]: E0319 16:41:21.680365 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:21 crc kubenswrapper[4918]: E0319 16:41:21.780869 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:21 crc kubenswrapper[4918]: E0319 16:41:21.881512 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:21 crc kubenswrapper[4918]: E0319 16:41:21.981706 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:22 crc kubenswrapper[4918]: E0319 16:41:22.082321 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 
16:41:22 crc kubenswrapper[4918]: E0319 16:41:22.182900 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:22 crc kubenswrapper[4918]: E0319 16:41:22.283962 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:22 crc kubenswrapper[4918]: E0319 16:41:22.384121 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:22 crc kubenswrapper[4918]: E0319 16:41:22.485206 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:22 crc kubenswrapper[4918]: E0319 16:41:22.585538 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:22 crc kubenswrapper[4918]: E0319 16:41:22.686481 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:22 crc kubenswrapper[4918]: E0319 16:41:22.787769 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:22 crc kubenswrapper[4918]: E0319 16:41:22.887932 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:22 crc kubenswrapper[4918]: E0319 16:41:22.989082 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:23 crc kubenswrapper[4918]: E0319 16:41:23.089511 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:23 crc kubenswrapper[4918]: E0319 16:41:23.190185 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:23 crc kubenswrapper[4918]: E0319 16:41:23.291336 4918 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 19 16:41:23 crc kubenswrapper[4918]: E0319 16:41:23.392650 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:23 crc kubenswrapper[4918]: E0319 16:41:23.492875 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:23 crc kubenswrapper[4918]: E0319 16:41:23.593445 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:23 crc kubenswrapper[4918]: E0319 16:41:23.693613 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:23 crc kubenswrapper[4918]: E0319 16:41:23.794415 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:23 crc kubenswrapper[4918]: E0319 16:41:23.894592 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:23 crc kubenswrapper[4918]: E0319 16:41:23.995668 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:24 crc kubenswrapper[4918]: E0319 16:41:24.096719 4918 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.147227 4918 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.199719 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.200516 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.200633 4918 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.200718 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.200810 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:24Z","lastTransitionTime":"2026-03-19T16:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.304185 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.304242 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.304254 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.304273 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.304287 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:24Z","lastTransitionTime":"2026-03-19T16:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.406455 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.406547 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.406625 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.406661 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.406680 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:24Z","lastTransitionTime":"2026-03-19T16:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.510690 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.510783 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.510803 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.510836 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.510860 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:24Z","lastTransitionTime":"2026-03-19T16:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.542100 4918 apiserver.go:52] "Watching apiserver" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.547093 4918 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.547638 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.548170 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 16:41:24 crc kubenswrapper[4918]: E0319 16:41:24.548253 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.548301 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.548720 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.548808 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.548971 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.549233 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 16:41:24 crc kubenswrapper[4918]: E0319 16:41:24.549470 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 16:41:24 crc kubenswrapper[4918]: E0319 16:41:24.549543 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.551015 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.551213 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.551813 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.552110 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.552267 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.552310 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.553740 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.554124 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.557995 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.583865 4918 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.605195 4918 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.610336 4918 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.613880 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.613915 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.613929 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.613947 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.613962 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:24Z","lastTransitionTime":"2026-03-19T16:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.620357 4918 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.633600 4918 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.653334 4918 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659011 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659067 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659087 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659106 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659123 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659148 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659165 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659184 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 16:41:24 crc 
kubenswrapper[4918]: I0319 16:41:24.659200 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659216 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659391 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659415 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659432 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659452 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659472 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659492 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659507 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659545 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659560 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 16:41:24 crc 
kubenswrapper[4918]: I0319 16:41:24.659579 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659596 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659570 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659612 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659717 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659748 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659778 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659806 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659831 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659858 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659882 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659892 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659905 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659935 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659959 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.659989 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660016 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660136 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660164 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660167 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660216 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660266 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " 
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660285 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660309 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660328 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660348 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660367 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660386 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: 
\"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660387 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660407 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660390 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660434 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660479 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660499 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660514 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660544 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660564 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660580 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660599 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660617 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660639 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660661 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 19 16:41:24 crc 
kubenswrapper[4918]: I0319 16:41:24.660660 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660665 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660680 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660749 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660779 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660802 4918 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660820 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660836 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660865 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660842 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660903 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.660898 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.661037 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.661106 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.661166 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.661219 4918 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.661265 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.661315 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.661372 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.661428 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.661477 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" 
(UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.661571 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.661627 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.661676 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.661729 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.661778 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.661826 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.661891 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.661928 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.661963 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.662014 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.662068 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.662106 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.662149 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.662198 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.662241 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.662285 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.662323 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.662357 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.662394 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.662438 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.662474 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.662508 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 19 
16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.663849 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.663925 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.663980 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.664033 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.664227 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.664295 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.664356 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.664413 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.664504 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.664600 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.664649 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 19 16:41:24 
crc kubenswrapper[4918]: I0319 16:41:24.664697 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.664759 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.664808 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.664859 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.664914 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.664965 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.665016 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.665074 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.665125 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.665169 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.665732 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 
16:41:24.665785 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.665823 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.665860 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.665896 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.665934 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.665979 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.668241 4918 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.661112 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.676345 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.676408 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.676359 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.661484 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.661571 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.661695 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.661913 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.662694 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.662748 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.663010 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.663616 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.663935 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.665669 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.676735 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.666343 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.666372 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.667503 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.667700 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.667739 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.667893 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.667867 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.668034 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.668320 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.668653 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.669083 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.669182 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.669422 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.669484 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.669618 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.670954 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.671682 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.671830 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.671925 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.671941 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.672124 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.672553 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.672698 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.672768 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.672816 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.673751 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.673872 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.673980 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.674105 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.674123 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.674139 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.674351 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.674926 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.675042 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.675087 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.675229 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.675229 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.675326 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.675599 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.661222 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.677323 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.677321 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.677382 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.676099 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.676290 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: E0319 16:41:24.666111 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:25.166080697 +0000 UTC m=+97.288279965 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.677564 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.677624 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.677689 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.677737 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.677792 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.677836 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.677874 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.677878 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.677904 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.675838 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.677922 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.677959 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.677994 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.678031 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.678068 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.678105 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" 
(UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.675681 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.678140 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.678243 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.675743 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.679016 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.679635 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.679696 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.679734 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.679775 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 19 16:41:24 crc 
kubenswrapper[4918]: I0319 16:41:24.679816 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.679873 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.679924 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.679963 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.680014 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.681688 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.681733 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.681767 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.681803 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.681847 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.681880 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 
16:41:24.682750 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.682825 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.682860 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.682895 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.682927 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.682994 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.683427 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.683482 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.683517 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.683577 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.683610 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.683644 4918 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.683685 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.683703 4918 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.683727 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.683957 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.683993 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684022 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684050 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684080 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684114 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 
16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684142 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684170 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684200 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684231 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684254 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684276 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684298 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684321 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684344 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684377 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684408 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 19 16:41:24 crc 
kubenswrapper[4918]: I0319 16:41:24.684425 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684444 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684464 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684512 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684547 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684566 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684586 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684605 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684624 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684646 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684666 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: 
\"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684686 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684710 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684731 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684785 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684810 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684835 4918 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684861 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684886 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684905 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684924 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684945 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684965 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684988 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685009 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685028 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685050 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685071 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685157 4918 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685171 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685183 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685195 4918 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685205 4918 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685215 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685224 4918 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685235 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685246 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685257 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685268 4918 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685278 4918 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685287 4918 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685296 4918 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685306 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685315 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685325 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685335 4918 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" 
DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685345 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685354 4918 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685364 4918 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685373 4918 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685382 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685391 4918 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685401 4918 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685410 4918 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685420 4918 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685434 4918 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685445 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685459 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685468 4918 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685478 4918 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685487 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685497 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685507 4918 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685518 4918 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685546 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685558 4918 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685569 4918 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685580 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685590 4918 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685600 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685610 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685621 4918 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685631 4918 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685642 4918 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685652 4918 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" 
DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685663 4918 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685673 4918 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685685 4918 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685694 4918 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685705 4918 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685715 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685725 4918 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 
16:41:24.685735 4918 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685745 4918 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685755 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685764 4918 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685776 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685789 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685797 4918 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685808 4918 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685818 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685827 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685837 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685846 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685856 4918 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685865 4918 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685875 4918 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath 
\"\""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685884 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685894 4918 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685903 4918 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685913 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685923 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685933 4918 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685942 4918 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685952 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685962 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.680338 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.680509 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.680768 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.686345 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.681064 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.681137 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.681185 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.681616 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.686393 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.686427 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.681628 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.682121 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.682147 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.682236 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.682406 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.682568 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.682559 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.683268 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.683383 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.683683 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.684602 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.685084 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.686032 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.686037 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.686667 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.686884 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.687078 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.687265 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.687875 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.687998 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.688009 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.688421 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.688735 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.688936 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.688946 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.689028 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.689086 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: E0319 16:41:24.689139 4918 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 19 16:41:24 crc kubenswrapper[4918]: E0319 16:41:24.689301 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 16:41:25.189240342 +0000 UTC m=+97.311439810 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.689303 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.689441 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.689687 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.689762 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.689757 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.689843 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.690147 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.690061 4918 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.691092 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.691223 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.691656 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.691771 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.691978 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.692096 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.692110 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.692131 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: E0319 16:41:24.692343 4918 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 19 16:41:24 crc kubenswrapper[4918]: E0319 16:41:24.692432 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 16:41:25.192407396 +0000 UTC m=+97.314606884 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.692852 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.692908 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.693016 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.693304 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.693621 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.694139 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.694209 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.694339 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.694351 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.694751 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.694876 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.694983 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.695041 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.695082 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.695095 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.695141 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.695105 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.695803 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.695841 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.696668 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.696706 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.696714 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.696844 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.697295 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.697325 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.697386 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.697547 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.698064 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.698106 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.698120 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.699218 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.699674 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.702279 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.704290 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.704313 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.704663 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.704693 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.704823 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.707340 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.712926 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.713082 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.714037 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.714183 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.714321 4918 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.716319 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 16:41:24 crc kubenswrapper[4918]: E0319 16:41:24.716349 4918 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 16:41:24 crc kubenswrapper[4918]: E0319 16:41:24.716470 4918 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 16:41:24 crc kubenswrapper[4918]: E0319 16:41:24.716491 4918 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 16:41:24 crc kubenswrapper[4918]: E0319 16:41:24.716600 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 16:41:25.216578262 +0000 UTC m=+97.338777730 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.716722 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: E0319 16:41:24.717024 4918 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 16:41:24 crc kubenswrapper[4918]: E0319 16:41:24.717047 4918 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 16:41:24 crc kubenswrapper[4918]: E0319 16:41:24.717056 4918 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 16:41:24 crc kubenswrapper[4918]: E0319 16:41:24.717108 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 16:41:25.217095967 +0000 UTC m=+97.339295215 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.717772 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.717833 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.718799 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.719166 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.719196 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.719255 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.719287 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.719466 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.719931 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.720205 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.721057 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.721577 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.721618 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.721788 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.721822 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.721834 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.721853 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.721868 4918 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:24Z","lastTransitionTime":"2026-03-19T16:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.721928 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.721914 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.721981 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.722581 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.722842 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.726288 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.726508 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.726634 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.727509 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.732471 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.734277 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.744347 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.751588 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.755187 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.787227 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.787398 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.787497 4918 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") 
on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.787408 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.787446 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.787577 4918 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.787764 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.787823 4918 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.787878 4918 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.787929 4918 reconciler_common.go:293] "Volume detached 
for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.787986 4918 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.788048 4918 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.788119 4918 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.788177 4918 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.788229 4918 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.788282 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.788342 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.788398 4918 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.788451 4918 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.788501 4918 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.788569 4918 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.788659 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.788727 4918 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.788784 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.788844 4918 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.788894 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.788943 4918 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.788997 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.789060 4918 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.789128 4918 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.789187 4918 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc 
kubenswrapper[4918]: I0319 16:41:24.789238 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.789291 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.789590 4918 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.789673 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.789724 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.789782 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.789833 4918 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.789887 4918 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.789946 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.790002 4918 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.790073 4918 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.790136 4918 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.790189 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.790239 4918 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.790295 4918 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.790349 4918 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.790403 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.790461 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.790533 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.790601 4918 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.790658 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.790717 4918 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.790777 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.790840 4918 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.790898 4918 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.790956 4918 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.791012 4918 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.791070 4918 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.791121 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node 
\"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.791174 4918 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.791228 4918 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.791287 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.791339 4918 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.791390 4918 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.791442 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.791492 4918 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.791564 4918 
reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.791626 4918 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.791683 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.791734 4918 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.791787 4918 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.791841 4918 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.791896 4918 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.791950 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: 
\"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.792003 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.792064 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.792135 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.792189 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.792279 4918 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.792336 4918 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.792388 4918 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" 
DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.792438 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.792492 4918 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.792563 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.792637 4918 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.792693 4918 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.792749 4918 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.792800 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.792860 
4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.792919 4918 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.792976 4918 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.793027 4918 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.793109 4918 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.793166 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.793228 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.793283 4918 reconciler_common.go:293] "Volume detached 
for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.793338 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.793396 4918 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.793446 4918 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.793495 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.793566 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.793620 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.793678 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.793728 4918 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.793778 4918 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.793833 4918 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.793889 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.793945 4918 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.794000 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.794051 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.794105 4918 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.794160 4918 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.794212 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.794265 4918 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.794317 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.794372 4918 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.794428 4918 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on 
node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.794483 4918 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.794548 4918 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.824266 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.824331 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.824348 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.824370 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.824385 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:24Z","lastTransitionTime":"2026-03-19T16:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.871039 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.881007 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.889593 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 16:41:24 crc kubenswrapper[4918]: W0319 16:41:24.893507 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-ac736f5586bd6932872a786e7a74875ec35e0edb7c76ecf8ebc3ccbec76706bf WatchSource:0}: Error finding container ac736f5586bd6932872a786e7a74875ec35e0edb7c76ecf8ebc3ccbec76706bf: Status 404 returned error can't find the container with id ac736f5586bd6932872a786e7a74875ec35e0edb7c76ecf8ebc3ccbec76706bf Mar 19 16:41:24 crc kubenswrapper[4918]: W0319 16:41:24.906064 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-ca27dd853a8ad286d7a1429dc01576e5c5c67e66282070cfc6dbf9ac15bb6fc0 WatchSource:0}: Error finding container ca27dd853a8ad286d7a1429dc01576e5c5c67e66282070cfc6dbf9ac15bb6fc0: Status 404 returned error can't find the container with id ca27dd853a8ad286d7a1429dc01576e5c5c67e66282070cfc6dbf9ac15bb6fc0 Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.928341 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.928387 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.928397 4918 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.928418 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:24 crc kubenswrapper[4918]: I0319 16:41:24.928428 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:24Z","lastTransitionTime":"2026-03-19T16:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.031677 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.031741 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.031756 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.031780 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.031796 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:25Z","lastTransitionTime":"2026-03-19T16:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.043981 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ca27dd853a8ad286d7a1429dc01576e5c5c67e66282070cfc6dbf9ac15bb6fc0"} Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.044937 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ac736f5586bd6932872a786e7a74875ec35e0edb7c76ecf8ebc3ccbec76706bf"} Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.046628 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c5c8360fc00b8b8af37362e0ac8db85760187c369f6d1f5850d74b7d62b20caa"} Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.134326 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.134366 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.134376 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.134391 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.134401 4918 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:25Z","lastTransitionTime":"2026-03-19T16:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.197419 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.197537 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 16:41:25 crc kubenswrapper[4918]: E0319 16:41:25.197596 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:26.197571928 +0000 UTC m=+98.319771176 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:25 crc kubenswrapper[4918]: E0319 16:41:25.197652 4918 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.197655 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 16:41:25 crc kubenswrapper[4918]: E0319 16:41:25.197694 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 16:41:26.197683431 +0000 UTC m=+98.319882679 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 16:41:25 crc kubenswrapper[4918]: E0319 16:41:25.197785 4918 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 16:41:25 crc kubenswrapper[4918]: E0319 16:41:25.197824 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 16:41:26.197813995 +0000 UTC m=+98.320013243 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.237762 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.237815 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.237837 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.237862 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 
19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.237881 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:25Z","lastTransitionTime":"2026-03-19T16:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.298637 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.298705 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 16:41:25 crc kubenswrapper[4918]: E0319 16:41:25.298909 4918 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 16:41:25 crc kubenswrapper[4918]: E0319 16:41:25.298939 4918 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 16:41:25 crc kubenswrapper[4918]: E0319 16:41:25.298960 4918 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 16:41:25 crc kubenswrapper[4918]: E0319 16:41:25.299050 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 16:41:26.299025511 +0000 UTC m=+98.421224799 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 16:41:25 crc kubenswrapper[4918]: E0319 16:41:25.299065 4918 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 16:41:25 crc kubenswrapper[4918]: E0319 16:41:25.299154 4918 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 16:41:25 crc kubenswrapper[4918]: E0319 16:41:25.299177 4918 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 16:41:25 crc kubenswrapper[4918]: E0319 16:41:25.299297 4918 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 16:41:26.299262878 +0000 UTC m=+98.421462156 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.341392 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.341439 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.341455 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.341479 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.341501 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:25Z","lastTransitionTime":"2026-03-19T16:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.444906 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.444989 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.445014 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.445046 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.445070 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:25Z","lastTransitionTime":"2026-03-19T16:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.548460 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.548516 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.548572 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.548601 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.548619 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:25Z","lastTransitionTime":"2026-03-19T16:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.586428 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 16:41:25 crc kubenswrapper[4918]: E0319 16:41:25.586686 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.651809 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.651881 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.651904 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.651931 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.651950 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:25Z","lastTransitionTime":"2026-03-19T16:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.754676 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.754739 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.754759 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.754784 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.754799 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:25Z","lastTransitionTime":"2026-03-19T16:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.858652 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.858710 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.858721 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.858741 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.858756 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:25Z","lastTransitionTime":"2026-03-19T16:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.961660 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.961723 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.961740 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.961762 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:25 crc kubenswrapper[4918]: I0319 16:41:25.961778 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:25Z","lastTransitionTime":"2026-03-19T16:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.051727 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"791144d399ce77d3150c44c91664c5148bcd40c99e23e0e2ff1245946a67e12b"} Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.055007 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3ce181300415b057d8a46c3c3e17b5c65d2fbeb3c48831758f03df7672e31cf4"} Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.055057 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"663e55d3611c1618d648bf72dad19268812eea6ad904d4f61052358a175deb5f"} Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.064818 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.065170 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.065253 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.065347 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.065429 4918 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:26Z","lastTransitionTime":"2026-03-19T16:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.072057 4918 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:41:26Z is after 2025-08-24T17:21:41Z" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.089632 4918 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:41:26Z is after 2025-08-24T17:21:41Z" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.108729 4918 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:41:26Z is after 2025-08-24T17:21:41Z" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.133253 4918 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:41:26Z is after 2025-08-24T17:21:41Z" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.148112 4918 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:41:26Z is after 2025-08-24T17:21:41Z" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.166646 4918 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://791144d399ce77d3150c44c91664c5148bcd40c99e23e0e2ff1245946a67e12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T16:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T16:41:26Z is after 2025-08-24T17:21:41Z" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.169270 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.169390 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.169467 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.169565 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.169644 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:26Z","lastTransitionTime":"2026-03-19T16:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.180476 4918 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://791144d399ce77d3150c44c91664c5148bcd40c99e23e0e2ff1245946a67e12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T16:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:41:26Z is after 2025-08-24T17:21:41Z" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.197314 4918 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:41:26Z is after 2025-08-24T17:21:41Z" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.208466 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.208923 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.209083 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 16:41:26 crc kubenswrapper[4918]: E0319 16:41:26.209655 4918 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 16:41:26 crc kubenswrapper[4918]: E0319 16:41:26.209743 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:28.209700625 +0000 UTC m=+100.331899873 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:26 crc kubenswrapper[4918]: E0319 16:41:26.209656 4918 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 16:41:26 crc kubenswrapper[4918]: E0319 16:41:26.209851 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 16:41:28.209815968 +0000 UTC m=+100.332015216 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 16:41:26 crc kubenswrapper[4918]: E0319 16:41:26.209955 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 16:41:28.209906701 +0000 UTC m=+100.332105959 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.221119 4918 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce181300415b057d8a46c3c3e17b5c65d2fbeb3c48831758f03df7672e31cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T16:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://663e55d3611c1618d648bf72dad19268812eea6ad904d4f61052358a175deb5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T16:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:41:26Z is after 2025-08-24T17:21:41Z" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.235374 4918 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:41:26Z is after 2025-08-24T17:21:41Z" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.248506 4918 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:41:26Z is after 2025-08-24T17:21:41Z" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.260787 4918 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:41:26Z is after 2025-08-24T17:21:41Z" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.273275 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.273682 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.273753 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.273814 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.273873 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:26Z","lastTransitionTime":"2026-03-19T16:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.310469 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.310664 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 16:41:26 crc kubenswrapper[4918]: E0319 16:41:26.310959 4918 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 16:41:26 crc kubenswrapper[4918]: 
E0319 16:41:26.311019 4918 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 16:41:26 crc kubenswrapper[4918]: E0319 16:41:26.311037 4918 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 16:41:26 crc kubenswrapper[4918]: E0319 16:41:26.311116 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 16:41:28.311088746 +0000 UTC m=+100.433288084 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 16:41:26 crc kubenswrapper[4918]: E0319 16:41:26.311298 4918 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 16:41:26 crc kubenswrapper[4918]: E0319 16:41:26.311392 4918 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 16:41:26 crc kubenswrapper[4918]: E0319 16:41:26.311458 4918 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for 
pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 16:41:26 crc kubenswrapper[4918]: E0319 16:41:26.311586 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 16:41:28.31156544 +0000 UTC m=+100.433764688 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.377448 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.377785 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.377921 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.378000 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.378075 4918 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:26Z","lastTransitionTime":"2026-03-19T16:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.480982 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.481190 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.481273 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.481390 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.481457 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:26Z","lastTransitionTime":"2026-03-19T16:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.503599 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.503655 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.503670 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.503691 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.503708 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:26Z","lastTransitionTime":"2026-03-19T16:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:26 crc kubenswrapper[4918]: E0319 16:41:26.517769 4918 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d23eb629-4a93-4855-a806-6c791cece8cb\\\",\\\"systemUUID\\\":\\\"bb6fd883-4ea6-4b3c-be0c-dda5543e1953\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:41:26Z is after 2025-08-24T17:21:41Z" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.521983 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.522079 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.522136 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.522202 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.522259 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:26Z","lastTransitionTime":"2026-03-19T16:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:26 crc kubenswrapper[4918]: E0319 16:41:26.535425 4918 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d23eb629-4a93-4855-a806-6c791cece8cb\\\",\\\"systemUUID\\\":\\\"bb6fd883-4ea6-4b3c-be0c-dda5543e1953\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:41:26Z is after 2025-08-24T17:21:41Z" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.539502 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.539569 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.539583 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.539618 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.539632 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:26Z","lastTransitionTime":"2026-03-19T16:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:26 crc kubenswrapper[4918]: E0319 16:41:26.554854 4918 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d23eb629-4a93-4855-a806-6c791cece8cb\\\",\\\"systemUUID\\\":\\\"bb6fd883-4ea6-4b3c-be0c-dda5543e1953\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:41:26Z is after 2025-08-24T17:21:41Z" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.562444 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.562620 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.562871 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.563065 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.563217 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:26Z","lastTransitionTime":"2026-03-19T16:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:26 crc kubenswrapper[4918]: E0319 16:41:26.576009 4918 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d23eb629-4a93-4855-a806-6c791cece8cb\\\",\\\"systemUUID\\\":\\\"bb6fd883-4ea6-4b3c-be0c-dda5543e1953\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:41:26Z is after 2025-08-24T17:21:41Z" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.580467 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.580505 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.580532 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.580559 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.580572 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:26Z","lastTransitionTime":"2026-03-19T16:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.587651 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 16:41:26 crc kubenswrapper[4918]: E0319 16:41:26.587802 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.588196 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 16:41:26 crc kubenswrapper[4918]: E0319 16:41:26.588270 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.592334 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.592953 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.596670 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.597482 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.598693 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.599348 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.599977 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.600991 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.601641 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: E0319 16:41:26.601569 4918 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T16:41:26Z\\\",\\\"message\\\":\\\"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redh
at/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99
d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815
\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\"
:448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d23eb629-4a93-4855-a806-6c791cece8cb\\\",\\\"systemUUID\\\":\\\"bb6fd883-4ea6-4b3c-be0c-dda5543e1953\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T16:41:26Z is after 2025-08-24T17:21:41Z" Mar 19 16:41:26 crc kubenswrapper[4918]: E0319 16:41:26.601873 4918 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.602812 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.603409 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.604232 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.604348 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.604438 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.604557 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.604631 4918 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:26Z","lastTransitionTime":"2026-03-19T16:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.605024 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.605506 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.606040 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.606957 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.607543 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.608494 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.608960 4918 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.609517 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.610557 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.611081 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.612074 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.612735 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.613970 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.614454 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.615209 4918 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.616428 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.616936 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.618002 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.618579 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.619496 4918 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.619629 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.621284 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.622184 
4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.622716 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.624219 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.624880 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.625820 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.626594 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.627743 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.628293 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.629456 
4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.630448 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.631503 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.632015 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.633077 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.633755 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.635219 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.635747 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.636663 
4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.637155 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.638299 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.638899 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.639358 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.707464 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.707539 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.707553 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.707571 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.707585 4918 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:26Z","lastTransitionTime":"2026-03-19T16:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.810598 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.810664 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.810681 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.810708 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.810729 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:26Z","lastTransitionTime":"2026-03-19T16:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.913860 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.914144 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.914221 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.914302 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:26 crc kubenswrapper[4918]: I0319 16:41:26.914367 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:26Z","lastTransitionTime":"2026-03-19T16:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.017717 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.017962 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.018062 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.018135 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.018192 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:27Z","lastTransitionTime":"2026-03-19T16:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.121306 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.121378 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.121396 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.121420 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.121441 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:27Z","lastTransitionTime":"2026-03-19T16:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.224329 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.224403 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.224422 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.224450 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.224471 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:27Z","lastTransitionTime":"2026-03-19T16:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.328401 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.328473 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.328495 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.328557 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.328583 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:27Z","lastTransitionTime":"2026-03-19T16:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.432423 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.432878 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.433043 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.433204 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.433353 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:27Z","lastTransitionTime":"2026-03-19T16:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.537265 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.537644 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.537802 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.537941 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.538175 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:27Z","lastTransitionTime":"2026-03-19T16:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.586409 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 16:41:27 crc kubenswrapper[4918]: E0319 16:41:27.586677 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.608836 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.642808 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.642891 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.642912 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.642940 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.642964 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:27Z","lastTransitionTime":"2026-03-19T16:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.746792 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.747119 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.747216 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.747314 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.747408 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:27Z","lastTransitionTime":"2026-03-19T16:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.850842 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.851183 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.851407 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.851664 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.851862 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:27Z","lastTransitionTime":"2026-03-19T16:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.868136 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-hzfmx"] Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.868517 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-hzfmx" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.873537 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.873581 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.873713 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.881364 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-d4bjv"] Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.881729 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.883247 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.883574 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.884174 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.884579 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.885272 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-nrh6c"] Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.885538 4918 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.886137 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nrh6c" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.886467 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-m2sxj"] Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.887012 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.887021 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-m2sxj" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.889271 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.889437 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.889621 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.889724 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.889807 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.890077 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.906095 4918 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=0.906071274 podStartE2EDuration="906.071274ms" podCreationTimestamp="2026-03-19 16:41:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:27.904205228 +0000 UTC m=+100.026404486" watchObservedRunningTime="2026-03-19 16:41:27.906071274 +0000 UTC m=+100.028270522" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.910661 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-g7pf8"] Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.917077 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.925471 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.925642 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.925671 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.925892 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.926303 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.926361 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.926705 4918 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.954721 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.954769 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.954782 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.954799 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:27 crc kubenswrapper[4918]: I0319 16:41:27.954812 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:27Z","lastTransitionTime":"2026-03-19T16:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028194 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cf410553-1d0d-4507-9029-77fe224a5d7c-cnibin\") pod \"multus-additional-cni-plugins-nrh6c\" (UID: \"cf410553-1d0d-4507-9029-77fe224a5d7c\") " pod="openshift-multus/multus-additional-cni-plugins-nrh6c" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028254 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-host-var-lib-cni-bin\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028281 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/faff5e41-8f94-4bfd-9730-38955ab099d9-proxy-tls\") pod \"machine-config-daemon-d4bjv\" (UID: \"faff5e41-8f94-4bfd-9730-38955ab099d9\") " pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028305 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-var-lib-openvswitch\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028328 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/571f6589-a451-476a-9066-d348b85a81ac-ovn-node-metrics-cert\") pod \"ovnkube-node-g7pf8\" (UID: 
\"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028349 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/17f72867-9c04-4355-b110-ecd9daabbfce-hosts-file\") pod \"node-resolver-hzfmx\" (UID: \"17f72867-9c04-4355-b110-ecd9daabbfce\") " pod="openshift-dns/node-resolver-hzfmx" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028370 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-kubelet\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028394 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-cni-netd\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028420 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-cni-binary-copy\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028462 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/571f6589-a451-476a-9066-d348b85a81ac-ovnkube-script-lib\") pod \"ovnkube-node-g7pf8\" (UID: 
\"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028492 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-slash\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028558 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-run-ovn\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028580 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb9hs\" (UniqueName: \"kubernetes.io/projected/17f72867-9c04-4355-b110-ecd9daabbfce-kube-api-access-zb9hs\") pod \"node-resolver-hzfmx\" (UID: \"17f72867-9c04-4355-b110-ecd9daabbfce\") " pod="openshift-dns/node-resolver-hzfmx" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028602 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-multus-socket-dir-parent\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028624 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-host-var-lib-kubelet\") pod \"multus-m2sxj\" 
(UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028650 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/faff5e41-8f94-4bfd-9730-38955ab099d9-mcd-auth-proxy-config\") pod \"machine-config-daemon-d4bjv\" (UID: \"faff5e41-8f94-4bfd-9730-38955ab099d9\") " pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028673 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-systemd-units\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028695 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-etc-openvswitch\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028718 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-run-ovn-kubernetes\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028742 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028765 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-multus-daemon-config\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028786 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-etc-kubernetes\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028810 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr2sw\" (UniqueName: \"kubernetes.io/projected/faff5e41-8f94-4bfd-9730-38955ab099d9-kube-api-access-pr2sw\") pod \"machine-config-daemon-d4bjv\" (UID: \"faff5e41-8f94-4bfd-9730-38955ab099d9\") " pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028833 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-multus-cni-dir\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028855 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-host-run-k8s-cni-cncf-io\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028883 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cf410553-1d0d-4507-9029-77fe224a5d7c-system-cni-dir\") pod \"multus-additional-cni-plugins-nrh6c\" (UID: \"cf410553-1d0d-4507-9029-77fe224a5d7c\") " pod="openshift-multus/multus-additional-cni-plugins-nrh6c" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028913 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cf410553-1d0d-4507-9029-77fe224a5d7c-cni-binary-copy\") pod \"multus-additional-cni-plugins-nrh6c\" (UID: \"cf410553-1d0d-4507-9029-77fe224a5d7c\") " pod="openshift-multus/multus-additional-cni-plugins-nrh6c" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028952 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cf410553-1d0d-4507-9029-77fe224a5d7c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nrh6c\" (UID: \"cf410553-1d0d-4507-9029-77fe224a5d7c\") " pod="openshift-multus/multus-additional-cni-plugins-nrh6c" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.028975 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cf410553-1d0d-4507-9029-77fe224a5d7c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nrh6c\" (UID: \"cf410553-1d0d-4507-9029-77fe224a5d7c\") " pod="openshift-multus/multus-additional-cni-plugins-nrh6c" Mar 19 16:41:28 crc 
kubenswrapper[4918]: I0319 16:41:28.028996 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-node-log\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.029016 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-host-var-lib-cni-multus\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.029040 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/571f6589-a451-476a-9066-d348b85a81ac-env-overrides\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.029063 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-host-run-netns\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.029085 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/faff5e41-8f94-4bfd-9730-38955ab099d9-rootfs\") pod \"machine-config-daemon-d4bjv\" (UID: \"faff5e41-8f94-4bfd-9730-38955ab099d9\") " pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 16:41:28 crc 
kubenswrapper[4918]: I0319 16:41:28.029106 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-hostroot\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.029151 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-multus-conf-dir\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.029184 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzr2k\" (UniqueName: \"kubernetes.io/projected/cf410553-1d0d-4507-9029-77fe224a5d7c-kube-api-access-mzr2k\") pod \"multus-additional-cni-plugins-nrh6c\" (UID: \"cf410553-1d0d-4507-9029-77fe224a5d7c\") " pod="openshift-multus/multus-additional-cni-plugins-nrh6c" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.029209 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-run-netns\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.029231 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-cnibin\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 
16:41:28.029254 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86wlv\" (UniqueName: \"kubernetes.io/projected/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-kube-api-access-86wlv\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.029275 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-log-socket\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.029297 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-run-openvswitch\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.029318 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/571f6589-a451-476a-9066-d348b85a81ac-ovnkube-config\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.029341 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cf410553-1d0d-4507-9029-77fe224a5d7c-os-release\") pod \"multus-additional-cni-plugins-nrh6c\" (UID: \"cf410553-1d0d-4507-9029-77fe224a5d7c\") " pod="openshift-multus/multus-additional-cni-plugins-nrh6c" Mar 19 16:41:28 crc 
kubenswrapper[4918]: I0319 16:41:28.029363 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-run-systemd\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.029383 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-cni-bin\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.029406 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l48p7\" (UniqueName: \"kubernetes.io/projected/571f6589-a451-476a-9066-d348b85a81ac-kube-api-access-l48p7\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.029430 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-system-cni-dir\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.029450 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-os-release\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 
16:41:28.029475 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-host-run-multus-certs\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.057055 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.057101 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.057118 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.057171 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.057194 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:28Z","lastTransitionTime":"2026-03-19T16:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130076 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/faff5e41-8f94-4bfd-9730-38955ab099d9-rootfs\") pod \"machine-config-daemon-d4bjv\" (UID: \"faff5e41-8f94-4bfd-9730-38955ab099d9\") " pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130134 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-hostroot\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130170 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-multus-conf-dir\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130209 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzr2k\" (UniqueName: \"kubernetes.io/projected/cf410553-1d0d-4507-9029-77fe224a5d7c-kube-api-access-mzr2k\") pod \"multus-additional-cni-plugins-nrh6c\" (UID: \"cf410553-1d0d-4507-9029-77fe224a5d7c\") " pod="openshift-multus/multus-additional-cni-plugins-nrh6c" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130238 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-run-netns\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: 
I0319 16:41:28.130249 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-multus-conf-dir\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130261 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-cnibin\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130282 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86wlv\" (UniqueName: \"kubernetes.io/projected/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-kube-api-access-86wlv\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130208 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/faff5e41-8f94-4bfd-9730-38955ab099d9-rootfs\") pod \"machine-config-daemon-d4bjv\" (UID: \"faff5e41-8f94-4bfd-9730-38955ab099d9\") " pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130314 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-log-socket\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130337 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-run-openvswitch\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130359 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/571f6589-a451-476a-9066-d348b85a81ac-ovnkube-config\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130388 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cf410553-1d0d-4507-9029-77fe224a5d7c-os-release\") pod \"multus-additional-cni-plugins-nrh6c\" (UID: \"cf410553-1d0d-4507-9029-77fe224a5d7c\") " pod="openshift-multus/multus-additional-cni-plugins-nrh6c" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130413 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-run-systemd\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130435 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-cni-bin\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130458 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l48p7\" (UniqueName: 
\"kubernetes.io/projected/571f6589-a451-476a-9066-d348b85a81ac-kube-api-access-l48p7\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130479 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-system-cni-dir\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130502 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-os-release\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130557 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-host-run-multus-certs\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130590 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cf410553-1d0d-4507-9029-77fe224a5d7c-cnibin\") pod \"multus-additional-cni-plugins-nrh6c\" (UID: \"cf410553-1d0d-4507-9029-77fe224a5d7c\") " pod="openshift-multus/multus-additional-cni-plugins-nrh6c" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130611 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-host-var-lib-cni-bin\") pod \"multus-m2sxj\" 
(UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130634 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/faff5e41-8f94-4bfd-9730-38955ab099d9-proxy-tls\") pod \"machine-config-daemon-d4bjv\" (UID: \"faff5e41-8f94-4bfd-9730-38955ab099d9\") " pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130642 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-cnibin\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130271 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-hostroot\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130695 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-var-lib-openvswitch\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130655 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-var-lib-openvswitch\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 
16:41:28.130719 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-log-socket\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130750 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/571f6589-a451-476a-9066-d348b85a81ac-ovn-node-metrics-cert\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130765 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cf410553-1d0d-4507-9029-77fe224a5d7c-cnibin\") pod \"multus-additional-cni-plugins-nrh6c\" (UID: \"cf410553-1d0d-4507-9029-77fe224a5d7c\") " pod="openshift-multus/multus-additional-cni-plugins-nrh6c" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130779 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/17f72867-9c04-4355-b110-ecd9daabbfce-hosts-file\") pod \"node-resolver-hzfmx\" (UID: \"17f72867-9c04-4355-b110-ecd9daabbfce\") " pod="openshift-dns/node-resolver-hzfmx" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130317 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-run-netns\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130951 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-system-cni-dir\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.130999 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-host-run-multus-certs\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.131068 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cf410553-1d0d-4507-9029-77fe224a5d7c-os-release\") pod \"multus-additional-cni-plugins-nrh6c\" (UID: \"cf410553-1d0d-4507-9029-77fe224a5d7c\") " pod="openshift-multus/multus-additional-cni-plugins-nrh6c" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.131098 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-cni-bin\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.131135 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/17f72867-9c04-4355-b110-ecd9daabbfce-hosts-file\") pod \"node-resolver-hzfmx\" (UID: \"17f72867-9c04-4355-b110-ecd9daabbfce\") " pod="openshift-dns/node-resolver-hzfmx" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.131182 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-host-var-lib-cni-bin\") pod \"multus-m2sxj\" (UID: 
\"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.131243 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-kubelet\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.131280 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-cni-netd\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.131313 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-cni-binary-copy\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.131373 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-run-systemd\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.131635 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-cni-netd\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 
16:41:28.131691 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-kubelet\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.131760 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-os-release\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.131786 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-run-openvswitch\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.131870 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/571f6589-a451-476a-9066-d348b85a81ac-ovnkube-config\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.131931 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/571f6589-a451-476a-9066-d348b85a81ac-ovnkube-script-lib\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.132077 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-slash\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.132132 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-run-ovn\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.132159 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb9hs\" (UniqueName: \"kubernetes.io/projected/17f72867-9c04-4355-b110-ecd9daabbfce-kube-api-access-zb9hs\") pod \"node-resolver-hzfmx\" (UID: \"17f72867-9c04-4355-b110-ecd9daabbfce\") " pod="openshift-dns/node-resolver-hzfmx" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.132184 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-multus-socket-dir-parent\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.132280 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-host-var-lib-kubelet\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.132310 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/faff5e41-8f94-4bfd-9730-38955ab099d9-mcd-auth-proxy-config\") pod 
\"machine-config-daemon-d4bjv\" (UID: \"faff5e41-8f94-4bfd-9730-38955ab099d9\") " pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.132333 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-systemd-units\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.132356 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-etc-openvswitch\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.132384 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-run-ovn-kubernetes\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.132474 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.132500 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-multus-daemon-config\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.132539 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-etc-kubernetes\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.132567 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr2sw\" (UniqueName: \"kubernetes.io/projected/faff5e41-8f94-4bfd-9730-38955ab099d9-kube-api-access-pr2sw\") pod \"machine-config-daemon-d4bjv\" (UID: \"faff5e41-8f94-4bfd-9730-38955ab099d9\") " pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.132589 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-multus-cni-dir\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.132616 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-host-run-k8s-cni-cncf-io\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.132638 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cf410553-1d0d-4507-9029-77fe224a5d7c-system-cni-dir\") pod 
\"multus-additional-cni-plugins-nrh6c\" (UID: \"cf410553-1d0d-4507-9029-77fe224a5d7c\") " pod="openshift-multus/multus-additional-cni-plugins-nrh6c" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.132744 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-cni-binary-copy\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.132747 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cf410553-1d0d-4507-9029-77fe224a5d7c-cni-binary-copy\") pod \"multus-additional-cni-plugins-nrh6c\" (UID: \"cf410553-1d0d-4507-9029-77fe224a5d7c\") " pod="openshift-multus/multus-additional-cni-plugins-nrh6c" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.132770 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/571f6589-a451-476a-9066-d348b85a81ac-ovnkube-script-lib\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.132798 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cf410553-1d0d-4507-9029-77fe224a5d7c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nrh6c\" (UID: \"cf410553-1d0d-4507-9029-77fe224a5d7c\") " pod="openshift-multus/multus-additional-cni-plugins-nrh6c" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.132819 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cf410553-1d0d-4507-9029-77fe224a5d7c-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-nrh6c\" (UID: \"cf410553-1d0d-4507-9029-77fe224a5d7c\") " pod="openshift-multus/multus-additional-cni-plugins-nrh6c" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.132840 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-node-log\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.132859 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-host-var-lib-cni-multus\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.132878 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/571f6589-a451-476a-9066-d348b85a81ac-env-overrides\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.132895 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-host-run-netns\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.132965 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-host-run-netns\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " 
pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.133004 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-slash\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.133047 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cf410553-1d0d-4507-9029-77fe224a5d7c-system-cni-dir\") pod \"multus-additional-cni-plugins-nrh6c\" (UID: \"cf410553-1d0d-4507-9029-77fe224a5d7c\") " pod="openshift-multus/multus-additional-cni-plugins-nrh6c" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.133083 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-run-ovn-kubernetes\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.133125 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-multus-socket-dir-parent\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.133132 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-node-log\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 
16:41:28.133190 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-host-var-lib-cni-multus\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.133264 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.133304 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-run-ovn\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.133388 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-host-var-lib-kubelet\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.133544 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-systemd-units\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.133565 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-etc-kubernetes\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.133645 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-etc-openvswitch\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.133677 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cf410553-1d0d-4507-9029-77fe224a5d7c-cni-binary-copy\") pod \"multus-additional-cni-plugins-nrh6c\" (UID: \"cf410553-1d0d-4507-9029-77fe224a5d7c\") " pod="openshift-multus/multus-additional-cni-plugins-nrh6c" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.133723 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-host-run-k8s-cni-cncf-io\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.133884 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-multus-cni-dir\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.133889 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cf410553-1d0d-4507-9029-77fe224a5d7c-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-nrh6c\" (UID: \"cf410553-1d0d-4507-9029-77fe224a5d7c\") " pod="openshift-multus/multus-additional-cni-plugins-nrh6c" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.134003 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cf410553-1d0d-4507-9029-77fe224a5d7c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nrh6c\" (UID: \"cf410553-1d0d-4507-9029-77fe224a5d7c\") " pod="openshift-multus/multus-additional-cni-plugins-nrh6c" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.134122 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/571f6589-a451-476a-9066-d348b85a81ac-env-overrides\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.134135 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/faff5e41-8f94-4bfd-9730-38955ab099d9-mcd-auth-proxy-config\") pod \"machine-config-daemon-d4bjv\" (UID: \"faff5e41-8f94-4bfd-9730-38955ab099d9\") " pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.134261 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-multus-daemon-config\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.138563 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/571f6589-a451-476a-9066-d348b85a81ac-ovn-node-metrics-cert\") pod 
\"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.139509 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/faff5e41-8f94-4bfd-9730-38955ab099d9-proxy-tls\") pod \"machine-config-daemon-d4bjv\" (UID: \"faff5e41-8f94-4bfd-9730-38955ab099d9\") " pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.155389 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l48p7\" (UniqueName: \"kubernetes.io/projected/571f6589-a451-476a-9066-d348b85a81ac-kube-api-access-l48p7\") pod \"ovnkube-node-g7pf8\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.156656 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-sqq87"] Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.157215 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-sqq87" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.161029 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.161087 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86wlv\" (UniqueName: \"kubernetes.io/projected/c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3-kube-api-access-86wlv\") pod \"multus-m2sxj\" (UID: \"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3\") " pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.161307 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.161335 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.161505 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.161978 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr2sw\" (UniqueName: \"kubernetes.io/projected/faff5e41-8f94-4bfd-9730-38955ab099d9-kube-api-access-pr2sw\") pod \"machine-config-daemon-d4bjv\" (UID: \"faff5e41-8f94-4bfd-9730-38955ab099d9\") " pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.162346 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.162383 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 
16:41:28.162394 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.162418 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.162433 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:28Z","lastTransitionTime":"2026-03-19T16:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.167412 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzr2k\" (UniqueName: \"kubernetes.io/projected/cf410553-1d0d-4507-9029-77fe224a5d7c-kube-api-access-mzr2k\") pod \"multus-additional-cni-plugins-nrh6c\" (UID: \"cf410553-1d0d-4507-9029-77fe224a5d7c\") " pod="openshift-multus/multus-additional-cni-plugins-nrh6c" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.170275 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb9hs\" (UniqueName: \"kubernetes.io/projected/17f72867-9c04-4355-b110-ecd9daabbfce-kube-api-access-zb9hs\") pod \"node-resolver-hzfmx\" (UID: \"17f72867-9c04-4355-b110-ecd9daabbfce\") " pod="openshift-dns/node-resolver-hzfmx" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.191463 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-hzfmx" Mar 19 16:41:28 crc kubenswrapper[4918]: W0319 16:41:28.210180 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17f72867_9c04_4355_b110_ecd9daabbfce.slice/crio-bd21c8aa0eb9486ce7aeffc85bdaa717c55bdb36b9d3ccb164a6b0ab93ee517f WatchSource:0}: Error finding container bd21c8aa0eb9486ce7aeffc85bdaa717c55bdb36b9d3ccb164a6b0ab93ee517f: Status 404 returned error can't find the container with id bd21c8aa0eb9486ce7aeffc85bdaa717c55bdb36b9d3ccb164a6b0ab93ee517f Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.210886 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 16:41:28 crc kubenswrapper[4918]: W0319 16:41:28.223699 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfaff5e41_8f94_4bfd_9730_38955ab099d9.slice/crio-b8b041cd3fde2c3378ddda1b1f887929b3dc7dfd1241a56433cd39aa4d76d70b WatchSource:0}: Error finding container b8b041cd3fde2c3378ddda1b1f887929b3dc7dfd1241a56433cd39aa4d76d70b: Status 404 returned error can't find the container with id b8b041cd3fde2c3378ddda1b1f887929b3dc7dfd1241a56433cd39aa4d76d70b Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.226562 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nrh6c" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.233357 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.233495 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 16:41:28 crc kubenswrapper[4918]: E0319 16:41:28.233582 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:32.233551097 +0000 UTC m=+104.355750345 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.233639 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 16:41:28 crc kubenswrapper[4918]: E0319 16:41:28.233651 4918 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 16:41:28 crc kubenswrapper[4918]: E0319 16:41:28.233713 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 16:41:32.233692051 +0000 UTC m=+104.355891309 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 16:41:28 crc kubenswrapper[4918]: E0319 16:41:28.233785 4918 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 16:41:28 crc kubenswrapper[4918]: E0319 16:41:28.233828 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 16:41:32.233821385 +0000 UTC m=+104.356020633 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.239153 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-m2sxj" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.244903 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.266812 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.266866 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.266879 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.266901 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.266914 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:28Z","lastTransitionTime":"2026-03-19T16:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.334653 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49wkn\" (UniqueName: \"kubernetes.io/projected/c8b011e2-56c6-4aa2-86aa-081797d48a18-kube-api-access-49wkn\") pod \"node-ca-sqq87\" (UID: \"c8b011e2-56c6-4aa2-86aa-081797d48a18\") " pod="openshift-image-registry/node-ca-sqq87" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.334733 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.334769 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c8b011e2-56c6-4aa2-86aa-081797d48a18-serviceca\") pod \"node-ca-sqq87\" (UID: \"c8b011e2-56c6-4aa2-86aa-081797d48a18\") " pod="openshift-image-registry/node-ca-sqq87" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.334801 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c8b011e2-56c6-4aa2-86aa-081797d48a18-host\") pod \"node-ca-sqq87\" (UID: \"c8b011e2-56c6-4aa2-86aa-081797d48a18\") " pod="openshift-image-registry/node-ca-sqq87" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.334832 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 16:41:28 crc kubenswrapper[4918]: E0319 16:41:28.335009 4918 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 16:41:28 crc kubenswrapper[4918]: E0319 16:41:28.338185 4918 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 16:41:28 crc kubenswrapper[4918]: E0319 16:41:28.338216 4918 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 16:41:28 crc kubenswrapper[4918]: E0319 16:41:28.338337 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 16:41:32.338305787 +0000 UTC m=+104.460505035 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 16:41:28 crc kubenswrapper[4918]: E0319 16:41:28.339083 4918 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 16:41:28 crc kubenswrapper[4918]: E0319 16:41:28.339100 4918 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 16:41:28 crc kubenswrapper[4918]: E0319 16:41:28.339111 4918 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 16:41:28 crc kubenswrapper[4918]: E0319 16:41:28.339155 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 16:41:32.339143062 +0000 UTC m=+104.461342310 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.342121 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m56xt"] Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.343107 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m56xt" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.349593 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.349657 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.364819 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-qcgd2"] Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.369404 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcgd2" Mar 19 16:41:28 crc kubenswrapper[4918]: E0319 16:41:28.369499 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qcgd2" podUID="69770981-c309-4aa4-ba5a-29bf78372aae" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.370596 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.370634 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.370644 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.370664 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.370674 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:28Z","lastTransitionTime":"2026-03-19T16:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.435349 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49wkn\" (UniqueName: \"kubernetes.io/projected/c8b011e2-56c6-4aa2-86aa-081797d48a18-kube-api-access-49wkn\") pod \"node-ca-sqq87\" (UID: \"c8b011e2-56c6-4aa2-86aa-081797d48a18\") " pod="openshift-image-registry/node-ca-sqq87" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.435415 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c8b011e2-56c6-4aa2-86aa-081797d48a18-serviceca\") pod \"node-ca-sqq87\" (UID: \"c8b011e2-56c6-4aa2-86aa-081797d48a18\") " pod="openshift-image-registry/node-ca-sqq87" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.435439 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c8b011e2-56c6-4aa2-86aa-081797d48a18-host\") pod \"node-ca-sqq87\" (UID: \"c8b011e2-56c6-4aa2-86aa-081797d48a18\") " pod="openshift-image-registry/node-ca-sqq87" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.435510 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c8b011e2-56c6-4aa2-86aa-081797d48a18-host\") pod \"node-ca-sqq87\" (UID: \"c8b011e2-56c6-4aa2-86aa-081797d48a18\") " pod="openshift-image-registry/node-ca-sqq87" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.437036 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c8b011e2-56c6-4aa2-86aa-081797d48a18-serviceca\") pod \"node-ca-sqq87\" (UID: \"c8b011e2-56c6-4aa2-86aa-081797d48a18\") " pod="openshift-image-registry/node-ca-sqq87" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.458989 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-49wkn\" (UniqueName: \"kubernetes.io/projected/c8b011e2-56c6-4aa2-86aa-081797d48a18-kube-api-access-49wkn\") pod \"node-ca-sqq87\" (UID: \"c8b011e2-56c6-4aa2-86aa-081797d48a18\") " pod="openshift-image-registry/node-ca-sqq87" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.474410 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.474456 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.474467 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.474486 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.474498 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:28Z","lastTransitionTime":"2026-03-19T16:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.483324 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-sqq87" Mar 19 16:41:28 crc kubenswrapper[4918]: W0319 16:41:28.503611 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8b011e2_56c6_4aa2_86aa_081797d48a18.slice/crio-becb92819701edf29bc99c9cfb45d8667ce680d4a4a47e50cce1f3c4a6a01e1d WatchSource:0}: Error finding container becb92819701edf29bc99c9cfb45d8667ce680d4a4a47e50cce1f3c4a6a01e1d: Status 404 returned error can't find the container with id becb92819701edf29bc99c9cfb45d8667ce680d4a4a47e50cce1f3c4a6a01e1d Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.536696 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c335ae4a-610e-42b7-bea8-1b2d9aba67ea-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-m56xt\" (UID: \"c335ae4a-610e-42b7-bea8-1b2d9aba67ea\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m56xt" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.536790 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c335ae4a-610e-42b7-bea8-1b2d9aba67ea-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-m56xt\" (UID: \"c335ae4a-610e-42b7-bea8-1b2d9aba67ea\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m56xt" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.536830 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c335ae4a-610e-42b7-bea8-1b2d9aba67ea-env-overrides\") pod \"ovnkube-control-plane-749d76644c-m56xt\" (UID: \"c335ae4a-610e-42b7-bea8-1b2d9aba67ea\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m56xt" Mar 19 16:41:28 crc 
kubenswrapper[4918]: I0319 16:41:28.536867 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69770981-c309-4aa4-ba5a-29bf78372aae-metrics-certs\") pod \"network-metrics-daemon-qcgd2\" (UID: \"69770981-c309-4aa4-ba5a-29bf78372aae\") " pod="openshift-multus/network-metrics-daemon-qcgd2" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.536915 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p9jz\" (UniqueName: \"kubernetes.io/projected/69770981-c309-4aa4-ba5a-29bf78372aae-kube-api-access-6p9jz\") pod \"network-metrics-daemon-qcgd2\" (UID: \"69770981-c309-4aa4-ba5a-29bf78372aae\") " pod="openshift-multus/network-metrics-daemon-qcgd2" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.536955 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7cct\" (UniqueName: \"kubernetes.io/projected/c335ae4a-610e-42b7-bea8-1b2d9aba67ea-kube-api-access-w7cct\") pod \"ovnkube-control-plane-749d76644c-m56xt\" (UID: \"c335ae4a-610e-42b7-bea8-1b2d9aba67ea\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m56xt" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.576872 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.576943 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.576961 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.576996 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:28 crc kubenswrapper[4918]: 
I0319 16:41:28.577016 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:28Z","lastTransitionTime":"2026-03-19T16:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.586299 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.586301 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 16:41:28 crc kubenswrapper[4918]: E0319 16:41:28.588184 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 16:41:28 crc kubenswrapper[4918]: E0319 16:41:28.588407 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.638448 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c335ae4a-610e-42b7-bea8-1b2d9aba67ea-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-m56xt\" (UID: \"c335ae4a-610e-42b7-bea8-1b2d9aba67ea\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m56xt" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.638504 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c335ae4a-610e-42b7-bea8-1b2d9aba67ea-env-overrides\") pod \"ovnkube-control-plane-749d76644c-m56xt\" (UID: \"c335ae4a-610e-42b7-bea8-1b2d9aba67ea\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m56xt" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.638573 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69770981-c309-4aa4-ba5a-29bf78372aae-metrics-certs\") pod \"network-metrics-daemon-qcgd2\" (UID: \"69770981-c309-4aa4-ba5a-29bf78372aae\") " pod="openshift-multus/network-metrics-daemon-qcgd2" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.638598 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p9jz\" (UniqueName: \"kubernetes.io/projected/69770981-c309-4aa4-ba5a-29bf78372aae-kube-api-access-6p9jz\") pod \"network-metrics-daemon-qcgd2\" (UID: \"69770981-c309-4aa4-ba5a-29bf78372aae\") " pod="openshift-multus/network-metrics-daemon-qcgd2" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.638617 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7cct\" (UniqueName: 
\"kubernetes.io/projected/c335ae4a-610e-42b7-bea8-1b2d9aba67ea-kube-api-access-w7cct\") pod \"ovnkube-control-plane-749d76644c-m56xt\" (UID: \"c335ae4a-610e-42b7-bea8-1b2d9aba67ea\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m56xt" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.638686 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c335ae4a-610e-42b7-bea8-1b2d9aba67ea-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-m56xt\" (UID: \"c335ae4a-610e-42b7-bea8-1b2d9aba67ea\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m56xt" Mar 19 16:41:28 crc kubenswrapper[4918]: E0319 16:41:28.639021 4918 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 16:41:28 crc kubenswrapper[4918]: E0319 16:41:28.639163 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69770981-c309-4aa4-ba5a-29bf78372aae-metrics-certs podName:69770981-c309-4aa4-ba5a-29bf78372aae nodeName:}" failed. No retries permitted until 2026-03-19 16:41:29.139135271 +0000 UTC m=+101.261334549 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/69770981-c309-4aa4-ba5a-29bf78372aae-metrics-certs") pod "network-metrics-daemon-qcgd2" (UID: "69770981-c309-4aa4-ba5a-29bf78372aae") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.639563 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c335ae4a-610e-42b7-bea8-1b2d9aba67ea-env-overrides\") pod \"ovnkube-control-plane-749d76644c-m56xt\" (UID: \"c335ae4a-610e-42b7-bea8-1b2d9aba67ea\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m56xt" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.639823 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c335ae4a-610e-42b7-bea8-1b2d9aba67ea-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-m56xt\" (UID: \"c335ae4a-610e-42b7-bea8-1b2d9aba67ea\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m56xt" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.642873 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c335ae4a-610e-42b7-bea8-1b2d9aba67ea-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-m56xt\" (UID: \"c335ae4a-610e-42b7-bea8-1b2d9aba67ea\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m56xt" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.659469 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7cct\" (UniqueName: \"kubernetes.io/projected/c335ae4a-610e-42b7-bea8-1b2d9aba67ea-kube-api-access-w7cct\") pod \"ovnkube-control-plane-749d76644c-m56xt\" (UID: \"c335ae4a-610e-42b7-bea8-1b2d9aba67ea\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m56xt" 
Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.659697 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p9jz\" (UniqueName: \"kubernetes.io/projected/69770981-c309-4aa4-ba5a-29bf78372aae-kube-api-access-6p9jz\") pod \"network-metrics-daemon-qcgd2\" (UID: \"69770981-c309-4aa4-ba5a-29bf78372aae\") " pod="openshift-multus/network-metrics-daemon-qcgd2" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.677284 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m56xt" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.681001 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.681057 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.681070 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.681087 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.681097 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:28Z","lastTransitionTime":"2026-03-19T16:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.789924 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.789959 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.789967 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.789983 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.789992 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:28Z","lastTransitionTime":"2026-03-19T16:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.892628 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.892692 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.892706 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.892729 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.892747 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:28Z","lastTransitionTime":"2026-03-19T16:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.995323 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.995395 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.995417 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.995445 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:28 crc kubenswrapper[4918]: I0319 16:41:28.995464 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:28Z","lastTransitionTime":"2026-03-19T16:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.065207 4918 generic.go:334] "Generic (PLEG): container finished" podID="571f6589-a451-476a-9066-d348b85a81ac" containerID="174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f" exitCode=0 Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.065296 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" event={"ID":"571f6589-a451-476a-9066-d348b85a81ac","Type":"ContainerDied","Data":"174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f"} Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.065353 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" event={"ID":"571f6589-a451-476a-9066-d348b85a81ac","Type":"ContainerStarted","Data":"92b50162d8ee8d283018f0ba26228837511a2a9caa6269acd14bd7dd2f97120c"} Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.067498 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"573639c2f8c8c02a55123d73dc1535d9da068bd942c36d853ef06043d886bc20"} Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.072257 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sqq87" event={"ID":"c8b011e2-56c6-4aa2-86aa-081797d48a18","Type":"ContainerStarted","Data":"f97edcc50ffed9670525369420c495c8ad5867eb1487576be62f0e1abcd9f1fb"} Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.072298 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sqq87" event={"ID":"c8b011e2-56c6-4aa2-86aa-081797d48a18","Type":"ContainerStarted","Data":"becb92819701edf29bc99c9cfb45d8667ce680d4a4a47e50cce1f3c4a6a01e1d"} Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.076015 4918 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerStarted","Data":"82ffd55ce317a6e48035a6c09e32f50b278973afac43b3cddb76ecc77a7f8d5f"} Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.076058 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerStarted","Data":"4d840761dfd614dd8a3c1473b1242185e860c0c959af7cb2cf7f9c58ed3dceb0"} Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.076074 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerStarted","Data":"b8b041cd3fde2c3378ddda1b1f887929b3dc7dfd1241a56433cd39aa4d76d70b"} Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.078117 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hzfmx" event={"ID":"17f72867-9c04-4355-b110-ecd9daabbfce","Type":"ContainerStarted","Data":"c3f9a4d401448665ebe161db0da7acce7d153d750c5874c95c33cbd226712acb"} Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.078182 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hzfmx" event={"ID":"17f72867-9c04-4355-b110-ecd9daabbfce","Type":"ContainerStarted","Data":"bd21c8aa0eb9486ce7aeffc85bdaa717c55bdb36b9d3ccb164a6b0ab93ee517f"} Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.080189 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m56xt" event={"ID":"c335ae4a-610e-42b7-bea8-1b2d9aba67ea","Type":"ContainerStarted","Data":"4817eb9f349ce548f621d1738f8f274236ac1188770a155514ce25e3833ede20"} Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.080245 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m56xt" event={"ID":"c335ae4a-610e-42b7-bea8-1b2d9aba67ea","Type":"ContainerStarted","Data":"b3f739dbf51affaa88f5cad5922bacc9d00c6a34e00e58475b3254f21097df27"} Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.080262 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m56xt" event={"ID":"c335ae4a-610e-42b7-bea8-1b2d9aba67ea","Type":"ContainerStarted","Data":"1a34a57c0e40073a64e51ea202b32c9ae6967372f83a2e0009ea0cb2cb91a05f"} Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.081497 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-m2sxj" event={"ID":"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3","Type":"ContainerStarted","Data":"782db47f818546c7d6da4119af00b714e772fafbd3679b80801b98111cab00b0"} Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.081554 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-m2sxj" event={"ID":"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3","Type":"ContainerStarted","Data":"3620c7172857a91431bba78dced98dbc57ec403285196322fb0d6f24081a4ed1"} Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.083111 4918 generic.go:334] "Generic (PLEG): container finished" podID="cf410553-1d0d-4507-9029-77fe224a5d7c" containerID="6f5518b9e12a524d85eeeaa1778d2a2df3daab5101548149f5f813cbff6aa983" exitCode=0 Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.083168 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrh6c" event={"ID":"cf410553-1d0d-4507-9029-77fe224a5d7c","Type":"ContainerDied","Data":"6f5518b9e12a524d85eeeaa1778d2a2df3daab5101548149f5f813cbff6aa983"} Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.083255 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrh6c" 
event={"ID":"cf410553-1d0d-4507-9029-77fe224a5d7c","Type":"ContainerStarted","Data":"67e0a82d4ba70444085e1fc5d23f6ff07ace9a76989d91ad7d6a74378f07836f"} Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.106335 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.106369 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.106379 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.106392 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.106401 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:29Z","lastTransitionTime":"2026-03-19T16:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.128214 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hzfmx" podStartSLOduration=33.128186286 podStartE2EDuration="33.128186286s" podCreationTimestamp="2026-03-19 16:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:29.127101604 +0000 UTC m=+101.249300862" watchObservedRunningTime="2026-03-19 16:41:29.128186286 +0000 UTC m=+101.250385534" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.150938 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69770981-c309-4aa4-ba5a-29bf78372aae-metrics-certs\") pod \"network-metrics-daemon-qcgd2\" (UID: \"69770981-c309-4aa4-ba5a-29bf78372aae\") " pod="openshift-multus/network-metrics-daemon-qcgd2" Mar 19 16:41:29 crc kubenswrapper[4918]: E0319 16:41:29.151067 4918 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 16:41:29 crc kubenswrapper[4918]: E0319 16:41:29.151116 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69770981-c309-4aa4-ba5a-29bf78372aae-metrics-certs podName:69770981-c309-4aa4-ba5a-29bf78372aae nodeName:}" failed. No retries permitted until 2026-03-19 16:41:30.151099994 +0000 UTC m=+102.273299242 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/69770981-c309-4aa4-ba5a-29bf78372aae-metrics-certs") pod "network-metrics-daemon-qcgd2" (UID: "69770981-c309-4aa4-ba5a-29bf78372aae") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.184679 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m56xt" podStartSLOduration=32.184657407 podStartE2EDuration="32.184657407s" podCreationTimestamp="2026-03-19 16:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:29.166432808 +0000 UTC m=+101.288632056" watchObservedRunningTime="2026-03-19 16:41:29.184657407 +0000 UTC m=+101.306856655" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.206325 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-sqq87" podStartSLOduration=33.206305007 podStartE2EDuration="33.206305007s" podCreationTimestamp="2026-03-19 16:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:29.206111642 +0000 UTC m=+101.328310910" watchObservedRunningTime="2026-03-19 16:41:29.206305007 +0000 UTC m=+101.328504255" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.206735 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-m2sxj" podStartSLOduration=33.206729801 podStartE2EDuration="33.206729801s" podCreationTimestamp="2026-03-19 16:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:29.186717358 +0000 UTC m=+101.308916616" watchObservedRunningTime="2026-03-19 16:41:29.206729801 +0000 
UTC m=+101.328929049" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.209192 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.209263 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.209275 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.209298 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.209309 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:29Z","lastTransitionTime":"2026-03-19T16:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.314130 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.314178 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.314189 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.314209 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.314221 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:29Z","lastTransitionTime":"2026-03-19T16:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.418067 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.419882 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.419984 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.420072 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.420156 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:29Z","lastTransitionTime":"2026-03-19T16:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.524146 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.524204 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.524216 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.524235 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.524248 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:29Z","lastTransitionTime":"2026-03-19T16:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.586252 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 16:41:29 crc kubenswrapper[4918]: E0319 16:41:29.586932 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.627840 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.627898 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.627910 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.627932 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.627947 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:29Z","lastTransitionTime":"2026-03-19T16:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.731461 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.732465 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.732637 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.732783 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.732931 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:29Z","lastTransitionTime":"2026-03-19T16:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.835619 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.835688 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.835705 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.835735 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.835749 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:29Z","lastTransitionTime":"2026-03-19T16:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.938414 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.938471 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.938482 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.938501 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:29 crc kubenswrapper[4918]: I0319 16:41:29.938513 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:29Z","lastTransitionTime":"2026-03-19T16:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.043082 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.043153 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.043166 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.043186 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.043200 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:30Z","lastTransitionTime":"2026-03-19T16:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.089411 4918 generic.go:334] "Generic (PLEG): container finished" podID="cf410553-1d0d-4507-9029-77fe224a5d7c" containerID="c81b2100dead5772bd3decbe5c701ae16cb5197fbdcdf52812beb7967864ed01" exitCode=0 Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.089498 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrh6c" event={"ID":"cf410553-1d0d-4507-9029-77fe224a5d7c","Type":"ContainerDied","Data":"c81b2100dead5772bd3decbe5c701ae16cb5197fbdcdf52812beb7967864ed01"} Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.100050 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" event={"ID":"571f6589-a451-476a-9066-d348b85a81ac","Type":"ContainerStarted","Data":"8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90"} Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.100429 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" event={"ID":"571f6589-a451-476a-9066-d348b85a81ac","Type":"ContainerStarted","Data":"5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707"} Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.100558 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" event={"ID":"571f6589-a451-476a-9066-d348b85a81ac","Type":"ContainerStarted","Data":"341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6"} Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.100651 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" event={"ID":"571f6589-a451-476a-9066-d348b85a81ac","Type":"ContainerStarted","Data":"579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50"} Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.100745 4918 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" event={"ID":"571f6589-a451-476a-9066-d348b85a81ac","Type":"ContainerStarted","Data":"bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3"} Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.115465 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podStartSLOduration=34.115435616 podStartE2EDuration="34.115435616s" podCreationTimestamp="2026-03-19 16:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:29.239700646 +0000 UTC m=+101.361899904" watchObservedRunningTime="2026-03-19 16:41:30.115435616 +0000 UTC m=+102.237634874" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.147241 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.147843 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.147933 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.148034 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.148127 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:30Z","lastTransitionTime":"2026-03-19T16:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.160636 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69770981-c309-4aa4-ba5a-29bf78372aae-metrics-certs\") pod \"network-metrics-daemon-qcgd2\" (UID: \"69770981-c309-4aa4-ba5a-29bf78372aae\") " pod="openshift-multus/network-metrics-daemon-qcgd2" Mar 19 16:41:30 crc kubenswrapper[4918]: E0319 16:41:30.161150 4918 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 16:41:30 crc kubenswrapper[4918]: E0319 16:41:30.161262 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69770981-c309-4aa4-ba5a-29bf78372aae-metrics-certs podName:69770981-c309-4aa4-ba5a-29bf78372aae nodeName:}" failed. No retries permitted until 2026-03-19 16:41:32.161238851 +0000 UTC m=+104.283438109 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/69770981-c309-4aa4-ba5a-29bf78372aae-metrics-certs") pod "network-metrics-daemon-qcgd2" (UID: "69770981-c309-4aa4-ba5a-29bf78372aae") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.252206 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.252256 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.252269 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.252288 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.252300 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:30Z","lastTransitionTime":"2026-03-19T16:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.355181 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.355237 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.355250 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.355279 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.355292 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:30Z","lastTransitionTime":"2026-03-19T16:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.458450 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.458508 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.458539 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.458559 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.458575 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:30Z","lastTransitionTime":"2026-03-19T16:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.562104 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.562183 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.562202 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.562234 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.562255 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:30Z","lastTransitionTime":"2026-03-19T16:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.586632 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 16:41:30 crc kubenswrapper[4918]: E0319 16:41:30.586792 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.586640 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.587234 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcgd2" Mar 19 16:41:30 crc kubenswrapper[4918]: E0319 16:41:30.587466 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcgd2" podUID="69770981-c309-4aa4-ba5a-29bf78372aae" Mar 19 16:41:30 crc kubenswrapper[4918]: E0319 16:41:30.587803 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.665591 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.665663 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.665687 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.665716 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.665738 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:30Z","lastTransitionTime":"2026-03-19T16:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.768508 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.768614 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.768637 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.768669 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.768691 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:30Z","lastTransitionTime":"2026-03-19T16:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.872049 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.872102 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.872111 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.872129 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.872139 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:30Z","lastTransitionTime":"2026-03-19T16:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.975096 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.975174 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.975207 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.975250 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:30 crc kubenswrapper[4918]: I0319 16:41:30.975270 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:30Z","lastTransitionTime":"2026-03-19T16:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.078978 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.079051 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.079078 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.079110 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.079133 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:31Z","lastTransitionTime":"2026-03-19T16:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.108387 4918 generic.go:334] "Generic (PLEG): container finished" podID="cf410553-1d0d-4507-9029-77fe224a5d7c" containerID="f4ae555e7480232112df3ce3e466edfd891336d75a5f3dc7b68956b8429c28e2" exitCode=0 Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.108562 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrh6c" event={"ID":"cf410553-1d0d-4507-9029-77fe224a5d7c","Type":"ContainerDied","Data":"f4ae555e7480232112df3ce3e466edfd891336d75a5f3dc7b68956b8429c28e2"} Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.120817 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" event={"ID":"571f6589-a451-476a-9066-d348b85a81ac","Type":"ContainerStarted","Data":"320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc"} Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.181613 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.181966 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.182042 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.182126 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.182214 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:31Z","lastTransitionTime":"2026-03-19T16:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.286470 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.286541 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.286553 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.286573 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.286585 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:31Z","lastTransitionTime":"2026-03-19T16:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.392026 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.392083 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.392098 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.392122 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.392136 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:31Z","lastTransitionTime":"2026-03-19T16:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.494735 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.495247 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.495260 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.495278 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.495290 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:31Z","lastTransitionTime":"2026-03-19T16:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.585750 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 16:41:31 crc kubenswrapper[4918]: E0319 16:41:31.585926 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.598175 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.598216 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.598225 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.598239 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.598248 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:31Z","lastTransitionTime":"2026-03-19T16:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.700761 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.700825 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.700843 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.700866 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.700883 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:31Z","lastTransitionTime":"2026-03-19T16:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.803708 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.803787 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.803807 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.803833 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.803852 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:31Z","lastTransitionTime":"2026-03-19T16:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.906622 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.906688 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.906708 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.906733 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:31 crc kubenswrapper[4918]: I0319 16:41:31.906752 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:31Z","lastTransitionTime":"2026-03-19T16:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.009906 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.010333 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.010410 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.010496 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.010616 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:32Z","lastTransitionTime":"2026-03-19T16:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.113298 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.113641 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.113713 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.113789 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.113853 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:32Z","lastTransitionTime":"2026-03-19T16:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.127978 4918 generic.go:334] "Generic (PLEG): container finished" podID="cf410553-1d0d-4507-9029-77fe224a5d7c" containerID="946969e148e2acbc0227f056019aec9335d80f41ad6c7f9ae405f0639e6c17e5" exitCode=0 Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.128063 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrh6c" event={"ID":"cf410553-1d0d-4507-9029-77fe224a5d7c","Type":"ContainerDied","Data":"946969e148e2acbc0227f056019aec9335d80f41ad6c7f9ae405f0639e6c17e5"} Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.185142 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69770981-c309-4aa4-ba5a-29bf78372aae-metrics-certs\") pod \"network-metrics-daemon-qcgd2\" (UID: \"69770981-c309-4aa4-ba5a-29bf78372aae\") " pod="openshift-multus/network-metrics-daemon-qcgd2" Mar 19 16:41:32 crc kubenswrapper[4918]: E0319 16:41:32.185569 4918 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 16:41:32 crc kubenswrapper[4918]: E0319 16:41:32.185620 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69770981-c309-4aa4-ba5a-29bf78372aae-metrics-certs podName:69770981-c309-4aa4-ba5a-29bf78372aae nodeName:}" failed. No retries permitted until 2026-03-19 16:41:36.185606508 +0000 UTC m=+108.307805756 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/69770981-c309-4aa4-ba5a-29bf78372aae-metrics-certs") pod "network-metrics-daemon-qcgd2" (UID: "69770981-c309-4aa4-ba5a-29bf78372aae") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.219245 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.219285 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.219301 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.219322 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.219338 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:32Z","lastTransitionTime":"2026-03-19T16:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.285738 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:32 crc kubenswrapper[4918]: E0319 16:41:32.285925 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:40.285891587 +0000 UTC m=+112.408090835 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.286028 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.286080 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 16:41:32 crc kubenswrapper[4918]: E0319 16:41:32.286173 4918 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 16:41:32 crc kubenswrapper[4918]: E0319 16:41:32.286224 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 16:41:40.286217276 +0000 UTC m=+112.408416524 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 16:41:32 crc kubenswrapper[4918]: E0319 16:41:32.286233 4918 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 16:41:32 crc kubenswrapper[4918]: E0319 16:41:32.286314 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 16:41:40.286293889 +0000 UTC m=+112.408493147 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.321486 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.321591 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.321610 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.321638 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.321658 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:32Z","lastTransitionTime":"2026-03-19T16:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.387225 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.387382 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 16:41:32 crc kubenswrapper[4918]: E0319 16:41:32.387484 4918 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 16:41:32 crc kubenswrapper[4918]: E0319 16:41:32.387551 4918 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 16:41:32 crc kubenswrapper[4918]: E0319 16:41:32.387573 4918 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 16:41:32 crc kubenswrapper[4918]: E0319 16:41:32.387662 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 16:41:40.387635718 +0000 UTC m=+112.509834976 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 16:41:32 crc kubenswrapper[4918]: E0319 16:41:32.387663 4918 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 16:41:32 crc kubenswrapper[4918]: E0319 16:41:32.387707 4918 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 16:41:32 crc kubenswrapper[4918]: E0319 16:41:32.387737 4918 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 16:41:32 crc kubenswrapper[4918]: E0319 16:41:32.387837 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 16:41:40.387808624 +0000 UTC m=+112.510007902 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.425122 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.425202 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.425220 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.425246 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.425264 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:32Z","lastTransitionTime":"2026-03-19T16:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.528784 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.528889 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.528915 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.528949 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.528974 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:32Z","lastTransitionTime":"2026-03-19T16:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.585802 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcgd2" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.585905 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.585926 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 16:41:32 crc kubenswrapper[4918]: E0319 16:41:32.586386 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcgd2" podUID="69770981-c309-4aa4-ba5a-29bf78372aae" Mar 19 16:41:32 crc kubenswrapper[4918]: E0319 16:41:32.586609 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 16:41:32 crc kubenswrapper[4918]: E0319 16:41:32.586889 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.603477 4918 scope.go:117] "RemoveContainer" containerID="f52a359a1ac292a614b20c79a490412ec1b7e37ecf7dfc7576babdc09dfe0ea2" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.605151 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.632725 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.632790 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.632810 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.632837 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.632856 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:32Z","lastTransitionTime":"2026-03-19T16:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.736705 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.736843 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.736862 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.736888 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.736906 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:32Z","lastTransitionTime":"2026-03-19T16:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.839438 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.839493 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.839511 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.839554 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.839570 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:32Z","lastTransitionTime":"2026-03-19T16:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.943496 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.943578 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.943594 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.943616 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:32 crc kubenswrapper[4918]: I0319 16:41:32.943632 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:32Z","lastTransitionTime":"2026-03-19T16:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.046434 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.046514 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.046577 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.046615 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.046650 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:33Z","lastTransitionTime":"2026-03-19T16:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.134919 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.137660 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"45d4d59ce45ccdd76b6c03a4ddc782404f7593fba1bc4b7af195267bd8333d04"} Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.137977 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.145081 4918 generic.go:334] "Generic (PLEG): container finished" podID="cf410553-1d0d-4507-9029-77fe224a5d7c" containerID="06d41b1de2a2caf5e650aece112bd772cd747853f32131884cc629530060b3bf" exitCode=0 Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.145197 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrh6c" event={"ID":"cf410553-1d0d-4507-9029-77fe224a5d7c","Type":"ContainerDied","Data":"06d41b1de2a2caf5e650aece112bd772cd747853f32131884cc629530060b3bf"} Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.149592 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.149684 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.149702 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.149721 4918 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.149773 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:33Z","lastTransitionTime":"2026-03-19T16:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.151337 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" event={"ID":"571f6589-a451-476a-9066-d348b85a81ac","Type":"ContainerStarted","Data":"b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274"} Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.219407 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=1.219372535 podStartE2EDuration="1.219372535s" podCreationTimestamp="2026-03-19 16:41:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:33.184886415 +0000 UTC m=+105.307085683" watchObservedRunningTime="2026-03-19 16:41:33.219372535 +0000 UTC m=+105.341571823" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.253098 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.253144 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.253156 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:33 crc kubenswrapper[4918]: 
I0319 16:41:33.253173 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.253185 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:33Z","lastTransitionTime":"2026-03-19T16:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.355781 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.355847 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.355865 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.355894 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.355915 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:33Z","lastTransitionTime":"2026-03-19T16:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.458892 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.458970 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.458992 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.459018 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.459039 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:33Z","lastTransitionTime":"2026-03-19T16:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.562995 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.563056 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.563073 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.563097 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.563116 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:33Z","lastTransitionTime":"2026-03-19T16:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.585839 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 16:41:33 crc kubenswrapper[4918]: E0319 16:41:33.586054 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.666301 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.666374 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.666390 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.666413 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.666431 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:33Z","lastTransitionTime":"2026-03-19T16:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.770470 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.770561 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.770579 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.770606 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.770630 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:33Z","lastTransitionTime":"2026-03-19T16:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.873868 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.873924 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.873937 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.873957 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.873971 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:33Z","lastTransitionTime":"2026-03-19T16:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.977171 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.977274 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.977287 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.977306 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:33 crc kubenswrapper[4918]: I0319 16:41:33.977318 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:33Z","lastTransitionTime":"2026-03-19T16:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.083911 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.083969 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.083981 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.083999 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.084013 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:34Z","lastTransitionTime":"2026-03-19T16:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.159969 4918 generic.go:334] "Generic (PLEG): container finished" podID="cf410553-1d0d-4507-9029-77fe224a5d7c" containerID="122945f9fd62e12d508b562ca4fe93930219a3d39619824a0394caf38c1db8ab" exitCode=0 Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.160057 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrh6c" event={"ID":"cf410553-1d0d-4507-9029-77fe224a5d7c","Type":"ContainerDied","Data":"122945f9fd62e12d508b562ca4fe93930219a3d39619824a0394caf38c1db8ab"} Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.186690 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.186751 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.186770 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.186793 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.186809 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:34Z","lastTransitionTime":"2026-03-19T16:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.289908 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.289962 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.289974 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.289992 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.290006 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:34Z","lastTransitionTime":"2026-03-19T16:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.393002 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.393084 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.393103 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.393129 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.393150 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:34Z","lastTransitionTime":"2026-03-19T16:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.496404 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.496458 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.496474 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.496494 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.496507 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:34Z","lastTransitionTime":"2026-03-19T16:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.585562 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.585591 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 16:41:34 crc kubenswrapper[4918]: E0319 16:41:34.585795 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.585932 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcgd2" Mar 19 16:41:34 crc kubenswrapper[4918]: E0319 16:41:34.586059 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 16:41:34 crc kubenswrapper[4918]: E0319 16:41:34.586252 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qcgd2" podUID="69770981-c309-4aa4-ba5a-29bf78372aae" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.598514 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.598583 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.598597 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.598617 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.598630 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:34Z","lastTransitionTime":"2026-03-19T16:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.701986 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.702057 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.702074 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.702097 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.702116 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:34Z","lastTransitionTime":"2026-03-19T16:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.805933 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.805997 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.806017 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.806044 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.806062 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:34Z","lastTransitionTime":"2026-03-19T16:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.912363 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.912431 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.912452 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.912483 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:34 crc kubenswrapper[4918]: I0319 16:41:34.912506 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:34Z","lastTransitionTime":"2026-03-19T16:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.015714 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.015793 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.015809 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.015826 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.015837 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:35Z","lastTransitionTime":"2026-03-19T16:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.118452 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.118498 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.118508 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.118521 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.118537 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:35Z","lastTransitionTime":"2026-03-19T16:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.197825 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrh6c" event={"ID":"cf410553-1d0d-4507-9029-77fe224a5d7c","Type":"ContainerStarted","Data":"e0f2d0d36187f5c855bc23537606b1ab448176efedbd595d2b3c195693d9a944"} Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.220136 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" event={"ID":"571f6589-a451-476a-9066-d348b85a81ac","Type":"ContainerStarted","Data":"e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e"} Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.220765 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.220841 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.220861 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.222739 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.222774 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.222788 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.222806 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.222821 4918 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:35Z","lastTransitionTime":"2026-03-19T16:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.249074 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-nrh6c" podStartSLOduration=39.249046939 podStartE2EDuration="39.249046939s" podCreationTimestamp="2026-03-19 16:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:35.248479003 +0000 UTC m=+107.370678251" watchObservedRunningTime="2026-03-19 16:41:35.249046939 +0000 UTC m=+107.371246207" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.261793 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.263138 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.284231 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" podStartSLOduration=39.28419871 podStartE2EDuration="39.28419871s" podCreationTimestamp="2026-03-19 16:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:35.283285563 +0000 UTC m=+107.405484821" watchObservedRunningTime="2026-03-19 16:41:35.28419871 +0000 UTC m=+107.406397968" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 
16:41:35.324856 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.324908 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.324920 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.324941 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.324953 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:35Z","lastTransitionTime":"2026-03-19T16:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.427359 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.427423 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.427444 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.427468 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.427503 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:35Z","lastTransitionTime":"2026-03-19T16:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.530186 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.530246 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.530262 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.530285 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.530305 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:35Z","lastTransitionTime":"2026-03-19T16:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.585612 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 16:41:35 crc kubenswrapper[4918]: E0319 16:41:35.585853 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.633809 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.633878 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.633895 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.633923 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.633942 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:35Z","lastTransitionTime":"2026-03-19T16:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.736791 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.736848 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.736866 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.736890 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.736907 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:35Z","lastTransitionTime":"2026-03-19T16:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.840114 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.840159 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.840173 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.840188 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.840200 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:35Z","lastTransitionTime":"2026-03-19T16:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.942921 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.943013 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.943032 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.943058 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:35 crc kubenswrapper[4918]: I0319 16:41:35.943079 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:35Z","lastTransitionTime":"2026-03-19T16:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.046288 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.046339 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.046348 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.046364 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.046376 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:36Z","lastTransitionTime":"2026-03-19T16:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.149380 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.149450 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.149472 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.149503 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.149578 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:36Z","lastTransitionTime":"2026-03-19T16:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.232865 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69770981-c309-4aa4-ba5a-29bf78372aae-metrics-certs\") pod \"network-metrics-daemon-qcgd2\" (UID: \"69770981-c309-4aa4-ba5a-29bf78372aae\") " pod="openshift-multus/network-metrics-daemon-qcgd2" Mar 19 16:41:36 crc kubenswrapper[4918]: E0319 16:41:36.233126 4918 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 16:41:36 crc kubenswrapper[4918]: E0319 16:41:36.233275 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69770981-c309-4aa4-ba5a-29bf78372aae-metrics-certs podName:69770981-c309-4aa4-ba5a-29bf78372aae nodeName:}" failed. No retries permitted until 2026-03-19 16:41:44.23324464 +0000 UTC m=+116.355443928 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/69770981-c309-4aa4-ba5a-29bf78372aae-metrics-certs") pod "network-metrics-daemon-qcgd2" (UID: "69770981-c309-4aa4-ba5a-29bf78372aae") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.252885 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.252985 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.253012 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.253043 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.253070 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:36Z","lastTransitionTime":"2026-03-19T16:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.358060 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.358104 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.358113 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.358129 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.358141 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:36Z","lastTransitionTime":"2026-03-19T16:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.461787 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.461841 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.461852 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.461874 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.461886 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:36Z","lastTransitionTime":"2026-03-19T16:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.564875 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.564936 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.564947 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.564962 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.564973 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:36Z","lastTransitionTime":"2026-03-19T16:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.585821 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.585816 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 16:41:36 crc kubenswrapper[4918]: E0319 16:41:36.585987 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 16:41:36 crc kubenswrapper[4918]: E0319 16:41:36.586118 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.585836 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcgd2" Mar 19 16:41:36 crc kubenswrapper[4918]: E0319 16:41:36.586327 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qcgd2" podUID="69770981-c309-4aa4-ba5a-29bf78372aae" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.668507 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.668584 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.668595 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.668612 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.668622 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:36Z","lastTransitionTime":"2026-03-19T16:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.771117 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.771185 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.771198 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.771217 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.771232 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:36Z","lastTransitionTime":"2026-03-19T16:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.872106 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.872163 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.872181 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.872207 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.872228 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:36Z","lastTransitionTime":"2026-03-19T16:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.927949 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.928032 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.928056 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.928088 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.928112 4918 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T16:41:36Z","lastTransitionTime":"2026-03-19T16:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.935397 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qcgd2"] Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.947272 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-9cvvg"] Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.947945 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9cvvg" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.953633 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.956784 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.957008 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 19 16:41:36 crc kubenswrapper[4918]: I0319 16:41:36.957376 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 19 16:41:37 crc kubenswrapper[4918]: I0319 16:41:37.042734 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2823e87c-4123-4fd4-8bbc-5df7c843777e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-9cvvg\" (UID: \"2823e87c-4123-4fd4-8bbc-5df7c843777e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9cvvg" Mar 19 16:41:37 crc kubenswrapper[4918]: I0319 16:41:37.042785 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2823e87c-4123-4fd4-8bbc-5df7c843777e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-9cvvg\" (UID: \"2823e87c-4123-4fd4-8bbc-5df7c843777e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9cvvg" Mar 19 16:41:37 crc kubenswrapper[4918]: I0319 16:41:37.042854 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2823e87c-4123-4fd4-8bbc-5df7c843777e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-9cvvg\" (UID: \"2823e87c-4123-4fd4-8bbc-5df7c843777e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9cvvg" Mar 19 16:41:37 crc kubenswrapper[4918]: I0319 16:41:37.042885 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2823e87c-4123-4fd4-8bbc-5df7c843777e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-9cvvg\" (UID: \"2823e87c-4123-4fd4-8bbc-5df7c843777e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9cvvg" Mar 19 16:41:37 crc kubenswrapper[4918]: I0319 16:41:37.043088 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2823e87c-4123-4fd4-8bbc-5df7c843777e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-9cvvg\" (UID: \"2823e87c-4123-4fd4-8bbc-5df7c843777e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9cvvg" Mar 19 16:41:37 crc kubenswrapper[4918]: I0319 16:41:37.143864 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2823e87c-4123-4fd4-8bbc-5df7c843777e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-9cvvg\" (UID: \"2823e87c-4123-4fd4-8bbc-5df7c843777e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9cvvg" Mar 19 16:41:37 crc kubenswrapper[4918]: I0319 16:41:37.143932 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2823e87c-4123-4fd4-8bbc-5df7c843777e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-9cvvg\" (UID: \"2823e87c-4123-4fd4-8bbc-5df7c843777e\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9cvvg" Mar 19 16:41:37 crc kubenswrapper[4918]: I0319 16:41:37.143958 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2823e87c-4123-4fd4-8bbc-5df7c843777e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-9cvvg\" (UID: \"2823e87c-4123-4fd4-8bbc-5df7c843777e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9cvvg" Mar 19 16:41:37 crc kubenswrapper[4918]: I0319 16:41:37.143978 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2823e87c-4123-4fd4-8bbc-5df7c843777e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-9cvvg\" (UID: \"2823e87c-4123-4fd4-8bbc-5df7c843777e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9cvvg" Mar 19 16:41:37 crc kubenswrapper[4918]: I0319 16:41:37.144016 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2823e87c-4123-4fd4-8bbc-5df7c843777e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-9cvvg\" (UID: \"2823e87c-4123-4fd4-8bbc-5df7c843777e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9cvvg" Mar 19 16:41:37 crc kubenswrapper[4918]: I0319 16:41:37.144010 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2823e87c-4123-4fd4-8bbc-5df7c843777e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-9cvvg\" (UID: \"2823e87c-4123-4fd4-8bbc-5df7c843777e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9cvvg" Mar 19 16:41:37 crc kubenswrapper[4918]: I0319 16:41:37.144079 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/2823e87c-4123-4fd4-8bbc-5df7c843777e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-9cvvg\" (UID: \"2823e87c-4123-4fd4-8bbc-5df7c843777e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9cvvg" Mar 19 16:41:37 crc kubenswrapper[4918]: I0319 16:41:37.145089 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2823e87c-4123-4fd4-8bbc-5df7c843777e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-9cvvg\" (UID: \"2823e87c-4123-4fd4-8bbc-5df7c843777e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9cvvg" Mar 19 16:41:37 crc kubenswrapper[4918]: I0319 16:41:37.152417 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2823e87c-4123-4fd4-8bbc-5df7c843777e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-9cvvg\" (UID: \"2823e87c-4123-4fd4-8bbc-5df7c843777e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9cvvg" Mar 19 16:41:37 crc kubenswrapper[4918]: I0319 16:41:37.164798 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2823e87c-4123-4fd4-8bbc-5df7c843777e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-9cvvg\" (UID: \"2823e87c-4123-4fd4-8bbc-5df7c843777e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9cvvg" Mar 19 16:41:37 crc kubenswrapper[4918]: I0319 16:41:37.228993 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcgd2" Mar 19 16:41:37 crc kubenswrapper[4918]: E0319 16:41:37.229216 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcgd2" podUID="69770981-c309-4aa4-ba5a-29bf78372aae" Mar 19 16:41:37 crc kubenswrapper[4918]: I0319 16:41:37.269411 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9cvvg" Mar 19 16:41:37 crc kubenswrapper[4918]: W0319 16:41:37.290578 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2823e87c_4123_4fd4_8bbc_5df7c843777e.slice/crio-63bb6dbf1466f77453a37f4d237c42c3ecdf3ac1736230d931ed0005292c7308 WatchSource:0}: Error finding container 63bb6dbf1466f77453a37f4d237c42c3ecdf3ac1736230d931ed0005292c7308: Status 404 returned error can't find the container with id 63bb6dbf1466f77453a37f4d237c42c3ecdf3ac1736230d931ed0005292c7308 Mar 19 16:41:37 crc kubenswrapper[4918]: I0319 16:41:37.572458 4918 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 19 16:41:37 crc kubenswrapper[4918]: I0319 16:41:37.584854 4918 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 19 16:41:37 crc kubenswrapper[4918]: I0319 16:41:37.585915 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 16:41:37 crc kubenswrapper[4918]: E0319 16:41:37.586053 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 16:41:38 crc kubenswrapper[4918]: I0319 16:41:38.236216 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9cvvg" event={"ID":"2823e87c-4123-4fd4-8bbc-5df7c843777e","Type":"ContainerStarted","Data":"f2b1e3404bfb54f119eb497fad90bf08a56064ce860089a3d09bdcc107d9e5f7"} Mar 19 16:41:38 crc kubenswrapper[4918]: I0319 16:41:38.236277 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9cvvg" event={"ID":"2823e87c-4123-4fd4-8bbc-5df7c843777e","Type":"ContainerStarted","Data":"63bb6dbf1466f77453a37f4d237c42c3ecdf3ac1736230d931ed0005292c7308"} Mar 19 16:41:38 crc kubenswrapper[4918]: I0319 16:41:38.585942 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 16:41:38 crc kubenswrapper[4918]: E0319 16:41:38.587219 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 16:41:38 crc kubenswrapper[4918]: I0319 16:41:38.587364 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcgd2" Mar 19 16:41:38 crc kubenswrapper[4918]: I0319 16:41:38.587387 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 16:41:38 crc kubenswrapper[4918]: E0319 16:41:38.587513 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcgd2" podUID="69770981-c309-4aa4-ba5a-29bf78372aae" Mar 19 16:41:38 crc kubenswrapper[4918]: E0319 16:41:38.587886 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.585638 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 16:41:39 crc kubenswrapper[4918]: E0319 16:41:39.585874 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.617892 4918 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.618166 4918 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.662079 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9cvvg" podStartSLOduration=43.662057655 podStartE2EDuration="43.662057655s" podCreationTimestamp="2026-03-19 16:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:38.263142061 +0000 UTC m=+110.385341349" watchObservedRunningTime="2026-03-19 16:41:39.662057655 +0000 UTC m=+111.784256903" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.662243 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h9xcq"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.662644 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.665942 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-84tp9"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.666161 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-gbfq6"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.666431 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-gbfq6" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.666751 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-84tp9" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.668252 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.668428 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.668608 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.668719 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.670485 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.670552 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.670737 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.671506 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.673014 4918 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.673600 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.673773 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.673942 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.673179 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.675365 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-bvkm6"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.675945 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-bvkm6"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.676688 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.677045 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.677129 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.677192 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.677237 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.677300 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.678436 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.682382 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.682724 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.683049 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.683769 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.700424 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.700566 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.700863 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.701195 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.701394 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.701554 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.701859 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.719398 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.724161 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mvf4s"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.725168 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mvf4s"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.733384 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-pz4gm"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.733944 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9fk82"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.734324 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9fk82"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.734834 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-pz4gm"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.740164 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.745787 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pmg4f"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.746377 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8shmz"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.746671 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.746675 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pmg4f"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.746849 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8shmz"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.749583 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-td7k5"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.750165 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mnwrv"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.750357 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-td7k5"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.750911 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.751116 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.751590 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.751672 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-mnwrv"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.782734 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbl8z"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.783649 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.783728 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-pz8dd"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.783910 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.783950 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.784347 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.784678 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.784804 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-pz8dd"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.785011 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbl8z"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.785910 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.786054 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5c6nk"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.786136 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.789312 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.790358 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.790705 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.790781 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hkhgj"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.791399 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.791769 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.792719 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-5c6nk"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.802477 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.802879 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.803010 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.803145 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.803196 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.805222 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hkhgj"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.803232 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.805140 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.803266 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.803312 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.803375 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.803427 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.803497 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.803921 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.803990 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.806910 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.806823 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-n26b6"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.804049 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.804200 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.804303 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.804409 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.804533 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.804667 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.804672 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.805582 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.808377 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-khrx9"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.811497 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h9xcq"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.812602 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-84tp9"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.812708 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wvklt"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.811798 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n26b6"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.811857 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-khrx9"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.813684 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-pz4gm"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.816435 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-gbfq6"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.816646 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-4rm5n"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.812013 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.814678 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wvklt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.817734 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-4rm5n"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.812094 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.826270 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.812148 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.824268 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24fgt"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.812209 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.812284 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.812350 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.812442 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.812469 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.812553 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.812565 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.812597 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.812634 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.812651 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.812700 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.812715 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.812768 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.812833 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.812894 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.827172 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbzd2"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.813173 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.827301 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24fgt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.818308 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.818520 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.818623 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.818624 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.818700 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.818749 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.820658 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.828257 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbzd2"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.830666 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.832893 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.833328 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.833483 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.833740 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.833748 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.833777 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.834212 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.834629 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.839634 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-q8kcw"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.842713 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.843223 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-q8kcw"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.846561 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.846863 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9fk82"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.848017 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7zczz"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.848091 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.855142 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mvf4s"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.855295 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7zczz"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.860621 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7hntq"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.861668 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7hntq"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.863955 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-9ft8l"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.867827 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.869261 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7lqkl"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.870113 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9ft8l"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.870967 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bsb9j"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.871831 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7lqkl"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.872931 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f8mlp"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.873173 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bsb9j"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.875792 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f8mlp"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.878557 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k4qc7"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.879281 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4ztb9"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.879507 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-k4qc7"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.880189 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-w2fmz"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.880317 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4ztb9"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.881147 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-w2fmz"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.881167 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mnwrv"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.882260 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-td7k5"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.883098 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gl4jw"]
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.883513 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqkbl\" (UniqueName: \"kubernetes.io/projected/4f9fa2d8-df01-417a-9170-d1e8288e4111-kube-api-access-hqkbl\") pod \"authentication-operator-69f744f599-gbfq6\" (UID: \"4f9fa2d8-df01-417a-9170-d1e8288e4111\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gbfq6"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.883583 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-audit\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.883603 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rj8j\" (UniqueName: \"kubernetes.io/projected/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-kube-api-access-5rj8j\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.883629 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml4xf\" (UniqueName: \"kubernetes.io/projected/3d68da4e-0dc1-4835-99f3-a5703db9288e-kube-api-access-ml4xf\") pod \"downloads-7954f5f757-pz4gm\" (UID: \"3d68da4e-0dc1-4835-99f3-a5703db9288e\") " pod="openshift-console/downloads-7954f5f757-pz4gm"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.883647 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22e4c597-7103-43b5-a54b-4a0cf131a749-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-9fk82\" (UID: \"22e4c597-7103-43b5-a54b-4a0cf131a749\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9fk82"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.883665 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxjhs\" (UniqueName: \"kubernetes.io/projected/d9b7f6a4-5987-4b92-b063-2ddf9ad42074-kube-api-access-fxjhs\") pod \"machine-api-operator-5694c8668f-pz8dd\" (UID: \"d9b7f6a4-5987-4b92-b063-2ddf9ad42074\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pz8dd"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.883683 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef184530-a1ee-415c-a683-0588bf7f3ffb-config\") pod \"controller-manager-879f6c89f-84tp9\" (UID: \"ef184530-a1ee-415c-a683-0588bf7f3ffb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-84tp9"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.883701 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2bedb86-27a0-40a4-a97c-10f1f287fc00-serving-cert\") pod \"apiserver-7bbb656c7d-kspkb\" (UID: \"f2bedb86-27a0-40a4-a97c-10f1f287fc00\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.883717 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f2bedb86-27a0-40a4-a97c-10f1f287fc00-encryption-config\") pod \"apiserver-7bbb656c7d-kspkb\" (UID: \"f2bedb86-27a0-40a4-a97c-10f1f287fc00\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.883740 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvnlv\" (UniqueName: \"kubernetes.io/projected/bb114c91-d63c-4b6e-927f-cb68c2dcf04f-kube-api-access-bvnlv\") pod \"cluster-image-registry-operator-dc59b4c8b-pmg4f\" (UID: \"bb114c91-d63c-4b6e-927f-cb68c2dcf04f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pmg4f"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.883763 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5189c318-e4b1-4dd9-9a6d-284425d319cf-trusted-ca-bundle\") pod \"console-f9d7485db-td7k5\" (UID: \"5189c318-e4b1-4dd9-9a6d-284425d319cf\") " pod="openshift-console/console-f9d7485db-td7k5"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.883783 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f9fa2d8-df01-417a-9170-d1e8288e4111-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-gbfq6\" (UID: \"4f9fa2d8-df01-417a-9170-d1e8288e4111\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gbfq6"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.883804 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a440b991-4ffb-4d2e-aa90-fa5e731d9cff-etcd-service-ca\") pod \"etcd-operator-b45778765-bvkm6\" (UID: \"a440b991-4ffb-4d2e-aa90-fa5e731d9cff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvkm6"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.883822 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-audit-dir\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.883839 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d9b7f6a4-5987-4b92-b063-2ddf9ad42074-images\") pod \"machine-api-operator-5694c8668f-pz8dd\" (UID: \"d9b7f6a4-5987-4b92-b063-2ddf9ad42074\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pz8dd"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.883854 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef184530-a1ee-415c-a683-0588bf7f3ffb-client-ca\") pod \"controller-manager-879f6c89f-84tp9\" (UID: \"ef184530-a1ee-415c-a683-0588bf7f3ffb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-84tp9"
Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.883873 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f2bedb86-27a0-40a4-a97c-10f1f287fc00-audit-policies\") pod \"apiserver-7bbb656c7d-kspkb\" (UID: \"f2bedb86-27a0-40a4-a97c-10f1f287fc00\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.883892 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5189c318-e4b1-4dd9-9a6d-284425d319cf-console-config\") pod \"console-f9d7485db-td7k5\" (UID: \"5189c318-e4b1-4dd9-9a6d-284425d319cf\") " pod="openshift-console/console-f9d7485db-td7k5" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.883913 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.883930 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a440b991-4ffb-4d2e-aa90-fa5e731d9cff-etcd-ca\") pod \"etcd-operator-b45778765-bvkm6\" (UID: \"a440b991-4ffb-4d2e-aa90-fa5e731d9cff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvkm6" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.883947 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a440b991-4ffb-4d2e-aa90-fa5e731d9cff-etcd-client\") pod \"etcd-operator-b45778765-bvkm6\" (UID: \"a440b991-4ffb-4d2e-aa90-fa5e731d9cff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvkm6" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.883964 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/f2bedb86-27a0-40a4-a97c-10f1f287fc00-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-kspkb\" (UID: \"f2bedb86-27a0-40a4-a97c-10f1f287fc00\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.883982 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.884003 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.884027 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.884046 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-audit-policies\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: 
\"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.884074 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5189c318-e4b1-4dd9-9a6d-284425d319cf-console-serving-cert\") pod \"console-f9d7485db-td7k5\" (UID: \"5189c318-e4b1-4dd9-9a6d-284425d319cf\") " pod="openshift-console/console-f9d7485db-td7k5" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.884150 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftkc8\" (UniqueName: \"kubernetes.io/projected/26c329d7-2138-4f3b-81cd-4b8c0a595a27-kube-api-access-ftkc8\") pod \"dns-operator-744455d44c-mvf4s\" (UID: \"26c329d7-2138-4f3b-81cd-4b8c0a595a27\") " pod="openshift-dns-operator/dns-operator-744455d44c-mvf4s" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.884249 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2fpr\" (UniqueName: \"kubernetes.io/projected/5189c318-e4b1-4dd9-9a6d-284425d319cf-kube-api-access-q2fpr\") pod \"console-f9d7485db-td7k5\" (UID: \"5189c318-e4b1-4dd9-9a6d-284425d319cf\") " pod="openshift-console/console-f9d7485db-td7k5" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.884321 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.884347 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-node-pullsecrets\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.884385 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trkkh\" (UniqueName: \"kubernetes.io/projected/42099723-6874-4d2d-a1ee-e7fd8db3f66c-kube-api-access-trkkh\") pod \"openshift-config-operator-7777fb866f-8shmz\" (UID: \"42099723-6874-4d2d-a1ee-e7fd8db3f66c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8shmz" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.884422 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.884451 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.884472 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7fj9\" (UniqueName: \"kubernetes.io/projected/ef184530-a1ee-415c-a683-0588bf7f3ffb-kube-api-access-r7fj9\") pod \"controller-manager-879f6c89f-84tp9\" (UID: 
\"ef184530-a1ee-415c-a683-0588bf7f3ffb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-84tp9" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.884503 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8md7h\" (UniqueName: \"kubernetes.io/projected/117e7cf2-e68f-423d-a312-f1d63c3b815b-kube-api-access-8md7h\") pod \"cluster-samples-operator-665b6dd947-wbl8z\" (UID: \"117e7cf2-e68f-423d-a312-f1d63c3b815b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbl8z" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.884546 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb114c91-d63c-4b6e-927f-cb68c2dcf04f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pmg4f\" (UID: \"bb114c91-d63c-4b6e-927f-cb68c2dcf04f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pmg4f" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.884566 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-serving-cert\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.884573 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.884591 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mnwrv\" (UID: 
\"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.884616 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb114c91-d63c-4b6e-927f-cb68c2dcf04f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pmg4f\" (UID: \"bb114c91-d63c-4b6e-927f-cb68c2dcf04f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pmg4f" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.884634 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-config\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.884650 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5189c318-e4b1-4dd9-9a6d-284425d319cf-service-ca\") pod \"console-f9d7485db-td7k5\" (UID: \"5189c318-e4b1-4dd9-9a6d-284425d319cf\") " pod="openshift-console/console-f9d7485db-td7k5" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.884673 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2bedb86-27a0-40a4-a97c-10f1f287fc00-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-kspkb\" (UID: \"f2bedb86-27a0-40a4-a97c-10f1f287fc00\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.884693 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4f9fa2d8-df01-417a-9170-d1e8288e4111-config\") pod \"authentication-operator-69f744f599-gbfq6\" (UID: \"4f9fa2d8-df01-417a-9170-d1e8288e4111\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gbfq6" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.884718 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f2bedb86-27a0-40a4-a97c-10f1f287fc00-audit-dir\") pod \"apiserver-7bbb656c7d-kspkb\" (UID: \"f2bedb86-27a0-40a4-a97c-10f1f287fc00\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.884760 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42099723-6874-4d2d-a1ee-e7fd8db3f66c-serving-cert\") pod \"openshift-config-operator-7777fb866f-8shmz\" (UID: \"42099723-6874-4d2d-a1ee-e7fd8db3f66c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8shmz" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.884826 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef184530-a1ee-415c-a683-0588bf7f3ffb-serving-cert\") pod \"controller-manager-879f6c89f-84tp9\" (UID: \"ef184530-a1ee-415c-a683-0588bf7f3ffb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-84tp9" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.884881 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9b7f6a4-5987-4b92-b063-2ddf9ad42074-config\") pod \"machine-api-operator-5694c8668f-pz8dd\" (UID: \"d9b7f6a4-5987-4b92-b063-2ddf9ad42074\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pz8dd" Mar 19 16:41:39 crc 
kubenswrapper[4918]: I0319 16:41:39.884908 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.884935 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-etcd-client\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.885381 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vtw7s"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.885764 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gl4jw" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.886146 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pmg4f"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.886234 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vtw7s" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.886661 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vbxl\" (UniqueName: \"kubernetes.io/projected/f2bedb86-27a0-40a4-a97c-10f1f287fc00-kube-api-access-8vbxl\") pod \"apiserver-7bbb656c7d-kspkb\" (UID: \"f2bedb86-27a0-40a4-a97c-10f1f287fc00\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.886726 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/117e7cf2-e68f-423d-a312-f1d63c3b815b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wbl8z\" (UID: \"117e7cf2-e68f-423d-a312-f1d63c3b815b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbl8z" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.886760 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.886988 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a440b991-4ffb-4d2e-aa90-fa5e731d9cff-serving-cert\") pod \"etcd-operator-b45778765-bvkm6\" (UID: \"a440b991-4ffb-4d2e-aa90-fa5e731d9cff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvkm6" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.887041 4918 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f9fa2d8-df01-417a-9170-d1e8288e4111-service-ca-bundle\") pod \"authentication-operator-69f744f599-gbfq6\" (UID: \"4f9fa2d8-df01-417a-9170-d1e8288e4111\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gbfq6" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.887082 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-image-import-ca\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.887114 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/42099723-6874-4d2d-a1ee-e7fd8db3f66c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8shmz\" (UID: \"42099723-6874-4d2d-a1ee-e7fd8db3f66c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8shmz" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.887153 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9b7f6a4-5987-4b92-b063-2ddf9ad42074-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-pz8dd\" (UID: \"d9b7f6a4-5987-4b92-b063-2ddf9ad42074\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pz8dd" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.887205 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-92pms"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.887277 4918 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-audit-dir\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.887331 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef184530-a1ee-415c-a683-0588bf7f3ffb-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-84tp9\" (UID: \"ef184530-a1ee-415c-a683-0588bf7f3ffb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-84tp9" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.887400 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f2bedb86-27a0-40a4-a97c-10f1f287fc00-etcd-client\") pod \"apiserver-7bbb656c7d-kspkb\" (UID: \"f2bedb86-27a0-40a4-a97c-10f1f287fc00\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.887446 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f9fa2d8-df01-417a-9170-d1e8288e4111-serving-cert\") pod \"authentication-operator-69f744f599-gbfq6\" (UID: \"4f9fa2d8-df01-417a-9170-d1e8288e4111\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gbfq6" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.887557 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: 
\"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.887595 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bh8d\" (UniqueName: \"kubernetes.io/projected/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-kube-api-access-7bh8d\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.887635 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-etcd-serving-ca\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.887670 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a440b991-4ffb-4d2e-aa90-fa5e731d9cff-config\") pod \"etcd-operator-b45778765-bvkm6\" (UID: \"a440b991-4ffb-4d2e-aa90-fa5e731d9cff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvkm6" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.887702 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5189c318-e4b1-4dd9-9a6d-284425d319cf-console-oauth-config\") pod \"console-f9d7485db-td7k5\" (UID: \"5189c318-e4b1-4dd9-9a6d-284425d319cf\") " pod="openshift-console/console-f9d7485db-td7k5" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.887732 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/5189c318-e4b1-4dd9-9a6d-284425d319cf-oauth-serving-cert\") pod \"console-f9d7485db-td7k5\" (UID: \"5189c318-e4b1-4dd9-9a6d-284425d319cf\") " pod="openshift-console/console-f9d7485db-td7k5" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.887791 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.887841 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb114c91-d63c-4b6e-927f-cb68c2dcf04f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pmg4f\" (UID: \"bb114c91-d63c-4b6e-927f-cb68c2dcf04f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pmg4f" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.887883 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp7zf\" (UniqueName: \"kubernetes.io/projected/a440b991-4ffb-4d2e-aa90-fa5e731d9cff-kube-api-access-hp7zf\") pod \"etcd-operator-b45778765-bvkm6\" (UID: \"a440b991-4ffb-4d2e-aa90-fa5e731d9cff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvkm6" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.887914 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-encryption-config\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " 
pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.887942 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-fcbfm"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.887972 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26c329d7-2138-4f3b-81cd-4b8c0a595a27-metrics-tls\") pod \"dns-operator-744455d44c-mvf4s\" (UID: \"26c329d7-2138-4f3b-81cd-4b8c0a595a27\") " pod="openshift-dns-operator/dns-operator-744455d44c-mvf4s" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.888012 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22e4c597-7103-43b5-a54b-4a0cf131a749-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-9fk82\" (UID: \"22e4c597-7103-43b5-a54b-4a0cf131a749\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9fk82" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.888054 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnjwz\" (UniqueName: \"kubernetes.io/projected/22e4c597-7103-43b5-a54b-4a0cf131a749-kube-api-access-cnjwz\") pod \"openshift-controller-manager-operator-756b6f6bc6-9fk82\" (UID: \"22e4c597-7103-43b5-a54b-4a0cf131a749\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9fk82" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.888269 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-92pms" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.888425 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fcbfm" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.888710 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565630-wz4vn"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.889723 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-clcn2"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.890460 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565630-wz4vn" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.890474 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-clcn2" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.891633 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-khrx9"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.892284 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-bvkm6"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.896795 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbzd2"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.896830 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5c6nk"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.896841 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.896854 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bsb9j"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.897256 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wvklt"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.898913 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-9ft8l"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.899778 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24fgt"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.904503 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.905179 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.907698 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbl8z"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.909406 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-pz8dd"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.911007 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vtw7s"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.916408 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gl4jw"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.918117 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8shmz"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.920083 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7hntq"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.920736 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7lqkl"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.922881 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7zczz"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.922947 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-92pms"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.924623 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.932954 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f8mlp"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.935280 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-q8kcw"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.937592 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-clcn2"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.940530 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hkhgj"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.941971 4918 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/machine-config-server-nfhcd"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.942743 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-nfhcd" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.943394 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fcbfm"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.944689 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.944911 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jl5dc"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.950614 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4ztb9"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.950770 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-jl5dc" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.953117 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-w2fmz"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.957362 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k4qc7"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.961143 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jl5dc"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.962173 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565630-wz4vn"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.963681 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-cvm4l"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.964806 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-cvm4l" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.965202 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jb6n7"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.965655 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.966412 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-jb6n7" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.966922 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jb6n7"] Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.985673 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.988819 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef184530-a1ee-415c-a683-0588bf7f3ffb-serving-cert\") pod \"controller-manager-879f6c89f-84tp9\" (UID: \"ef184530-a1ee-415c-a683-0588bf7f3ffb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-84tp9" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.988891 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf-apiservice-cert\") pod \"packageserver-d55dfcdfc-vtw7s\" (UID: \"dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vtw7s" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.988927 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0a8c78b-296b-4baa-91f2-bb9696fdc134-config\") pod \"openshift-apiserver-operator-796bbdcf4f-hkhgj\" (UID: \"f0a8c78b-296b-4baa-91f2-bb9696fdc134\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hkhgj" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.988989 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-service-ca\") 
pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.989041 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-etcd-client\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.989075 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9b7f6a4-5987-4b92-b063-2ddf9ad42074-config\") pod \"machine-api-operator-5694c8668f-pz8dd\" (UID: \"d9b7f6a4-5987-4b92-b063-2ddf9ad42074\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pz8dd" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.989135 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vbxl\" (UniqueName: \"kubernetes.io/projected/f2bedb86-27a0-40a4-a97c-10f1f287fc00-kube-api-access-8vbxl\") pod \"apiserver-7bbb656c7d-kspkb\" (UID: \"f2bedb86-27a0-40a4-a97c-10f1f287fc00\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.989169 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/117e7cf2-e68f-423d-a312-f1d63c3b815b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wbl8z\" (UID: \"117e7cf2-e68f-423d-a312-f1d63c3b815b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbl8z" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.989225 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" 
(UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.989259 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a440b991-4ffb-4d2e-aa90-fa5e731d9cff-serving-cert\") pod \"etcd-operator-b45778765-bvkm6\" (UID: \"a440b991-4ffb-4d2e-aa90-fa5e731d9cff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvkm6" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.989329 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf-tmpfs\") pod \"packageserver-d55dfcdfc-vtw7s\" (UID: \"dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vtw7s" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.989387 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f9fa2d8-df01-417a-9170-d1e8288e4111-service-ca-bundle\") pod \"authentication-operator-69f744f599-gbfq6\" (UID: \"4f9fa2d8-df01-417a-9170-d1e8288e4111\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gbfq6" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.989480 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-image-import-ca\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.989513 4918 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/42099723-6874-4d2d-a1ee-e7fd8db3f66c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8shmz\" (UID: \"42099723-6874-4d2d-a1ee-e7fd8db3f66c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8shmz" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.989582 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9b7f6a4-5987-4b92-b063-2ddf9ad42074-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-pz8dd\" (UID: \"d9b7f6a4-5987-4b92-b063-2ddf9ad42074\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pz8dd" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.989642 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-audit-dir\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.989731 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef184530-a1ee-415c-a683-0588bf7f3ffb-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-84tp9\" (UID: \"ef184530-a1ee-415c-a683-0588bf7f3ffb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-84tp9" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.989762 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f2bedb86-27a0-40a4-a97c-10f1f287fc00-etcd-client\") pod \"apiserver-7bbb656c7d-kspkb\" (UID: \"f2bedb86-27a0-40a4-a97c-10f1f287fc00\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.989818 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f9fa2d8-df01-417a-9170-d1e8288e4111-serving-cert\") pod \"authentication-operator-69f744f599-gbfq6\" (UID: \"4f9fa2d8-df01-417a-9170-d1e8288e4111\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gbfq6" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.989847 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.989898 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bh8d\" (UniqueName: \"kubernetes.io/projected/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-kube-api-access-7bh8d\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.989926 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-etcd-serving-ca\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.989989 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/5189c318-e4b1-4dd9-9a6d-284425d319cf-console-oauth-config\") pod \"console-f9d7485db-td7k5\" (UID: \"5189c318-e4b1-4dd9-9a6d-284425d319cf\") " pod="openshift-console/console-f9d7485db-td7k5" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.990026 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5189c318-e4b1-4dd9-9a6d-284425d319cf-oauth-serving-cert\") pod \"console-f9d7485db-td7k5\" (UID: \"5189c318-e4b1-4dd9-9a6d-284425d319cf\") " pod="openshift-console/console-f9d7485db-td7k5" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.990086 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fab39d74-56e2-462a-858d-a255438e06ef-srv-cert\") pod \"catalog-operator-68c6474976-f8mlp\" (UID: \"fab39d74-56e2-462a-858d-a255438e06ef\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f8mlp" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.990139 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.990173 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a440b991-4ffb-4d2e-aa90-fa5e731d9cff-config\") pod \"etcd-operator-b45778765-bvkm6\" (UID: \"a440b991-4ffb-4d2e-aa90-fa5e731d9cff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvkm6" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.990230 4918 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb114c91-d63c-4b6e-927f-cb68c2dcf04f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pmg4f\" (UID: \"bb114c91-d63c-4b6e-927f-cb68c2dcf04f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pmg4f" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.990265 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp7zf\" (UniqueName: \"kubernetes.io/projected/a440b991-4ffb-4d2e-aa90-fa5e731d9cff-kube-api-access-hp7zf\") pod \"etcd-operator-b45778765-bvkm6\" (UID: \"a440b991-4ffb-4d2e-aa90-fa5e731d9cff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvkm6" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.990318 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-encryption-config\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.990345 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ba57cc0-b212-4e46-a3b6-98fba822c17d-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-7zczz\" (UID: \"4ba57cc0-b212-4e46-a3b6-98fba822c17d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7zczz" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.990396 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26c329d7-2138-4f3b-81cd-4b8c0a595a27-metrics-tls\") pod \"dns-operator-744455d44c-mvf4s\" (UID: \"26c329d7-2138-4f3b-81cd-4b8c0a595a27\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-mvf4s" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.990424 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22e4c597-7103-43b5-a54b-4a0cf131a749-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-9fk82\" (UID: \"22e4c597-7103-43b5-a54b-4a0cf131a749\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9fk82" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.990466 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnjwz\" (UniqueName: \"kubernetes.io/projected/22e4c597-7103-43b5-a54b-4a0cf131a749-kube-api-access-cnjwz\") pod \"openshift-controller-manager-operator-756b6f6bc6-9fk82\" (UID: \"22e4c597-7103-43b5-a54b-4a0cf131a749\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9fk82" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.990494 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqkbl\" (UniqueName: \"kubernetes.io/projected/4f9fa2d8-df01-417a-9170-d1e8288e4111-kube-api-access-hqkbl\") pod \"authentication-operator-69f744f599-gbfq6\" (UID: \"4f9fa2d8-df01-417a-9170-d1e8288e4111\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gbfq6" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.990543 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-audit\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.990575 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5rj8j\" (UniqueName: \"kubernetes.io/projected/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-kube-api-access-5rj8j\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.990622 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22e4c597-7103-43b5-a54b-4a0cf131a749-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-9fk82\" (UID: \"22e4c597-7103-43b5-a54b-4a0cf131a749\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9fk82" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.990650 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxjhs\" (UniqueName: \"kubernetes.io/projected/d9b7f6a4-5987-4b92-b063-2ddf9ad42074-kube-api-access-fxjhs\") pod \"machine-api-operator-5694c8668f-pz8dd\" (UID: \"d9b7f6a4-5987-4b92-b063-2ddf9ad42074\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pz8dd" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.990672 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0a8c78b-296b-4baa-91f2-bb9696fdc134-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-hkhgj\" (UID: \"f0a8c78b-296b-4baa-91f2-bb9696fdc134\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hkhgj" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.990748 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef184530-a1ee-415c-a683-0588bf7f3ffb-config\") pod \"controller-manager-879f6c89f-84tp9\" (UID: \"ef184530-a1ee-415c-a683-0588bf7f3ffb\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-84tp9" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.990804 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml4xf\" (UniqueName: \"kubernetes.io/projected/3d68da4e-0dc1-4835-99f3-a5703db9288e-kube-api-access-ml4xf\") pod \"downloads-7954f5f757-pz4gm\" (UID: \"3d68da4e-0dc1-4835-99f3-a5703db9288e\") " pod="openshift-console/downloads-7954f5f757-pz4gm" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.990831 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f2bedb86-27a0-40a4-a97c-10f1f287fc00-encryption-config\") pod \"apiserver-7bbb656c7d-kspkb\" (UID: \"f2bedb86-27a0-40a4-a97c-10f1f287fc00\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.990876 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvnlv\" (UniqueName: \"kubernetes.io/projected/bb114c91-d63c-4b6e-927f-cb68c2dcf04f-kube-api-access-bvnlv\") pod \"cluster-image-registry-operator-dc59b4c8b-pmg4f\" (UID: \"bb114c91-d63c-4b6e-927f-cb68c2dcf04f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pmg4f" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.990903 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5189c318-e4b1-4dd9-9a6d-284425d319cf-trusted-ca-bundle\") pod \"console-f9d7485db-td7k5\" (UID: \"5189c318-e4b1-4dd9-9a6d-284425d319cf\") " pod="openshift-console/console-f9d7485db-td7k5" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.990947 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfvww\" (UniqueName: 
\"kubernetes.io/projected/dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf-kube-api-access-hfvww\") pod \"packageserver-d55dfcdfc-vtw7s\" (UID: \"dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vtw7s" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.990977 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2bedb86-27a0-40a4-a97c-10f1f287fc00-serving-cert\") pod \"apiserver-7bbb656c7d-kspkb\" (UID: \"f2bedb86-27a0-40a4-a97c-10f1f287fc00\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.991031 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ff332c4-5e3a-4d2d-a694-870559724211-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-7lqkl\" (UID: \"9ff332c4-5e3a-4d2d-a694-870559724211\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7lqkl" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.991071 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a440b991-4ffb-4d2e-aa90-fa5e731d9cff-etcd-service-ca\") pod \"etcd-operator-b45778765-bvkm6\" (UID: \"a440b991-4ffb-4d2e-aa90-fa5e731d9cff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvkm6" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.991130 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-audit-dir\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 
16:41:39.991162 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d9b7f6a4-5987-4b92-b063-2ddf9ad42074-images\") pod \"machine-api-operator-5694c8668f-pz8dd\" (UID: \"d9b7f6a4-5987-4b92-b063-2ddf9ad42074\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pz8dd" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.991217 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef184530-a1ee-415c-a683-0588bf7f3ffb-client-ca\") pod \"controller-manager-879f6c89f-84tp9\" (UID: \"ef184530-a1ee-415c-a683-0588bf7f3ffb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-84tp9" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.991246 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f9fa2d8-df01-417a-9170-d1e8288e4111-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-gbfq6\" (UID: \"4f9fa2d8-df01-417a-9170-d1e8288e4111\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gbfq6" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.991306 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7kts\" (UniqueName: \"kubernetes.io/projected/fab39d74-56e2-462a-858d-a255438e06ef-kube-api-access-p7kts\") pod \"catalog-operator-68c6474976-f8mlp\" (UID: \"fab39d74-56e2-462a-858d-a255438e06ef\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f8mlp" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.991363 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f2bedb86-27a0-40a4-a97c-10f1f287fc00-audit-policies\") pod \"apiserver-7bbb656c7d-kspkb\" (UID: 
\"f2bedb86-27a0-40a4-a97c-10f1f287fc00\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.991398 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.991451 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a440b991-4ffb-4d2e-aa90-fa5e731d9cff-etcd-ca\") pod \"etcd-operator-b45778765-bvkm6\" (UID: \"a440b991-4ffb-4d2e-aa90-fa5e731d9cff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvkm6" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.991481 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a440b991-4ffb-4d2e-aa90-fa5e731d9cff-etcd-client\") pod \"etcd-operator-b45778765-bvkm6\" (UID: \"a440b991-4ffb-4d2e-aa90-fa5e731d9cff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvkm6" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.991554 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5189c318-e4b1-4dd9-9a6d-284425d319cf-console-config\") pod \"console-f9d7485db-td7k5\" (UID: \"5189c318-e4b1-4dd9-9a6d-284425d319cf\") " pod="openshift-console/console-f9d7485db-td7k5" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.991620 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f2bedb86-27a0-40a4-a97c-10f1f287fc00-etcd-serving-ca\") pod 
\"apiserver-7bbb656c7d-kspkb\" (UID: \"f2bedb86-27a0-40a4-a97c-10f1f287fc00\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.991655 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.991717 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8848d8b1-3fa3-4d27-b9d7-803e2a884bfc-serving-cert\") pod \"console-operator-58897d9998-5c6nk\" (UID: \"8848d8b1-3fa3-4d27-b9d7-803e2a884bfc\") " pod="openshift-console-operator/console-operator-58897d9998-5c6nk" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.991777 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.991809 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.991864 4918 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-audit-policies\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.991897 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5189c318-e4b1-4dd9-9a6d-284425d319cf-console-serving-cert\") pod \"console-f9d7485db-td7k5\" (UID: \"5189c318-e4b1-4dd9-9a6d-284425d319cf\") " pod="openshift-console/console-f9d7485db-td7k5" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.991979 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftkc8\" (UniqueName: \"kubernetes.io/projected/26c329d7-2138-4f3b-81cd-4b8c0a595a27-kube-api-access-ftkc8\") pod \"dns-operator-744455d44c-mvf4s\" (UID: \"26c329d7-2138-4f3b-81cd-4b8c0a595a27\") " pod="openshift-dns-operator/dns-operator-744455d44c-mvf4s" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.992038 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2fpr\" (UniqueName: \"kubernetes.io/projected/5189c318-e4b1-4dd9-9a6d-284425d319cf-kube-api-access-q2fpr\") pod \"console-f9d7485db-td7k5\" (UID: \"5189c318-e4b1-4dd9-9a6d-284425d319cf\") " pod="openshift-console/console-f9d7485db-td7k5" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.992112 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlbg6\" (UniqueName: \"kubernetes.io/projected/f0a8c78b-296b-4baa-91f2-bb9696fdc134-kube-api-access-wlbg6\") pod \"openshift-apiserver-operator-796bbdcf4f-hkhgj\" (UID: \"f0a8c78b-296b-4baa-91f2-bb9696fdc134\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hkhgj" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.992151 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-audit-dir\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.992145 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fab39d74-56e2-462a-858d-a255438e06ef-profile-collector-cert\") pod \"catalog-operator-68c6474976-f8mlp\" (UID: \"fab39d74-56e2-462a-858d-a255438e06ef\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f8mlp" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.992227 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.992284 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-node-pullsecrets\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.992315 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trkkh\" (UniqueName: 
\"kubernetes.io/projected/42099723-6874-4d2d-a1ee-e7fd8db3f66c-kube-api-access-trkkh\") pod \"openshift-config-operator-7777fb866f-8shmz\" (UID: \"42099723-6874-4d2d-a1ee-e7fd8db3f66c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8shmz" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.992374 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmr29\" (UniqueName: \"kubernetes.io/projected/8848d8b1-3fa3-4d27-b9d7-803e2a884bfc-kube-api-access-xmr29\") pod \"console-operator-58897d9998-5c6nk\" (UID: \"8848d8b1-3fa3-4d27-b9d7-803e2a884bfc\") " pod="openshift-console-operator/console-operator-58897d9998-5c6nk" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.992432 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhz98\" (UniqueName: \"kubernetes.io/projected/9ff332c4-5e3a-4d2d-a694-870559724211-kube-api-access-vhz98\") pod \"control-plane-machine-set-operator-78cbb6b69f-7lqkl\" (UID: \"9ff332c4-5e3a-4d2d-a694-870559724211\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7lqkl" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.992466 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.992514 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7fj9\" (UniqueName: \"kubernetes.io/projected/ef184530-a1ee-415c-a683-0588bf7f3ffb-kube-api-access-r7fj9\") pod \"controller-manager-879f6c89f-84tp9\" (UID: \"ef184530-a1ee-415c-a683-0588bf7f3ffb\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-84tp9" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.992597 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8md7h\" (UniqueName: \"kubernetes.io/projected/117e7cf2-e68f-423d-a312-f1d63c3b815b-kube-api-access-8md7h\") pod \"cluster-samples-operator-665b6dd947-wbl8z\" (UID: \"117e7cf2-e68f-423d-a312-f1d63c3b815b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbl8z" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.992659 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.992694 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb114c91-d63c-4b6e-927f-cb68c2dcf04f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pmg4f\" (UID: \"bb114c91-d63c-4b6e-927f-cb68c2dcf04f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pmg4f" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.992757 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-serving-cert\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.992816 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.992856 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf-webhook-cert\") pod \"packageserver-d55dfcdfc-vtw7s\" (UID: \"dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vtw7s" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.992918 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-config\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.992948 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5189c318-e4b1-4dd9-9a6d-284425d319cf-service-ca\") pod \"console-f9d7485db-td7k5\" (UID: \"5189c318-e4b1-4dd9-9a6d-284425d319cf\") " pod="openshift-console/console-f9d7485db-td7k5" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.993005 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8848d8b1-3fa3-4d27-b9d7-803e2a884bfc-config\") pod \"console-operator-58897d9998-5c6nk\" (UID: \"8848d8b1-3fa3-4d27-b9d7-803e2a884bfc\") " pod="openshift-console-operator/console-operator-58897d9998-5c6nk" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.993073 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/f2bedb86-27a0-40a4-a97c-10f1f287fc00-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-kspkb\" (UID: \"f2bedb86-27a0-40a4-a97c-10f1f287fc00\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.993104 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb114c91-d63c-4b6e-927f-cb68c2dcf04f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pmg4f\" (UID: \"bb114c91-d63c-4b6e-927f-cb68c2dcf04f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pmg4f" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.993155 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8848d8b1-3fa3-4d27-b9d7-803e2a884bfc-trusted-ca\") pod \"console-operator-58897d9998-5c6nk\" (UID: \"8848d8b1-3fa3-4d27-b9d7-803e2a884bfc\") " pod="openshift-console-operator/console-operator-58897d9998-5c6nk" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.993229 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f9fa2d8-df01-417a-9170-d1e8288e4111-config\") pod \"authentication-operator-69f744f599-gbfq6\" (UID: \"4f9fa2d8-df01-417a-9170-d1e8288e4111\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gbfq6" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.993260 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42099723-6874-4d2d-a1ee-e7fd8db3f66c-serving-cert\") pod \"openshift-config-operator-7777fb866f-8shmz\" (UID: \"42099723-6874-4d2d-a1ee-e7fd8db3f66c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8shmz" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 
16:41:39.993284 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ba57cc0-b212-4e46-a3b6-98fba822c17d-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-7zczz\" (UID: \"4ba57cc0-b212-4e46-a3b6-98fba822c17d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7zczz" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.993339 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ba57cc0-b212-4e46-a3b6-98fba822c17d-config\") pod \"kube-apiserver-operator-766d6c64bb-7zczz\" (UID: \"4ba57cc0-b212-4e46-a3b6-98fba822c17d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7zczz" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.993423 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f2bedb86-27a0-40a4-a97c-10f1f287fc00-audit-dir\") pod \"apiserver-7bbb656c7d-kspkb\" (UID: \"f2bedb86-27a0-40a4-a97c-10f1f287fc00\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.993733 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22e4c597-7103-43b5-a54b-4a0cf131a749-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-9fk82\" (UID: \"22e4c597-7103-43b5-a54b-4a0cf131a749\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9fk82" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.993800 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-audit\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " 
pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.994211 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef184530-a1ee-415c-a683-0588bf7f3ffb-config\") pod \"controller-manager-879f6c89f-84tp9\" (UID: \"ef184530-a1ee-415c-a683-0588bf7f3ffb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-84tp9" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.994341 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5189c318-e4b1-4dd9-9a6d-284425d319cf-trusted-ca-bundle\") pod \"console-f9d7485db-td7k5\" (UID: \"5189c318-e4b1-4dd9-9a6d-284425d319cf\") " pod="openshift-console/console-f9d7485db-td7k5" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.994475 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-node-pullsecrets\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.995277 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.996446 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: 
\"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.997830 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-audit-policies\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.997988 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26c329d7-2138-4f3b-81cd-4b8c0a595a27-metrics-tls\") pod \"dns-operator-744455d44c-mvf4s\" (UID: \"26c329d7-2138-4f3b-81cd-4b8c0a595a27\") " pod="openshift-dns-operator/dns-operator-744455d44c-mvf4s" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.998651 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef184530-a1ee-415c-a683-0588bf7f3ffb-serving-cert\") pod \"controller-manager-879f6c89f-84tp9\" (UID: \"ef184530-a1ee-415c-a683-0588bf7f3ffb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-84tp9" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.999152 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d9b7f6a4-5987-4b92-b063-2ddf9ad42074-images\") pod \"machine-api-operator-5694c8668f-pz8dd\" (UID: \"d9b7f6a4-5987-4b92-b063-2ddf9ad42074\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pz8dd" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.999305 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f2bedb86-27a0-40a4-a97c-10f1f287fc00-audit-dir\") pod \"apiserver-7bbb656c7d-kspkb\" (UID: 
\"f2bedb86-27a0-40a4-a97c-10f1f287fc00\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.999633 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5189c318-e4b1-4dd9-9a6d-284425d319cf-service-ca\") pod \"console-f9d7485db-td7k5\" (UID: \"5189c318-e4b1-4dd9-9a6d-284425d319cf\") " pod="openshift-console/console-f9d7485db-td7k5" Mar 19 16:41:39 crc kubenswrapper[4918]: I0319 16:41:39.999678 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.000403 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-config\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.000754 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.000991 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5189c318-e4b1-4dd9-9a6d-284425d319cf-console-config\") pod \"console-f9d7485db-td7k5\" (UID: 
\"5189c318-e4b1-4dd9-9a6d-284425d319cf\") " pod="openshift-console/console-f9d7485db-td7k5" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.001046 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5189c318-e4b1-4dd9-9a6d-284425d319cf-console-serving-cert\") pod \"console-f9d7485db-td7k5\" (UID: \"5189c318-e4b1-4dd9-9a6d-284425d319cf\") " pod="openshift-console/console-f9d7485db-td7k5" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.001458 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f2bedb86-27a0-40a4-a97c-10f1f287fc00-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-kspkb\" (UID: \"f2bedb86-27a0-40a4-a97c-10f1f287fc00\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.001734 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a440b991-4ffb-4d2e-aa90-fa5e731d9cff-etcd-service-ca\") pod \"etcd-operator-b45778765-bvkm6\" (UID: \"a440b991-4ffb-4d2e-aa90-fa5e731d9cff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvkm6" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.001732 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9b7f6a4-5987-4b92-b063-2ddf9ad42074-config\") pod \"machine-api-operator-5694c8668f-pz8dd\" (UID: \"d9b7f6a4-5987-4b92-b063-2ddf9ad42074\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pz8dd" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.002136 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/42099723-6874-4d2d-a1ee-e7fd8db3f66c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8shmz\" (UID: 
\"42099723-6874-4d2d-a1ee-e7fd8db3f66c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8shmz" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.002150 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a440b991-4ffb-4d2e-aa90-fa5e731d9cff-etcd-ca\") pod \"etcd-operator-b45778765-bvkm6\" (UID: \"a440b991-4ffb-4d2e-aa90-fa5e731d9cff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvkm6" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.002270 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.003258 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a440b991-4ffb-4d2e-aa90-fa5e731d9cff-config\") pod \"etcd-operator-b45778765-bvkm6\" (UID: \"a440b991-4ffb-4d2e-aa90-fa5e731d9cff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvkm6" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.004972 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f9fa2d8-df01-417a-9170-d1e8288e4111-service-ca-bundle\") pod \"authentication-operator-69f744f599-gbfq6\" (UID: \"4f9fa2d8-df01-417a-9170-d1e8288e4111\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gbfq6" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.005312 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f9fa2d8-df01-417a-9170-d1e8288e4111-config\") pod 
\"authentication-operator-69f744f599-gbfq6\" (UID: \"4f9fa2d8-df01-417a-9170-d1e8288e4111\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gbfq6" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.005404 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5189c318-e4b1-4dd9-9a6d-284425d319cf-oauth-serving-cert\") pod \"console-f9d7485db-td7k5\" (UID: \"5189c318-e4b1-4dd9-9a6d-284425d319cf\") " pod="openshift-console/console-f9d7485db-td7k5" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.005498 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.005740 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-audit-dir\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.005842 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f2bedb86-27a0-40a4-a97c-10f1f287fc00-audit-policies\") pod \"apiserver-7bbb656c7d-kspkb\" (UID: \"f2bedb86-27a0-40a4-a97c-10f1f287fc00\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.006179 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2bedb86-27a0-40a4-a97c-10f1f287fc00-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-kspkb\" (UID: \"f2bedb86-27a0-40a4-a97c-10f1f287fc00\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 
16:41:40.007649 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef184530-a1ee-415c-a683-0588bf7f3ffb-client-ca\") pod \"controller-manager-879f6c89f-84tp9\" (UID: \"ef184530-a1ee-415c-a683-0588bf7f3ffb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-84tp9" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.008224 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-etcd-serving-ca\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.008233 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.008242 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.008850 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.008968 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f2bedb86-27a0-40a4-a97c-10f1f287fc00-encryption-config\") pod \"apiserver-7bbb656c7d-kspkb\" (UID: \"f2bedb86-27a0-40a4-a97c-10f1f287fc00\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.009200 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.009277 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-serving-cert\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.009441 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb114c91-d63c-4b6e-927f-cb68c2dcf04f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pmg4f\" (UID: \"bb114c91-d63c-4b6e-927f-cb68c2dcf04f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pmg4f" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.009584 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f9fa2d8-df01-417a-9170-d1e8288e4111-serving-cert\") pod \"authentication-operator-69f744f599-gbfq6\" (UID: 
\"4f9fa2d8-df01-417a-9170-d1e8288e4111\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gbfq6" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.009580 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f9fa2d8-df01-417a-9170-d1e8288e4111-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-gbfq6\" (UID: \"4f9fa2d8-df01-417a-9170-d1e8288e4111\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gbfq6" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.009674 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb114c91-d63c-4b6e-927f-cb68c2dcf04f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pmg4f\" (UID: \"bb114c91-d63c-4b6e-927f-cb68c2dcf04f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pmg4f" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.010281 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef184530-a1ee-415c-a683-0588bf7f3ffb-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-84tp9\" (UID: \"ef184530-a1ee-415c-a683-0588bf7f3ffb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-84tp9" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.010592 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42099723-6874-4d2d-a1ee-e7fd8db3f66c-serving-cert\") pod \"openshift-config-operator-7777fb866f-8shmz\" (UID: \"42099723-6874-4d2d-a1ee-e7fd8db3f66c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8shmz" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.010993 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/f2bedb86-27a0-40a4-a97c-10f1f287fc00-etcd-client\") pod \"apiserver-7bbb656c7d-kspkb\" (UID: \"f2bedb86-27a0-40a4-a97c-10f1f287fc00\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.011349 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-image-import-ca\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.011593 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a440b991-4ffb-4d2e-aa90-fa5e731d9cff-serving-cert\") pod \"etcd-operator-b45778765-bvkm6\" (UID: \"a440b991-4ffb-4d2e-aa90-fa5e731d9cff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvkm6" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.011871 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-encryption-config\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.012225 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9b7f6a4-5987-4b92-b063-2ddf9ad42074-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-pz8dd\" (UID: \"d9b7f6a4-5987-4b92-b063-2ddf9ad42074\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pz8dd" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.012690 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/f2bedb86-27a0-40a4-a97c-10f1f287fc00-serving-cert\") pod \"apiserver-7bbb656c7d-kspkb\" (UID: \"f2bedb86-27a0-40a4-a97c-10f1f287fc00\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.012774 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5189c318-e4b1-4dd9-9a6d-284425d319cf-console-oauth-config\") pod \"console-f9d7485db-td7k5\" (UID: \"5189c318-e4b1-4dd9-9a6d-284425d319cf\") " pod="openshift-console/console-f9d7485db-td7k5" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.012884 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.013661 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a440b991-4ffb-4d2e-aa90-fa5e731d9cff-etcd-client\") pod \"etcd-operator-b45778765-bvkm6\" (UID: \"a440b991-4ffb-4d2e-aa90-fa5e731d9cff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvkm6" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.013848 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.013869 4918 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-etcd-client\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.014874 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22e4c597-7103-43b5-a54b-4a0cf131a749-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-9fk82\" (UID: \"22e4c597-7103-43b5-a54b-4a0cf131a749\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9fk82" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.016783 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.019902 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/117e7cf2-e68f-423d-a312-f1d63c3b815b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wbl8z\" (UID: \"117e7cf2-e68f-423d-a312-f1d63c3b815b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbl8z" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.026266 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.045319 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 19 16:41:40 crc 
kubenswrapper[4918]: I0319 16:41:40.065620 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.085880 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.094149 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fab39d74-56e2-462a-858d-a255438e06ef-srv-cert\") pod \"catalog-operator-68c6474976-f8mlp\" (UID: \"fab39d74-56e2-462a-858d-a255438e06ef\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f8mlp" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.094206 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ba57cc0-b212-4e46-a3b6-98fba822c17d-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-7zczz\" (UID: \"4ba57cc0-b212-4e46-a3b6-98fba822c17d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7zczz" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.094277 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0a8c78b-296b-4baa-91f2-bb9696fdc134-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-hkhgj\" (UID: \"f0a8c78b-296b-4baa-91f2-bb9696fdc134\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hkhgj" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.094317 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfvww\" (UniqueName: \"kubernetes.io/projected/dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf-kube-api-access-hfvww\") pod \"packageserver-d55dfcdfc-vtw7s\" (UID: \"dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vtw7s" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.094340 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ff332c4-5e3a-4d2d-a694-870559724211-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-7lqkl\" (UID: \"9ff332c4-5e3a-4d2d-a694-870559724211\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7lqkl" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.094369 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7kts\" (UniqueName: \"kubernetes.io/projected/fab39d74-56e2-462a-858d-a255438e06ef-kube-api-access-p7kts\") pod \"catalog-operator-68c6474976-f8mlp\" (UID: \"fab39d74-56e2-462a-858d-a255438e06ef\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f8mlp" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.094404 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8848d8b1-3fa3-4d27-b9d7-803e2a884bfc-serving-cert\") pod \"console-operator-58897d9998-5c6nk\" (UID: \"8848d8b1-3fa3-4d27-b9d7-803e2a884bfc\") " pod="openshift-console-operator/console-operator-58897d9998-5c6nk" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.094474 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlbg6\" (UniqueName: \"kubernetes.io/projected/f0a8c78b-296b-4baa-91f2-bb9696fdc134-kube-api-access-wlbg6\") pod \"openshift-apiserver-operator-796bbdcf4f-hkhgj\" (UID: \"f0a8c78b-296b-4baa-91f2-bb9696fdc134\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hkhgj" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.094499 4918 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fab39d74-56e2-462a-858d-a255438e06ef-profile-collector-cert\") pod \"catalog-operator-68c6474976-f8mlp\" (UID: \"fab39d74-56e2-462a-858d-a255438e06ef\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f8mlp" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.094559 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmr29\" (UniqueName: \"kubernetes.io/projected/8848d8b1-3fa3-4d27-b9d7-803e2a884bfc-kube-api-access-xmr29\") pod \"console-operator-58897d9998-5c6nk\" (UID: \"8848d8b1-3fa3-4d27-b9d7-803e2a884bfc\") " pod="openshift-console-operator/console-operator-58897d9998-5c6nk" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.094585 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhz98\" (UniqueName: \"kubernetes.io/projected/9ff332c4-5e3a-4d2d-a694-870559724211-kube-api-access-vhz98\") pod \"control-plane-machine-set-operator-78cbb6b69f-7lqkl\" (UID: \"9ff332c4-5e3a-4d2d-a694-870559724211\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7lqkl" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.094616 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf-webhook-cert\") pod \"packageserver-d55dfcdfc-vtw7s\" (UID: \"dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vtw7s" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.094642 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8848d8b1-3fa3-4d27-b9d7-803e2a884bfc-config\") pod \"console-operator-58897d9998-5c6nk\" (UID: \"8848d8b1-3fa3-4d27-b9d7-803e2a884bfc\") " 
pod="openshift-console-operator/console-operator-58897d9998-5c6nk" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.094660 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8848d8b1-3fa3-4d27-b9d7-803e2a884bfc-trusted-ca\") pod \"console-operator-58897d9998-5c6nk\" (UID: \"8848d8b1-3fa3-4d27-b9d7-803e2a884bfc\") " pod="openshift-console-operator/console-operator-58897d9998-5c6nk" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.094696 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ba57cc0-b212-4e46-a3b6-98fba822c17d-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-7zczz\" (UID: \"4ba57cc0-b212-4e46-a3b6-98fba822c17d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7zczz" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.094718 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ba57cc0-b212-4e46-a3b6-98fba822c17d-config\") pod \"kube-apiserver-operator-766d6c64bb-7zczz\" (UID: \"4ba57cc0-b212-4e46-a3b6-98fba822c17d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7zczz" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.094749 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf-apiservice-cert\") pod \"packageserver-d55dfcdfc-vtw7s\" (UID: \"dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vtw7s" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.094765 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0a8c78b-296b-4baa-91f2-bb9696fdc134-config\") pod 
\"openshift-apiserver-operator-796bbdcf4f-hkhgj\" (UID: \"f0a8c78b-296b-4baa-91f2-bb9696fdc134\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hkhgj" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.094790 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf-tmpfs\") pod \"packageserver-d55dfcdfc-vtw7s\" (UID: \"dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vtw7s" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.095661 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf-tmpfs\") pod \"packageserver-d55dfcdfc-vtw7s\" (UID: \"dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vtw7s" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.096763 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0a8c78b-296b-4baa-91f2-bb9696fdc134-config\") pod \"openshift-apiserver-operator-796bbdcf4f-hkhgj\" (UID: \"f0a8c78b-296b-4baa-91f2-bb9696fdc134\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hkhgj" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.096779 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8848d8b1-3fa3-4d27-b9d7-803e2a884bfc-config\") pod \"console-operator-58897d9998-5c6nk\" (UID: \"8848d8b1-3fa3-4d27-b9d7-803e2a884bfc\") " pod="openshift-console-operator/console-operator-58897d9998-5c6nk" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.097148 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8848d8b1-3fa3-4d27-b9d7-803e2a884bfc-trusted-ca\") pod \"console-operator-58897d9998-5c6nk\" (UID: \"8848d8b1-3fa3-4d27-b9d7-803e2a884bfc\") " pod="openshift-console-operator/console-operator-58897d9998-5c6nk" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.098049 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0a8c78b-296b-4baa-91f2-bb9696fdc134-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-hkhgj\" (UID: \"f0a8c78b-296b-4baa-91f2-bb9696fdc134\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hkhgj" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.101883 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8848d8b1-3fa3-4d27-b9d7-803e2a884bfc-serving-cert\") pod \"console-operator-58897d9998-5c6nk\" (UID: \"8848d8b1-3fa3-4d27-b9d7-803e2a884bfc\") " pod="openshift-console-operator/console-operator-58897d9998-5c6nk" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.105487 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.125456 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.146293 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.185695 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.206158 4918 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-stats-default" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.226780 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.247621 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.265393 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.285565 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.297743 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.298002 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.298163 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 16:41:40 crc kubenswrapper[4918]: E0319 16:41:40.298337 4918 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 16:41:40 crc kubenswrapper[4918]: E0319 16:41:40.298385 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 16:41:56.298370248 +0000 UTC m=+128.420569496 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 16:41:40 crc kubenswrapper[4918]: E0319 16:41:40.298700 4918 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 16:41:40 crc kubenswrapper[4918]: E0319 16:41:40.298829 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 16:41:56.298799212 +0000 UTC m=+128.420998490 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 16:41:40 crc kubenswrapper[4918]: E0319 16:41:40.298861 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:56.298845993 +0000 UTC m=+128.421045281 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.306231 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.325620 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.345380 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.365337 4918 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.386424 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.399012 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.399118 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 16:41:40 crc kubenswrapper[4918]: E0319 16:41:40.399243 4918 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 16:41:40 crc kubenswrapper[4918]: E0319 16:41:40.399269 4918 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 16:41:40 crc kubenswrapper[4918]: E0319 16:41:40.399283 4918 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] 
Mar 19 16:41:40 crc kubenswrapper[4918]: E0319 16:41:40.399343 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 16:41:56.399326126 +0000 UTC m=+128.521525394 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 16:41:40 crc kubenswrapper[4918]: E0319 16:41:40.399243 4918 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 16:41:40 crc kubenswrapper[4918]: E0319 16:41:40.399375 4918 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 16:41:40 crc kubenswrapper[4918]: E0319 16:41:40.399387 4918 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 16:41:40 crc kubenswrapper[4918]: E0319 16:41:40.399424 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-19 16:41:56.39941585 +0000 UTC m=+128.521615108 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.404721 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.425118 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.446856 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.466336 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.486808 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.506364 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.534704 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.545352 4918 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.566011 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.585743 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.585767 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.585987 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.586665 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcgd2" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.606082 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.625597 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.645098 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.650583 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ba57cc0-b212-4e46-a3b6-98fba822c17d-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-7zczz\" (UID: \"4ba57cc0-b212-4e46-a3b6-98fba822c17d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7zczz" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.665450 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.675899 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ba57cc0-b212-4e46-a3b6-98fba822c17d-config\") pod \"kube-apiserver-operator-766d6c64bb-7zczz\" (UID: \"4ba57cc0-b212-4e46-a3b6-98fba822c17d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7zczz" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.685852 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.706280 4918 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.726093 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.744848 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.767888 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.785729 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.805026 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.825636 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.840627 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ff332c4-5e3a-4d2d-a694-870559724211-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-7lqkl\" (UID: \"9ff332c4-5e3a-4d2d-a694-870559724211\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7lqkl" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.845928 4918 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.865158 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.883827 4918 request.go:700] Waited for 1.009664614s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dpprof-cert&limit=500&resourceVersion=0 Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.886627 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.901860 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fab39d74-56e2-462a-858d-a255438e06ef-profile-collector-cert\") pod \"catalog-operator-68c6474976-f8mlp\" (UID: \"fab39d74-56e2-462a-858d-a255438e06ef\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f8mlp" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.905365 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.926830 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.947129 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.966249 4918 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.979628 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fab39d74-56e2-462a-858d-a255438e06ef-srv-cert\") pod \"catalog-operator-68c6474976-f8mlp\" (UID: \"fab39d74-56e2-462a-858d-a255438e06ef\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f8mlp" Mar 19 16:41:40 crc kubenswrapper[4918]: I0319 16:41:40.986526 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.014487 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.033457 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.046046 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.065950 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.085657 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 19 16:41:41 crc kubenswrapper[4918]: E0319 16:41:41.095695 4918 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 19 16:41:41 crc kubenswrapper[4918]: E0319 16:41:41.095708 4918 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync 
secret cache: timed out waiting for the condition Mar 19 16:41:41 crc kubenswrapper[4918]: E0319 16:41:41.095889 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf-apiservice-cert podName:dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf nodeName:}" failed. No retries permitted until 2026-03-19 16:41:41.595855792 +0000 UTC m=+113.718055080 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf-apiservice-cert") pod "packageserver-d55dfcdfc-vtw7s" (UID: "dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf") : failed to sync secret cache: timed out waiting for the condition Mar 19 16:41:41 crc kubenswrapper[4918]: E0319 16:41:41.096030 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf-webhook-cert podName:dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf nodeName:}" failed. No retries permitted until 2026-03-19 16:41:41.595993036 +0000 UTC m=+113.718192324 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf-webhook-cert") pod "packageserver-d55dfcdfc-vtw7s" (UID: "dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf") : failed to sync secret cache: timed out waiting for the condition Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.104779 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.125646 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.145617 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.166889 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.185908 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.206509 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.226048 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.247683 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.266324 4918 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.286315 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.306257 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.325966 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.345643 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.367591 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.406381 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.425069 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.446626 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.465984 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.485146 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.505581 4918 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.525549 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.545635 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.565544 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.586001 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.586208 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.605374 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.626258 4918 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.627321 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf-apiservice-cert\") pod \"packageserver-d55dfcdfc-vtw7s\" (UID: \"dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vtw7s" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.627908 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf-webhook-cert\") pod \"packageserver-d55dfcdfc-vtw7s\" (UID: \"dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vtw7s" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.634834 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf-webhook-cert\") pod \"packageserver-d55dfcdfc-vtw7s\" (UID: \"dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vtw7s" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.638341 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf-apiservice-cert\") pod \"packageserver-d55dfcdfc-vtw7s\" (UID: \"dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vtw7s" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.646375 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.668326 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.686136 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.705263 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.726193 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 19 16:41:41 
crc kubenswrapper[4918]: I0319 16:41:41.771107 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rj8j\" (UniqueName: \"kubernetes.io/projected/9e8d955e-01e0-4fe0-a713-20f4e83f8cca-kube-api-access-5rj8j\") pod \"apiserver-76f77b778f-mnwrv\" (UID: \"9e8d955e-01e0-4fe0-a713-20f4e83f8cca\") " pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.803316 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnjwz\" (UniqueName: \"kubernetes.io/projected/22e4c597-7103-43b5-a54b-4a0cf131a749-kube-api-access-cnjwz\") pod \"openshift-controller-manager-operator-756b6f6bc6-9fk82\" (UID: \"22e4c597-7103-43b5-a54b-4a0cf131a749\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9fk82" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.818193 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqkbl\" (UniqueName: \"kubernetes.io/projected/4f9fa2d8-df01-417a-9170-d1e8288e4111-kube-api-access-hqkbl\") pod \"authentication-operator-69f744f599-gbfq6\" (UID: \"4f9fa2d8-df01-417a-9170-d1e8288e4111\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gbfq6" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.834017 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxjhs\" (UniqueName: \"kubernetes.io/projected/d9b7f6a4-5987-4b92-b063-2ddf9ad42074-kube-api-access-fxjhs\") pod \"machine-api-operator-5694c8668f-pz8dd\" (UID: \"d9b7f6a4-5987-4b92-b063-2ddf9ad42074\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pz8dd" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.847004 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-gbfq6" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.855491 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml4xf\" (UniqueName: \"kubernetes.io/projected/3d68da4e-0dc1-4835-99f3-a5703db9288e-kube-api-access-ml4xf\") pod \"downloads-7954f5f757-pz4gm\" (UID: \"3d68da4e-0dc1-4835-99f3-a5703db9288e\") " pod="openshift-console/downloads-7954f5f757-pz4gm" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.879643 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvnlv\" (UniqueName: \"kubernetes.io/projected/bb114c91-d63c-4b6e-927f-cb68c2dcf04f-kube-api-access-bvnlv\") pod \"cluster-image-registry-operator-dc59b4c8b-pmg4f\" (UID: \"bb114c91-d63c-4b6e-927f-cb68c2dcf04f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pmg4f" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.897273 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trkkh\" (UniqueName: \"kubernetes.io/projected/42099723-6874-4d2d-a1ee-e7fd8db3f66c-kube-api-access-trkkh\") pod \"openshift-config-operator-7777fb866f-8shmz\" (UID: \"42099723-6874-4d2d-a1ee-e7fd8db3f66c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8shmz" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.904695 4918 request.go:700] Waited for 1.909471747s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/serviceaccounts/console/token Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.917463 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftkc8\" (UniqueName: \"kubernetes.io/projected/26c329d7-2138-4f3b-81cd-4b8c0a595a27-kube-api-access-ftkc8\") pod \"dns-operator-744455d44c-mvf4s\" (UID: 
\"26c329d7-2138-4f3b-81cd-4b8c0a595a27\") " pod="openshift-dns-operator/dns-operator-744455d44c-mvf4s" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.929747 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mvf4s" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.931756 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2fpr\" (UniqueName: \"kubernetes.io/projected/5189c318-e4b1-4dd9-9a6d-284425d319cf-kube-api-access-q2fpr\") pod \"console-f9d7485db-td7k5\" (UID: \"5189c318-e4b1-4dd9-9a6d-284425d319cf\") " pod="openshift-console/console-f9d7485db-td7k5" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.942953 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9fk82" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.954769 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7fj9\" (UniqueName: \"kubernetes.io/projected/ef184530-a1ee-415c-a683-0588bf7f3ffb-kube-api-access-r7fj9\") pod \"controller-manager-879f6c89f-84tp9\" (UID: \"ef184530-a1ee-415c-a683-0588bf7f3ffb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-84tp9" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.960135 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8md7h\" (UniqueName: \"kubernetes.io/projected/117e7cf2-e68f-423d-a312-f1d63c3b815b-kube-api-access-8md7h\") pod \"cluster-samples-operator-665b6dd947-wbl8z\" (UID: \"117e7cf2-e68f-423d-a312-f1d63c3b815b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbl8z" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.971762 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-pz4gm" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.991233 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb114c91-d63c-4b6e-927f-cb68c2dcf04f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pmg4f\" (UID: \"bb114c91-d63c-4b6e-927f-cb68c2dcf04f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pmg4f" Mar 19 16:41:41 crc kubenswrapper[4918]: I0319 16:41:41.997090 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8shmz" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.003968 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-td7k5" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.021023 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.023095 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vbxl\" (UniqueName: \"kubernetes.io/projected/f2bedb86-27a0-40a4-a97c-10f1f287fc00-kube-api-access-8vbxl\") pod \"apiserver-7bbb656c7d-kspkb\" (UID: \"f2bedb86-27a0-40a4-a97c-10f1f287fc00\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.029039 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.035024 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp7zf\" (UniqueName: \"kubernetes.io/projected/a440b991-4ffb-4d2e-aa90-fa5e731d9cff-kube-api-access-hp7zf\") pod \"etcd-operator-b45778765-bvkm6\" (UID: \"a440b991-4ffb-4d2e-aa90-fa5e731d9cff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bvkm6" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.038716 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-pz8dd" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.042450 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bh8d\" (UniqueName: \"kubernetes.io/projected/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-kube-api-access-7bh8d\") pod \"oauth-openshift-558db77b4-h9xcq\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.050960 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbl8z" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.065317 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfvww\" (UniqueName: \"kubernetes.io/projected/dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf-kube-api-access-hfvww\") pod \"packageserver-d55dfcdfc-vtw7s\" (UID: \"dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vtw7s" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.100118 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlbg6\" (UniqueName: \"kubernetes.io/projected/f0a8c78b-296b-4baa-91f2-bb9696fdc134-kube-api-access-wlbg6\") pod \"openshift-apiserver-operator-796bbdcf4f-hkhgj\" (UID: \"f0a8c78b-296b-4baa-91f2-bb9696fdc134\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hkhgj" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.109883 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.111397 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7kts\" (UniqueName: \"kubernetes.io/projected/fab39d74-56e2-462a-858d-a255438e06ef-kube-api-access-p7kts\") pod \"catalog-operator-68c6474976-f8mlp\" (UID: \"fab39d74-56e2-462a-858d-a255438e06ef\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f8mlp" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.126922 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhz98\" (UniqueName: \"kubernetes.io/projected/9ff332c4-5e3a-4d2d-a694-870559724211-kube-api-access-vhz98\") pod \"control-plane-machine-set-operator-78cbb6b69f-7lqkl\" (UID: \"9ff332c4-5e3a-4d2d-a694-870559724211\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7lqkl" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.139567 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ba57cc0-b212-4e46-a3b6-98fba822c17d-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-7zczz\" (UID: \"4ba57cc0-b212-4e46-a3b6-98fba822c17d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7zczz" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.162895 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmr29\" (UniqueName: \"kubernetes.io/projected/8848d8b1-3fa3-4d27-b9d7-803e2a884bfc-kube-api-access-xmr29\") pod \"console-operator-58897d9998-5c6nk\" (UID: \"8848d8b1-3fa3-4d27-b9d7-803e2a884bfc\") " pod="openshift-console-operator/console-operator-58897d9998-5c6nk" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.187567 4918 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.194778 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7zczz" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.197279 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-84tp9" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.205915 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.229338 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.229566 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7lqkl" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.229969 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-bvkm6" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.248025 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f8mlp" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.255868 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.280558 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pmg4f" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.286238 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.294902 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vtw7s" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.311157 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.341180 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr7g2\" (UniqueName: \"kubernetes.io/projected/517f8bf9-4476-4ca5-a28d-09c375b891fd-kube-api-access-zr7g2\") pod \"machine-approver-56656f9798-n26b6\" (UID: \"517f8bf9-4476-4ca5-a28d-09c375b891fd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n26b6" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.341241 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9417c6d-34fd-465b-b780-b88ee938f824-registry-tls\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.341267 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f5b832a4-7fec-4d2f-a400-1e890bb551b4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-k4qc7\" (UID: \"f5b832a4-7fec-4d2f-a400-1e890bb551b4\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-k4qc7" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.341289 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgnqk\" (UniqueName: \"kubernetes.io/projected/f5b832a4-7fec-4d2f-a400-1e890bb551b4-kube-api-access-sgnqk\") pod \"marketplace-operator-79b997595-k4qc7\" (UID: \"f5b832a4-7fec-4d2f-a400-1e890bb551b4\") " pod="openshift-marketplace/marketplace-operator-79b997595-k4qc7" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.341327 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7729m\" (UniqueName: \"kubernetes.io/projected/27c10200-5439-42ac-adfe-f5ab8de8b93f-kube-api-access-7729m\") pod \"olm-operator-6b444d44fb-bsb9j\" (UID: \"27c10200-5439-42ac-adfe-f5ab8de8b93f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bsb9j" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.341370 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8cxp\" (UniqueName: \"kubernetes.io/projected/e623c8c8-da7e-4af5-aae6-95bf127c57d7-kube-api-access-w8cxp\") pod \"kube-storage-version-migrator-operator-b67b599dd-mbzd2\" (UID: \"e623c8c8-da7e-4af5-aae6-95bf127c57d7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbzd2" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.341407 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e9417c6d-34fd-465b-b780-b88ee938f824-installation-pull-secrets\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.341423 4918 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhfk8\" (UniqueName: \"kubernetes.io/projected/9ab8ff58-9592-4b8c-8ca4-9b08b092e785-kube-api-access-hhfk8\") pod \"ingress-canary-fcbfm\" (UID: \"9ab8ff58-9592-4b8c-8ca4-9b08b092e785\") " pod="openshift-ingress-canary/ingress-canary-fcbfm" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.341443 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca6c2a92-4376-4b9b-9c73-c29ee0d09082-metrics-certs\") pod \"router-default-5444994796-4rm5n\" (UID: \"ca6c2a92-4376-4b9b-9c73-c29ee0d09082\") " pod="openshift-ingress/router-default-5444994796-4rm5n" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.341458 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef12d86d-5629-4a7c-948c-9229440d073a-config\") pod \"service-ca-operator-777779d784-92pms\" (UID: \"ef12d86d-5629-4a7c-948c-9229440d073a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-92pms" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.341479 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/517f8bf9-4476-4ca5-a28d-09c375b891fd-machine-approver-tls\") pod \"machine-approver-56656f9798-n26b6\" (UID: \"517f8bf9-4476-4ca5-a28d-09c375b891fd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n26b6" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.341501 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e623c8c8-da7e-4af5-aae6-95bf127c57d7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-mbzd2\" (UID: 
\"e623c8c8-da7e-4af5-aae6-95bf127c57d7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbzd2" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.341523 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3964a146-319a-4095-aec8-9469ea55705d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-q8kcw\" (UID: \"3964a146-319a-4095-aec8-9469ea55705d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-q8kcw" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.341636 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fea5d2be-19d3-4636-b55f-1254a484f36a-config\") pod \"kube-controller-manager-operator-78b949d7b-24fgt\" (UID: \"fea5d2be-19d3-4636-b55f-1254a484f36a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24fgt" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.341654 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z55k4\" (UniqueName: \"kubernetes.io/projected/7de76665-62ca-42ca-94ad-537d7789d1a1-kube-api-access-z55k4\") pod \"route-controller-manager-6576b87f9c-tvdlw\" (UID: \"7de76665-62ca-42ca-94ad-537d7789d1a1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.341675 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ccf81a8c-f372-4162-81a1-a3df96e88bf1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-w2fmz\" (UID: \"ccf81a8c-f372-4162-81a1-a3df96e88bf1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-w2fmz" Mar 19 16:41:42 crc 
kubenswrapper[4918]: I0319 16:41:42.341701 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxrx4\" (UniqueName: \"kubernetes.io/projected/ccf81a8c-f372-4162-81a1-a3df96e88bf1-kube-api-access-mxrx4\") pod \"multus-admission-controller-857f4d67dd-w2fmz\" (UID: \"ccf81a8c-f372-4162-81a1-a3df96e88bf1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-w2fmz" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.341729 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fea5d2be-19d3-4636-b55f-1254a484f36a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-24fgt\" (UID: \"fea5d2be-19d3-4636-b55f-1254a484f36a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24fgt" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.341746 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca6c2a92-4376-4b9b-9c73-c29ee0d09082-service-ca-bundle\") pod \"router-default-5444994796-4rm5n\" (UID: \"ca6c2a92-4376-4b9b-9c73-c29ee0d09082\") " pod="openshift-ingress/router-default-5444994796-4rm5n" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.341765 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e9417c6d-34fd-465b-b780-b88ee938f824-ca-trust-extracted\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.341784 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ef12d86d-5629-4a7c-948c-9229440d073a-serving-cert\") pod \"service-ca-operator-777779d784-92pms\" (UID: \"ef12d86d-5629-4a7c-948c-9229440d073a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-92pms" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.341803 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ca6c2a92-4376-4b9b-9c73-c29ee0d09082-default-certificate\") pod \"router-default-5444994796-4rm5n\" (UID: \"ca6c2a92-4376-4b9b-9c73-c29ee0d09082\") " pod="openshift-ingress/router-default-5444994796-4rm5n" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.341832 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7de76665-62ca-42ca-94ad-537d7789d1a1-config\") pod \"route-controller-manager-6576b87f9c-tvdlw\" (UID: \"7de76665-62ca-42ca-94ad-537d7789d1a1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.341863 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e9417c6d-34fd-465b-b780-b88ee938f824-registry-certificates\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.341885 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhpl9\" (UniqueName: \"kubernetes.io/projected/49540bde-3719-4e5d-acf8-ee877d99f581-kube-api-access-lhpl9\") pod \"machine-config-controller-84d6567774-4ztb9\" (UID: \"49540bde-3719-4e5d-acf8-ee877d99f581\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4ztb9" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.341905 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d02d944-f817-4258-b5e9-70dae96b646d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7hntq\" (UID: \"7d02d944-f817-4258-b5e9-70dae96b646d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7hntq" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.341927 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7c2e54b-e056-4df4-b565-45f068f9f8da-proxy-tls\") pod \"machine-config-operator-74547568cd-9ft8l\" (UID: \"e7c2e54b-e056-4df4-b565-45f068f9f8da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9ft8l" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.341968 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e7c2e54b-e056-4df4-b565-45f068f9f8da-auth-proxy-config\") pod \"machine-config-operator-74547568cd-9ft8l\" (UID: \"e7c2e54b-e056-4df4-b565-45f068f9f8da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9ft8l" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.341987 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8vgs\" (UniqueName: \"kubernetes.io/projected/3c005691-aa02-475b-94fd-f8a16d0f4de5-kube-api-access-l8vgs\") pod \"package-server-manager-789f6589d5-gl4jw\" (UID: \"3c005691-aa02-475b-94fd-f8a16d0f4de5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gl4jw" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342030 4918 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llzxz\" (UniqueName: \"kubernetes.io/projected/e9417c6d-34fd-465b-b780-b88ee938f824-kube-api-access-llzxz\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342047 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0-secret-volume\") pod \"collect-profiles-29565630-wz4vn\" (UID: \"fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565630-wz4vn" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342063 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3964a146-319a-4095-aec8-9469ea55705d-metrics-tls\") pod \"ingress-operator-5b745b69d9-q8kcw\" (UID: \"3964a146-319a-4095-aec8-9469ea55705d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-q8kcw" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342090 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fea5d2be-19d3-4636-b55f-1254a484f36a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-24fgt\" (UID: \"fea5d2be-19d3-4636-b55f-1254a484f36a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24fgt" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342105 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7de76665-62ca-42ca-94ad-537d7789d1a1-serving-cert\") pod 
\"route-controller-manager-6576b87f9c-tvdlw\" (UID: \"7de76665-62ca-42ca-94ad-537d7789d1a1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342121 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/517f8bf9-4476-4ca5-a28d-09c375b891fd-auth-proxy-config\") pod \"machine-approver-56656f9798-n26b6\" (UID: \"517f8bf9-4476-4ca5-a28d-09c375b891fd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n26b6" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342149 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/517f8bf9-4476-4ca5-a28d-09c375b891fd-config\") pod \"machine-approver-56656f9798-n26b6\" (UID: \"517f8bf9-4476-4ca5-a28d-09c375b891fd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n26b6" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342168 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/27c10200-5439-42ac-adfe-f5ab8de8b93f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-bsb9j\" (UID: \"27c10200-5439-42ac-adfe-f5ab8de8b93f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bsb9j" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342188 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c005691-aa02-475b-94fd-f8a16d0f4de5-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-gl4jw\" (UID: \"3c005691-aa02-475b-94fd-f8a16d0f4de5\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gl4jw" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342217 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5b832a4-7fec-4d2f-a400-1e890bb551b4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-k4qc7\" (UID: \"f5b832a4-7fec-4d2f-a400-1e890bb551b4\") " pod="openshift-marketplace/marketplace-operator-79b997595-k4qc7" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342233 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46cxb\" (UniqueName: \"kubernetes.io/projected/3964a146-319a-4095-aec8-9469ea55705d-kube-api-access-46cxb\") pod \"ingress-operator-5b745b69d9-q8kcw\" (UID: \"3964a146-319a-4095-aec8-9469ea55705d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-q8kcw" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342262 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d02d944-f817-4258-b5e9-70dae96b646d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7hntq\" (UID: \"7d02d944-f817-4258-b5e9-70dae96b646d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7hntq" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342279 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9417c6d-34fd-465b-b780-b88ee938f824-bound-sa-token\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342295 4918 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3964a146-319a-4095-aec8-9469ea55705d-trusted-ca\") pod \"ingress-operator-5b745b69d9-q8kcw\" (UID: \"3964a146-319a-4095-aec8-9469ea55705d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-q8kcw" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342313 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d02d944-f817-4258-b5e9-70dae96b646d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7hntq\" (UID: \"7d02d944-f817-4258-b5e9-70dae96b646d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7hntq" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342365 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxr2j\" (UniqueName: \"kubernetes.io/projected/7a6ce6f1-2aa5-44d7-b3f2-3b062ee500c0-kube-api-access-vxr2j\") pod \"migrator-59844c95c7-wvklt\" (UID: \"7a6ce6f1-2aa5-44d7-b3f2-3b062ee500c0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wvklt" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342386 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvsc9\" (UniqueName: \"kubernetes.io/projected/ef12d86d-5629-4a7c-948c-9229440d073a-kube-api-access-mvsc9\") pod \"service-ca-operator-777779d784-92pms\" (UID: \"ef12d86d-5629-4a7c-948c-9229440d073a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-92pms" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342402 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ca6c2a92-4376-4b9b-9c73-c29ee0d09082-stats-auth\") pod 
\"router-default-5444994796-4rm5n\" (UID: \"ca6c2a92-4376-4b9b-9c73-c29ee0d09082\") " pod="openshift-ingress/router-default-5444994796-4rm5n" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342419 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7de76665-62ca-42ca-94ad-537d7789d1a1-client-ca\") pod \"route-controller-manager-6576b87f9c-tvdlw\" (UID: \"7de76665-62ca-42ca-94ad-537d7789d1a1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342542 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/49540bde-3719-4e5d-acf8-ee877d99f581-proxy-tls\") pod \"machine-config-controller-84d6567774-4ztb9\" (UID: \"49540bde-3719-4e5d-acf8-ee877d99f581\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4ztb9" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342563 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e623c8c8-da7e-4af5-aae6-95bf127c57d7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-mbzd2\" (UID: \"e623c8c8-da7e-4af5-aae6-95bf127c57d7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbzd2" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342592 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:42 
crc kubenswrapper[4918]: I0319 16:41:42.342610 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9417c6d-34fd-465b-b780-b88ee938f824-trusted-ca\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342629 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e7c2e54b-e056-4df4-b565-45f068f9f8da-images\") pod \"machine-config-operator-74547568cd-9ft8l\" (UID: \"e7c2e54b-e056-4df4-b565-45f068f9f8da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9ft8l" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342647 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlv2z\" (UniqueName: \"kubernetes.io/projected/ca6c2a92-4376-4b9b-9c73-c29ee0d09082-kube-api-access-hlv2z\") pod \"router-default-5444994796-4rm5n\" (UID: \"ca6c2a92-4376-4b9b-9c73-c29ee0d09082\") " pod="openshift-ingress/router-default-5444994796-4rm5n" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342663 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/27c10200-5439-42ac-adfe-f5ab8de8b93f-srv-cert\") pod \"olm-operator-6b444d44fb-bsb9j\" (UID: \"27c10200-5439-42ac-adfe-f5ab8de8b93f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bsb9j" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342710 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89j9d\" (UniqueName: \"kubernetes.io/projected/fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0-kube-api-access-89j9d\") pod 
\"collect-profiles-29565630-wz4vn\" (UID: \"fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565630-wz4vn" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342746 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/49540bde-3719-4e5d-acf8-ee877d99f581-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4ztb9\" (UID: \"49540bde-3719-4e5d-acf8-ee877d99f581\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4ztb9" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342764 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0-config-volume\") pod \"collect-profiles-29565630-wz4vn\" (UID: \"fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565630-wz4vn" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342782 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gqb2\" (UniqueName: \"kubernetes.io/projected/e7c2e54b-e056-4df4-b565-45f068f9f8da-kube-api-access-6gqb2\") pod \"machine-config-operator-74547568cd-9ft8l\" (UID: \"e7c2e54b-e056-4df4-b565-45f068f9f8da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9ft8l" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.342800 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ab8ff58-9592-4b8c-8ca4-9b08b092e785-cert\") pod \"ingress-canary-fcbfm\" (UID: \"9ab8ff58-9592-4b8c-8ca4-9b08b092e785\") " pod="openshift-ingress-canary/ingress-canary-fcbfm" Mar 19 16:41:42 crc kubenswrapper[4918]: E0319 16:41:42.345773 4918 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:42.845742797 +0000 UTC m=+114.967942035 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.353808 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-5c6nk" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.364889 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hkhgj" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.372489 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mvf4s"] Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.387644 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-pz4gm"] Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.447337 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:42 crc kubenswrapper[4918]: E0319 16:41:42.447615 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:42.947507258 +0000 UTC m=+115.069706506 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.447835 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/27c10200-5439-42ac-adfe-f5ab8de8b93f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-bsb9j\" (UID: \"27c10200-5439-42ac-adfe-f5ab8de8b93f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bsb9j" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.447935 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5b832a4-7fec-4d2f-a400-1e890bb551b4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-k4qc7\" (UID: \"f5b832a4-7fec-4d2f-a400-1e890bb551b4\") " pod="openshift-marketplace/marketplace-operator-79b997595-k4qc7" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.447994 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46cxb\" (UniqueName: \"kubernetes.io/projected/3964a146-319a-4095-aec8-9469ea55705d-kube-api-access-46cxb\") pod \"ingress-operator-5b745b69d9-q8kcw\" (UID: \"3964a146-319a-4095-aec8-9469ea55705d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-q8kcw" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.448028 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3c005691-aa02-475b-94fd-f8a16d0f4de5-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-gl4jw\" (UID: \"3c005691-aa02-475b-94fd-f8a16d0f4de5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gl4jw" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.448087 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6756a1df-386b-4ee8-954b-bb4ae3829e58-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-cvm4l\" (UID: \"6756a1df-386b-4ee8-954b-bb4ae3829e58\") " pod="openshift-multus/cni-sysctl-allowlist-ds-cvm4l" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.448143 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6ae59c41-bf25-4189-8a98-5e920f67a6ad-signing-key\") pod \"service-ca-9c57cc56f-clcn2\" (UID: \"6ae59c41-bf25-4189-8a98-5e920f67a6ad\") " pod="openshift-service-ca/service-ca-9c57cc56f-clcn2" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.448191 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d02d944-f817-4258-b5e9-70dae96b646d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7hntq\" (UID: \"7d02d944-f817-4258-b5e9-70dae96b646d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7hntq" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.448224 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9417c6d-34fd-465b-b780-b88ee938f824-bound-sa-token\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:42 
crc kubenswrapper[4918]: I0319 16:41:42.448259 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3964a146-319a-4095-aec8-9469ea55705d-trusted-ca\") pod \"ingress-operator-5b745b69d9-q8kcw\" (UID: \"3964a146-319a-4095-aec8-9469ea55705d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-q8kcw" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.448290 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d02d944-f817-4258-b5e9-70dae96b646d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7hntq\" (UID: \"7d02d944-f817-4258-b5e9-70dae96b646d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7hntq" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.448327 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf8kz\" (UniqueName: \"kubernetes.io/projected/6ae59c41-bf25-4189-8a98-5e920f67a6ad-kube-api-access-xf8kz\") pod \"service-ca-9c57cc56f-clcn2\" (UID: \"6ae59c41-bf25-4189-8a98-5e920f67a6ad\") " pod="openshift-service-ca/service-ca-9c57cc56f-clcn2" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.448384 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxr2j\" (UniqueName: \"kubernetes.io/projected/7a6ce6f1-2aa5-44d7-b3f2-3b062ee500c0-kube-api-access-vxr2j\") pod \"migrator-59844c95c7-wvklt\" (UID: \"7a6ce6f1-2aa5-44d7-b3f2-3b062ee500c0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wvklt" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.448430 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvsc9\" (UniqueName: \"kubernetes.io/projected/ef12d86d-5629-4a7c-948c-9229440d073a-kube-api-access-mvsc9\") pod 
\"service-ca-operator-777779d784-92pms\" (UID: \"ef12d86d-5629-4a7c-948c-9229440d073a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-92pms" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.448463 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ca6c2a92-4376-4b9b-9c73-c29ee0d09082-stats-auth\") pod \"router-default-5444994796-4rm5n\" (UID: \"ca6c2a92-4376-4b9b-9c73-c29ee0d09082\") " pod="openshift-ingress/router-default-5444994796-4rm5n" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.448513 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7de76665-62ca-42ca-94ad-537d7789d1a1-client-ca\") pod \"route-controller-manager-6576b87f9c-tvdlw\" (UID: \"7de76665-62ca-42ca-94ad-537d7789d1a1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.448656 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/49540bde-3719-4e5d-acf8-ee877d99f581-proxy-tls\") pod \"machine-config-controller-84d6567774-4ztb9\" (UID: \"49540bde-3719-4e5d-acf8-ee877d99f581\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4ztb9" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.448699 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e623c8c8-da7e-4af5-aae6-95bf127c57d7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-mbzd2\" (UID: \"e623c8c8-da7e-4af5-aae6-95bf127c57d7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbzd2" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.448734 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9417c6d-34fd-465b-b780-b88ee938f824-trusted-ca\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.448765 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e7c2e54b-e056-4df4-b565-45f068f9f8da-images\") pod \"machine-config-operator-74547568cd-9ft8l\" (UID: \"e7c2e54b-e056-4df4-b565-45f068f9f8da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9ft8l" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.448805 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.448840 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlv2z\" (UniqueName: \"kubernetes.io/projected/ca6c2a92-4376-4b9b-9c73-c29ee0d09082-kube-api-access-hlv2z\") pod \"router-default-5444994796-4rm5n\" (UID: \"ca6c2a92-4376-4b9b-9c73-c29ee0d09082\") " pod="openshift-ingress/router-default-5444994796-4rm5n" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.448874 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/27c10200-5439-42ac-adfe-f5ab8de8b93f-srv-cert\") pod \"olm-operator-6b444d44fb-bsb9j\" (UID: \"27c10200-5439-42ac-adfe-f5ab8de8b93f\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bsb9j" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.448912 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pddg\" (UniqueName: \"kubernetes.io/projected/6756a1df-386b-4ee8-954b-bb4ae3829e58-kube-api-access-8pddg\") pod \"cni-sysctl-allowlist-ds-cvm4l\" (UID: \"6756a1df-386b-4ee8-954b-bb4ae3829e58\") " pod="openshift-multus/cni-sysctl-allowlist-ds-cvm4l" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.448945 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3c343416-13c0-4513-a217-c885a3490f13-certs\") pod \"machine-config-server-nfhcd\" (UID: \"3c343416-13c0-4513-a217-c885a3490f13\") " pod="openshift-machine-config-operator/machine-config-server-nfhcd" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.449006 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89j9d\" (UniqueName: \"kubernetes.io/projected/fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0-kube-api-access-89j9d\") pod \"collect-profiles-29565630-wz4vn\" (UID: \"fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565630-wz4vn" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.449079 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/49540bde-3719-4e5d-acf8-ee877d99f581-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4ztb9\" (UID: \"49540bde-3719-4e5d-acf8-ee877d99f581\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4ztb9" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.449122 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0-config-volume\") pod \"collect-profiles-29565630-wz4vn\" (UID: \"fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565630-wz4vn" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.449189 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gqb2\" (UniqueName: \"kubernetes.io/projected/e7c2e54b-e056-4df4-b565-45f068f9f8da-kube-api-access-6gqb2\") pod \"machine-config-operator-74547568cd-9ft8l\" (UID: \"e7c2e54b-e056-4df4-b565-45f068f9f8da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9ft8l" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.449232 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/141bb3c3-ea35-4df5-8951-2c6423d792cc-metrics-tls\") pod \"dns-default-jb6n7\" (UID: \"141bb3c3-ea35-4df5-8951-2c6423d792cc\") " pod="openshift-dns/dns-default-jb6n7" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.449272 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ab8ff58-9592-4b8c-8ca4-9b08b092e785-cert\") pod \"ingress-canary-fcbfm\" (UID: \"9ab8ff58-9592-4b8c-8ca4-9b08b092e785\") " pod="openshift-ingress-canary/ingress-canary-fcbfm" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.449305 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f-mountpoint-dir\") pod \"csi-hostpathplugin-jl5dc\" (UID: \"3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f\") " pod="hostpath-provisioner/csi-hostpathplugin-jl5dc" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.449341 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zr7g2\" (UniqueName: \"kubernetes.io/projected/517f8bf9-4476-4ca5-a28d-09c375b891fd-kube-api-access-zr7g2\") pod \"machine-approver-56656f9798-n26b6\" (UID: \"517f8bf9-4476-4ca5-a28d-09c375b891fd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n26b6" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.449374 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v266d\" (UniqueName: \"kubernetes.io/projected/3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f-kube-api-access-v266d\") pod \"csi-hostpathplugin-jl5dc\" (UID: \"3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f\") " pod="hostpath-provisioner/csi-hostpathplugin-jl5dc" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.449424 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9417c6d-34fd-465b-b780-b88ee938f824-registry-tls\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.449478 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f5b832a4-7fec-4d2f-a400-1e890bb551b4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-k4qc7\" (UID: \"f5b832a4-7fec-4d2f-a400-1e890bb551b4\") " pod="openshift-marketplace/marketplace-operator-79b997595-k4qc7" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.449797 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7729m\" (UniqueName: \"kubernetes.io/projected/27c10200-5439-42ac-adfe-f5ab8de8b93f-kube-api-access-7729m\") pod \"olm-operator-6b444d44fb-bsb9j\" (UID: \"27c10200-5439-42ac-adfe-f5ab8de8b93f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bsb9j" 
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.449827 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgnqk\" (UniqueName: \"kubernetes.io/projected/f5b832a4-7fec-4d2f-a400-1e890bb551b4-kube-api-access-sgnqk\") pod \"marketplace-operator-79b997595-k4qc7\" (UID: \"f5b832a4-7fec-4d2f-a400-1e890bb551b4\") " pod="openshift-marketplace/marketplace-operator-79b997595-k4qc7" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.449870 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8cxp\" (UniqueName: \"kubernetes.io/projected/e623c8c8-da7e-4af5-aae6-95bf127c57d7-kube-api-access-w8cxp\") pod \"kube-storage-version-migrator-operator-b67b599dd-mbzd2\" (UID: \"e623c8c8-da7e-4af5-aae6-95bf127c57d7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbzd2" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.449888 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8xgh\" (UniqueName: \"kubernetes.io/projected/141bb3c3-ea35-4df5-8951-2c6423d792cc-kube-api-access-g8xgh\") pod \"dns-default-jb6n7\" (UID: \"141bb3c3-ea35-4df5-8951-2c6423d792cc\") " pod="openshift-dns/dns-default-jb6n7" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.449910 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e9417c6d-34fd-465b-b780-b88ee938f824-installation-pull-secrets\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.449928 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhfk8\" (UniqueName: 
\"kubernetes.io/projected/9ab8ff58-9592-4b8c-8ca4-9b08b092e785-kube-api-access-hhfk8\") pod \"ingress-canary-fcbfm\" (UID: \"9ab8ff58-9592-4b8c-8ca4-9b08b092e785\") " pod="openshift-ingress-canary/ingress-canary-fcbfm" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.449955 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca6c2a92-4376-4b9b-9c73-c29ee0d09082-metrics-certs\") pod \"router-default-5444994796-4rm5n\" (UID: \"ca6c2a92-4376-4b9b-9c73-c29ee0d09082\") " pod="openshift-ingress/router-default-5444994796-4rm5n" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.449973 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef12d86d-5629-4a7c-948c-9229440d073a-config\") pod \"service-ca-operator-777779d784-92pms\" (UID: \"ef12d86d-5629-4a7c-948c-9229440d073a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-92pms" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.449991 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/517f8bf9-4476-4ca5-a28d-09c375b891fd-machine-approver-tls\") pod \"machine-approver-56656f9798-n26b6\" (UID: \"517f8bf9-4476-4ca5-a28d-09c375b891fd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n26b6" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450008 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3964a146-319a-4095-aec8-9469ea55705d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-q8kcw\" (UID: \"3964a146-319a-4095-aec8-9469ea55705d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-q8kcw" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450025 4918 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e623c8c8-da7e-4af5-aae6-95bf127c57d7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-mbzd2\" (UID: \"e623c8c8-da7e-4af5-aae6-95bf127c57d7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbzd2" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450043 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fea5d2be-19d3-4636-b55f-1254a484f36a-config\") pod \"kube-controller-manager-operator-78b949d7b-24fgt\" (UID: \"fea5d2be-19d3-4636-b55f-1254a484f36a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24fgt" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450069 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z55k4\" (UniqueName: \"kubernetes.io/projected/7de76665-62ca-42ca-94ad-537d7789d1a1-kube-api-access-z55k4\") pod \"route-controller-manager-6576b87f9c-tvdlw\" (UID: \"7de76665-62ca-42ca-94ad-537d7789d1a1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450108 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ccf81a8c-f372-4162-81a1-a3df96e88bf1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-w2fmz\" (UID: \"ccf81a8c-f372-4162-81a1-a3df96e88bf1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-w2fmz" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450125 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6756a1df-386b-4ee8-954b-bb4ae3829e58-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-cvm4l\" (UID: 
\"6756a1df-386b-4ee8-954b-bb4ae3829e58\") " pod="openshift-multus/cni-sysctl-allowlist-ds-cvm4l" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450144 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxrx4\" (UniqueName: \"kubernetes.io/projected/ccf81a8c-f372-4162-81a1-a3df96e88bf1-kube-api-access-mxrx4\") pod \"multus-admission-controller-857f4d67dd-w2fmz\" (UID: \"ccf81a8c-f372-4162-81a1-a3df96e88bf1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-w2fmz" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450162 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f-registration-dir\") pod \"csi-hostpathplugin-jl5dc\" (UID: \"3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f\") " pod="hostpath-provisioner/csi-hostpathplugin-jl5dc" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450187 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f-csi-data-dir\") pod \"csi-hostpathplugin-jl5dc\" (UID: \"3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f\") " pod="hostpath-provisioner/csi-hostpathplugin-jl5dc" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450206 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fea5d2be-19d3-4636-b55f-1254a484f36a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-24fgt\" (UID: \"fea5d2be-19d3-4636-b55f-1254a484f36a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24fgt" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450223 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/ca6c2a92-4376-4b9b-9c73-c29ee0d09082-service-ca-bundle\") pod \"router-default-5444994796-4rm5n\" (UID: \"ca6c2a92-4376-4b9b-9c73-c29ee0d09082\") " pod="openshift-ingress/router-default-5444994796-4rm5n" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450240 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e9417c6d-34fd-465b-b780-b88ee938f824-ca-trust-extracted\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450257 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f-socket-dir\") pod \"csi-hostpathplugin-jl5dc\" (UID: \"3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f\") " pod="hostpath-provisioner/csi-hostpathplugin-jl5dc" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450284 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef12d86d-5629-4a7c-948c-9229440d073a-serving-cert\") pod \"service-ca-operator-777779d784-92pms\" (UID: \"ef12d86d-5629-4a7c-948c-9229440d073a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-92pms" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450301 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f-plugins-dir\") pod \"csi-hostpathplugin-jl5dc\" (UID: \"3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f\") " pod="hostpath-provisioner/csi-hostpathplugin-jl5dc" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450318 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ca6c2a92-4376-4b9b-9c73-c29ee0d09082-default-certificate\") pod \"router-default-5444994796-4rm5n\" (UID: \"ca6c2a92-4376-4b9b-9c73-c29ee0d09082\") " pod="openshift-ingress/router-default-5444994796-4rm5n" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450334 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7de76665-62ca-42ca-94ad-537d7789d1a1-config\") pod \"route-controller-manager-6576b87f9c-tvdlw\" (UID: \"7de76665-62ca-42ca-94ad-537d7789d1a1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450337 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5b832a4-7fec-4d2f-a400-1e890bb551b4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-k4qc7\" (UID: \"f5b832a4-7fec-4d2f-a400-1e890bb551b4\") " pod="openshift-marketplace/marketplace-operator-79b997595-k4qc7" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450350 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3c343416-13c0-4513-a217-c885a3490f13-node-bootstrap-token\") pod \"machine-config-server-nfhcd\" (UID: \"3c343416-13c0-4513-a217-c885a3490f13\") " pod="openshift-machine-config-operator/machine-config-server-nfhcd" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450424 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e9417c6d-34fd-465b-b780-b88ee938f824-registry-certificates\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450445 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhpl9\" (UniqueName: \"kubernetes.io/projected/49540bde-3719-4e5d-acf8-ee877d99f581-kube-api-access-lhpl9\") pod \"machine-config-controller-84d6567774-4ztb9\" (UID: \"49540bde-3719-4e5d-acf8-ee877d99f581\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4ztb9" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450463 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d02d944-f817-4258-b5e9-70dae96b646d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7hntq\" (UID: \"7d02d944-f817-4258-b5e9-70dae96b646d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7hntq" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450489 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjj2g\" (UniqueName: \"kubernetes.io/projected/3c343416-13c0-4513-a217-c885a3490f13-kube-api-access-cjj2g\") pod \"machine-config-server-nfhcd\" (UID: \"3c343416-13c0-4513-a217-c885a3490f13\") " pod="openshift-machine-config-operator/machine-config-server-nfhcd" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450540 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7c2e54b-e056-4df4-b565-45f068f9f8da-proxy-tls\") pod \"machine-config-operator-74547568cd-9ft8l\" (UID: \"e7c2e54b-e056-4df4-b565-45f068f9f8da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9ft8l" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450558 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6ae59c41-bf25-4189-8a98-5e920f67a6ad-signing-cabundle\") pod \"service-ca-9c57cc56f-clcn2\" (UID: \"6ae59c41-bf25-4189-8a98-5e920f67a6ad\") " pod="openshift-service-ca/service-ca-9c57cc56f-clcn2" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450579 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e7c2e54b-e056-4df4-b565-45f068f9f8da-auth-proxy-config\") pod \"machine-config-operator-74547568cd-9ft8l\" (UID: \"e7c2e54b-e056-4df4-b565-45f068f9f8da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9ft8l" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450617 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8vgs\" (UniqueName: \"kubernetes.io/projected/3c005691-aa02-475b-94fd-f8a16d0f4de5-kube-api-access-l8vgs\") pod \"package-server-manager-789f6589d5-gl4jw\" (UID: \"3c005691-aa02-475b-94fd-f8a16d0f4de5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gl4jw" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450663 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0-secret-volume\") pod \"collect-profiles-29565630-wz4vn\" (UID: \"fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565630-wz4vn" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450679 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3964a146-319a-4095-aec8-9469ea55705d-metrics-tls\") pod \"ingress-operator-5b745b69d9-q8kcw\" (UID: \"3964a146-319a-4095-aec8-9469ea55705d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-q8kcw" Mar 19 16:41:42 crc 
kubenswrapper[4918]: I0319 16:41:42.450700 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llzxz\" (UniqueName: \"kubernetes.io/projected/e9417c6d-34fd-465b-b780-b88ee938f824-kube-api-access-llzxz\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450730 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/6756a1df-386b-4ee8-954b-bb4ae3829e58-ready\") pod \"cni-sysctl-allowlist-ds-cvm4l\" (UID: \"6756a1df-386b-4ee8-954b-bb4ae3829e58\") " pod="openshift-multus/cni-sysctl-allowlist-ds-cvm4l" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450769 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7de76665-62ca-42ca-94ad-537d7789d1a1-serving-cert\") pod \"route-controller-manager-6576b87f9c-tvdlw\" (UID: \"7de76665-62ca-42ca-94ad-537d7789d1a1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450789 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/517f8bf9-4476-4ca5-a28d-09c375b891fd-auth-proxy-config\") pod \"machine-approver-56656f9798-n26b6\" (UID: \"517f8bf9-4476-4ca5-a28d-09c375b891fd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n26b6" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450813 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fea5d2be-19d3-4636-b55f-1254a484f36a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-24fgt\" (UID: 
\"fea5d2be-19d3-4636-b55f-1254a484f36a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24fgt" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450834 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/141bb3c3-ea35-4df5-8951-2c6423d792cc-config-volume\") pod \"dns-default-jb6n7\" (UID: \"141bb3c3-ea35-4df5-8951-2c6423d792cc\") " pod="openshift-dns/dns-default-jb6n7" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.450866 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/517f8bf9-4476-4ca5-a28d-09c375b891fd-config\") pod \"machine-approver-56656f9798-n26b6\" (UID: \"517f8bf9-4476-4ca5-a28d-09c375b891fd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n26b6" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.451330 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/517f8bf9-4476-4ca5-a28d-09c375b891fd-config\") pod \"machine-approver-56656f9798-n26b6\" (UID: \"517f8bf9-4476-4ca5-a28d-09c375b891fd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n26b6" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.452798 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d02d944-f817-4258-b5e9-70dae96b646d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7hntq\" (UID: \"7d02d944-f817-4258-b5e9-70dae96b646d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7hntq" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.452879 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/27c10200-5439-42ac-adfe-f5ab8de8b93f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-bsb9j\" (UID: \"27c10200-5439-42ac-adfe-f5ab8de8b93f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bsb9j" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.453351 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e7c2e54b-e056-4df4-b565-45f068f9f8da-auth-proxy-config\") pod \"machine-config-operator-74547568cd-9ft8l\" (UID: \"e7c2e54b-e056-4df4-b565-45f068f9f8da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9ft8l" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.454220 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e9417c6d-34fd-465b-b780-b88ee938f824-registry-certificates\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.454593 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/49540bde-3719-4e5d-acf8-ee877d99f581-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4ztb9\" (UID: \"49540bde-3719-4e5d-acf8-ee877d99f581\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4ztb9" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.454781 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7de76665-62ca-42ca-94ad-537d7789d1a1-client-ca\") pod \"route-controller-manager-6576b87f9c-tvdlw\" (UID: \"7de76665-62ca-42ca-94ad-537d7789d1a1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw" Mar 19 16:41:42 crc 
kubenswrapper[4918]: E0319 16:41:42.455549 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:42.955518395 +0000 UTC m=+115.077717643 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.456158 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0-config-volume\") pod \"collect-profiles-29565630-wz4vn\" (UID: \"fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565630-wz4vn"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.467487 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/517f8bf9-4476-4ca5-a28d-09c375b891fd-machine-approver-tls\") pod \"machine-approver-56656f9798-n26b6\" (UID: \"517f8bf9-4476-4ca5-a28d-09c375b891fd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n26b6"
Mar 19 16:41:42 crc kubenswrapper[4918]: W0319 16:41:42.467615 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d68da4e_0dc1_4835_99f3_a5703db9288e.slice/crio-574664858c4a0bdaed0918a9096d35623318edc00e84394097e64e036bf075a2 WatchSource:0}: Error finding container 574664858c4a0bdaed0918a9096d35623318edc00e84394097e64e036bf075a2: Status 404 returned error can't find the container with id 574664858c4a0bdaed0918a9096d35623318edc00e84394097e64e036bf075a2
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.468600 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca6c2a92-4376-4b9b-9c73-c29ee0d09082-service-ca-bundle\") pod \"router-default-5444994796-4rm5n\" (UID: \"ca6c2a92-4376-4b9b-9c73-c29ee0d09082\") " pod="openshift-ingress/router-default-5444994796-4rm5n"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.468829 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3964a146-319a-4095-aec8-9469ea55705d-trusted-ca\") pod \"ingress-operator-5b745b69d9-q8kcw\" (UID: \"3964a146-319a-4095-aec8-9469ea55705d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-q8kcw"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.469031 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e9417c6d-34fd-465b-b780-b88ee938f824-ca-trust-extracted\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.469147 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7c2e54b-e056-4df4-b565-45f068f9f8da-proxy-tls\") pod \"machine-config-operator-74547568cd-9ft8l\" (UID: \"e7c2e54b-e056-4df4-b565-45f068f9f8da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9ft8l"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.469350 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e623c8c8-da7e-4af5-aae6-95bf127c57d7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-mbzd2\" (UID: \"e623c8c8-da7e-4af5-aae6-95bf127c57d7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbzd2"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.469405 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/517f8bf9-4476-4ca5-a28d-09c375b891fd-auth-proxy-config\") pod \"machine-approver-56656f9798-n26b6\" (UID: \"517f8bf9-4476-4ca5-a28d-09c375b891fd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n26b6"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.469915 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fea5d2be-19d3-4636-b55f-1254a484f36a-config\") pod \"kube-controller-manager-operator-78b949d7b-24fgt\" (UID: \"fea5d2be-19d3-4636-b55f-1254a484f36a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24fgt"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.470826 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ccf81a8c-f372-4162-81a1-a3df96e88bf1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-w2fmz\" (UID: \"ccf81a8c-f372-4162-81a1-a3df96e88bf1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-w2fmz"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.472794 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9417c6d-34fd-465b-b780-b88ee938f824-trusted-ca\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.473418 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7de76665-62ca-42ca-94ad-537d7789d1a1-serving-cert\") pod \"route-controller-manager-6576b87f9c-tvdlw\" (UID: \"7de76665-62ca-42ca-94ad-537d7789d1a1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.475277 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7de76665-62ca-42ca-94ad-537d7789d1a1-config\") pod \"route-controller-manager-6576b87f9c-tvdlw\" (UID: \"7de76665-62ca-42ca-94ad-537d7789d1a1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.475695 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef12d86d-5629-4a7c-948c-9229440d073a-config\") pod \"service-ca-operator-777779d784-92pms\" (UID: \"ef12d86d-5629-4a7c-948c-9229440d073a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-92pms"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.476733 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ca6c2a92-4376-4b9b-9c73-c29ee0d09082-stats-auth\") pod \"router-default-5444994796-4rm5n\" (UID: \"ca6c2a92-4376-4b9b-9c73-c29ee0d09082\") " pod="openshift-ingress/router-default-5444994796-4rm5n"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.476860 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c005691-aa02-475b-94fd-f8a16d0f4de5-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-gl4jw\" (UID: \"3c005691-aa02-475b-94fd-f8a16d0f4de5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gl4jw"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.477265 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e7c2e54b-e056-4df4-b565-45f068f9f8da-images\") pod \"machine-config-operator-74547568cd-9ft8l\" (UID: \"e7c2e54b-e056-4df4-b565-45f068f9f8da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9ft8l"
Mar 19 16:41:42 crc kubenswrapper[4918]: W0319 16:41:42.481035 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26c329d7_2138_4f3b_81cd_4b8c0a595a27.slice/crio-cfc02d476f99b2212cc6dd19492cd3bfbc5fa452fa824d711de59435eb021b2e WatchSource:0}: Error finding container cfc02d476f99b2212cc6dd19492cd3bfbc5fa452fa824d711de59435eb021b2e: Status 404 returned error can't find the container with id cfc02d476f99b2212cc6dd19492cd3bfbc5fa452fa824d711de59435eb021b2e
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.482845 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0-secret-volume\") pod \"collect-profiles-29565630-wz4vn\" (UID: \"fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565630-wz4vn"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.484880 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ca6c2a92-4376-4b9b-9c73-c29ee0d09082-default-certificate\") pod \"router-default-5444994796-4rm5n\" (UID: \"ca6c2a92-4376-4b9b-9c73-c29ee0d09082\") " pod="openshift-ingress/router-default-5444994796-4rm5n"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.498363 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8shmz"]
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.498401 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fea5d2be-19d3-4636-b55f-1254a484f36a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-24fgt\" (UID: \"fea5d2be-19d3-4636-b55f-1254a484f36a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24fgt"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.500194 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9417c6d-34fd-465b-b780-b88ee938f824-registry-tls\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.498966 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/49540bde-3719-4e5d-acf8-ee877d99f581-proxy-tls\") pod \"machine-config-controller-84d6567774-4ztb9\" (UID: \"49540bde-3719-4e5d-acf8-ee877d99f581\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4ztb9"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.499340 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d02d944-f817-4258-b5e9-70dae96b646d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7hntq\" (UID: \"7d02d944-f817-4258-b5e9-70dae96b646d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7hntq"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.499584 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e9417c6d-34fd-465b-b780-b88ee938f824-installation-pull-secrets\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.499134 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca6c2a92-4376-4b9b-9c73-c29ee0d09082-metrics-certs\") pod \"router-default-5444994796-4rm5n\" (UID: \"ca6c2a92-4376-4b9b-9c73-c29ee0d09082\") " pod="openshift-ingress/router-default-5444994796-4rm5n"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.499766 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e623c8c8-da7e-4af5-aae6-95bf127c57d7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-mbzd2\" (UID: \"e623c8c8-da7e-4af5-aae6-95bf127c57d7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbzd2"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.500124 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ab8ff58-9592-4b8c-8ca4-9b08b092e785-cert\") pod \"ingress-canary-fcbfm\" (UID: \"9ab8ff58-9592-4b8c-8ca4-9b08b092e785\") " pod="openshift-ingress-canary/ingress-canary-fcbfm"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.500166 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3964a146-319a-4095-aec8-9469ea55705d-metrics-tls\") pod \"ingress-operator-5b745b69d9-q8kcw\" (UID: \"3964a146-319a-4095-aec8-9469ea55705d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-q8kcw"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.507104 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f5b832a4-7fec-4d2f-a400-1e890bb551b4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-k4qc7\" (UID: \"f5b832a4-7fec-4d2f-a400-1e890bb551b4\") " pod="openshift-marketplace/marketplace-operator-79b997595-k4qc7"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.516173 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/27c10200-5439-42ac-adfe-f5ab8de8b93f-srv-cert\") pod \"olm-operator-6b444d44fb-bsb9j\" (UID: \"27c10200-5439-42ac-adfe-f5ab8de8b93f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bsb9j"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.517747 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d02d944-f817-4258-b5e9-70dae96b646d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7hntq\" (UID: \"7d02d944-f817-4258-b5e9-70dae96b646d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7hntq"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.540474 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9fk82"]
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.551499 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-gbfq6"]
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.552014 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.552280 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/6756a1df-386b-4ee8-954b-bb4ae3829e58-ready\") pod \"cni-sysctl-allowlist-ds-cvm4l\" (UID: \"6756a1df-386b-4ee8-954b-bb4ae3829e58\") " pod="openshift-multus/cni-sysctl-allowlist-ds-cvm4l"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.552312 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/141bb3c3-ea35-4df5-8951-2c6423d792cc-config-volume\") pod \"dns-default-jb6n7\" (UID: \"141bb3c3-ea35-4df5-8951-2c6423d792cc\") " pod="openshift-dns/dns-default-jb6n7"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.552349 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6756a1df-386b-4ee8-954b-bb4ae3829e58-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-cvm4l\" (UID: \"6756a1df-386b-4ee8-954b-bb4ae3829e58\") " pod="openshift-multus/cni-sysctl-allowlist-ds-cvm4l"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.552367 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6ae59c41-bf25-4189-8a98-5e920f67a6ad-signing-key\") pod \"service-ca-9c57cc56f-clcn2\" (UID: \"6ae59c41-bf25-4189-8a98-5e920f67a6ad\") " pod="openshift-service-ca/service-ca-9c57cc56f-clcn2"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.552393 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf8kz\" (UniqueName: \"kubernetes.io/projected/6ae59c41-bf25-4189-8a98-5e920f67a6ad-kube-api-access-xf8kz\") pod \"service-ca-9c57cc56f-clcn2\" (UID: \"6ae59c41-bf25-4189-8a98-5e920f67a6ad\") " pod="openshift-service-ca/service-ca-9c57cc56f-clcn2"
Mar 19 16:41:42 crc kubenswrapper[4918]: E0319 16:41:42.552543 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:43.052473185 +0000 UTC m=+115.174672443 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.552642 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pddg\" (UniqueName: \"kubernetes.io/projected/6756a1df-386b-4ee8-954b-bb4ae3829e58-kube-api-access-8pddg\") pod \"cni-sysctl-allowlist-ds-cvm4l\" (UID: \"6756a1df-386b-4ee8-954b-bb4ae3829e58\") " pod="openshift-multus/cni-sysctl-allowlist-ds-cvm4l"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.552682 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3c343416-13c0-4513-a217-c885a3490f13-certs\") pod \"machine-config-server-nfhcd\" (UID: \"3c343416-13c0-4513-a217-c885a3490f13\") " pod="openshift-machine-config-operator/machine-config-server-nfhcd"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.552767 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/141bb3c3-ea35-4df5-8951-2c6423d792cc-metrics-tls\") pod \"dns-default-jb6n7\" (UID: \"141bb3c3-ea35-4df5-8951-2c6423d792cc\") " pod="openshift-dns/dns-default-jb6n7"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.552813 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f-mountpoint-dir\") pod \"csi-hostpathplugin-jl5dc\" (UID: \"3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f\") " pod="hostpath-provisioner/csi-hostpathplugin-jl5dc"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.552847 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v266d\" (UniqueName: \"kubernetes.io/projected/3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f-kube-api-access-v266d\") pod \"csi-hostpathplugin-jl5dc\" (UID: \"3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f\") " pod="hostpath-provisioner/csi-hostpathplugin-jl5dc"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.552927 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8xgh\" (UniqueName: \"kubernetes.io/projected/141bb3c3-ea35-4df5-8951-2c6423d792cc-kube-api-access-g8xgh\") pod \"dns-default-jb6n7\" (UID: \"141bb3c3-ea35-4df5-8951-2c6423d792cc\") " pod="openshift-dns/dns-default-jb6n7"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.552936 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/6756a1df-386b-4ee8-954b-bb4ae3829e58-ready\") pod \"cni-sysctl-allowlist-ds-cvm4l\" (UID: \"6756a1df-386b-4ee8-954b-bb4ae3829e58\") " pod="openshift-multus/cni-sysctl-allowlist-ds-cvm4l"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.553036 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6756a1df-386b-4ee8-954b-bb4ae3829e58-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-cvm4l\" (UID: \"6756a1df-386b-4ee8-954b-bb4ae3829e58\") " pod="openshift-multus/cni-sysctl-allowlist-ds-cvm4l"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.553065 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f-registration-dir\") pod \"csi-hostpathplugin-jl5dc\" (UID: \"3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f\") " pod="hostpath-provisioner/csi-hostpathplugin-jl5dc"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.553094 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f-csi-data-dir\") pod \"csi-hostpathplugin-jl5dc\" (UID: \"3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f\") " pod="hostpath-provisioner/csi-hostpathplugin-jl5dc"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.553130 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f-socket-dir\") pod \"csi-hostpathplugin-jl5dc\" (UID: \"3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f\") " pod="hostpath-provisioner/csi-hostpathplugin-jl5dc"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.553156 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f-plugins-dir\") pod \"csi-hostpathplugin-jl5dc\" (UID: \"3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f\") " pod="hostpath-provisioner/csi-hostpathplugin-jl5dc"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.553215 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3c343416-13c0-4513-a217-c885a3490f13-node-bootstrap-token\") pod \"machine-config-server-nfhcd\" (UID: \"3c343416-13c0-4513-a217-c885a3490f13\") " pod="openshift-machine-config-operator/machine-config-server-nfhcd"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.553235 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/141bb3c3-ea35-4df5-8951-2c6423d792cc-config-volume\") pod \"dns-default-jb6n7\" (UID: \"141bb3c3-ea35-4df5-8951-2c6423d792cc\") " pod="openshift-dns/dns-default-jb6n7"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.553247 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjj2g\" (UniqueName: \"kubernetes.io/projected/3c343416-13c0-4513-a217-c885a3490f13-kube-api-access-cjj2g\") pod \"machine-config-server-nfhcd\" (UID: \"3c343416-13c0-4513-a217-c885a3490f13\") " pod="openshift-machine-config-operator/machine-config-server-nfhcd"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.553281 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6ae59c41-bf25-4189-8a98-5e920f67a6ad-signing-cabundle\") pod \"service-ca-9c57cc56f-clcn2\" (UID: \"6ae59c41-bf25-4189-8a98-5e920f67a6ad\") " pod="openshift-service-ca/service-ca-9c57cc56f-clcn2"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.553590 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6756a1df-386b-4ee8-954b-bb4ae3829e58-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-cvm4l\" (UID: \"6756a1df-386b-4ee8-954b-bb4ae3829e58\") " pod="openshift-multus/cni-sysctl-allowlist-ds-cvm4l"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.553675 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6756a1df-386b-4ee8-954b-bb4ae3829e58-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-cvm4l\" (UID: \"6756a1df-386b-4ee8-954b-bb4ae3829e58\") " pod="openshift-multus/cni-sysctl-allowlist-ds-cvm4l"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.553730 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f-mountpoint-dir\") pod \"csi-hostpathplugin-jl5dc\" (UID: \"3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f\") " pod="hostpath-provisioner/csi-hostpathplugin-jl5dc"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.554192 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f-socket-dir\") pod \"csi-hostpathplugin-jl5dc\" (UID: \"3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f\") " pod="hostpath-provisioner/csi-hostpathplugin-jl5dc"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.554243 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6ae59c41-bf25-4189-8a98-5e920f67a6ad-signing-cabundle\") pod \"service-ca-9c57cc56f-clcn2\" (UID: \"6ae59c41-bf25-4189-8a98-5e920f67a6ad\") " pod="openshift-service-ca/service-ca-9c57cc56f-clcn2"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.554257 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f-registration-dir\") pod \"csi-hostpathplugin-jl5dc\" (UID: \"3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f\") " pod="hostpath-provisioner/csi-hostpathplugin-jl5dc"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.554326 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f-csi-data-dir\") pod \"csi-hostpathplugin-jl5dc\" (UID: \"3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f\") " pod="hostpath-provisioner/csi-hostpathplugin-jl5dc"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.561337 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f-plugins-dir\") pod \"csi-hostpathplugin-jl5dc\" (UID: \"3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f\") " pod="hostpath-provisioner/csi-hostpathplugin-jl5dc"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.565387 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9417c6d-34fd-465b-b780-b88ee938f824-bound-sa-token\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.569936 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46cxb\" (UniqueName: \"kubernetes.io/projected/3964a146-319a-4095-aec8-9469ea55705d-kube-api-access-46cxb\") pod \"ingress-operator-5b745b69d9-q8kcw\" (UID: \"3964a146-319a-4095-aec8-9469ea55705d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-q8kcw"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.571360 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef12d86d-5629-4a7c-948c-9229440d073a-serving-cert\") pod \"service-ca-operator-777779d784-92pms\" (UID: \"ef12d86d-5629-4a7c-948c-9229440d073a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-92pms"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.571764 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6ae59c41-bf25-4189-8a98-5e920f67a6ad-signing-key\") pod \"service-ca-9c57cc56f-clcn2\" (UID: \"6ae59c41-bf25-4189-8a98-5e920f67a6ad\") " pod="openshift-service-ca/service-ca-9c57cc56f-clcn2"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.580723 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/141bb3c3-ea35-4df5-8951-2c6423d792cc-metrics-tls\") pod \"dns-default-jb6n7\" (UID: \"141bb3c3-ea35-4df5-8951-2c6423d792cc\") " pod="openshift-dns/dns-default-jb6n7"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.581735 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxr2j\" (UniqueName: \"kubernetes.io/projected/7a6ce6f1-2aa5-44d7-b3f2-3b062ee500c0-kube-api-access-vxr2j\") pod \"migrator-59844c95c7-wvklt\" (UID: \"7a6ce6f1-2aa5-44d7-b3f2-3b062ee500c0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wvklt"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.583086 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3c343416-13c0-4513-a217-c885a3490f13-node-bootstrap-token\") pod \"machine-config-server-nfhcd\" (UID: \"3c343416-13c0-4513-a217-c885a3490f13\") " pod="openshift-machine-config-operator/machine-config-server-nfhcd"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.590330 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3c343416-13c0-4513-a217-c885a3490f13-certs\") pod \"machine-config-server-nfhcd\" (UID: \"3c343416-13c0-4513-a217-c885a3490f13\") " pod="openshift-machine-config-operator/machine-config-server-nfhcd"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.595203 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvsc9\" (UniqueName: \"kubernetes.io/projected/ef12d86d-5629-4a7c-948c-9229440d073a-kube-api-access-mvsc9\") pod \"service-ca-operator-777779d784-92pms\" (UID: \"ef12d86d-5629-4a7c-948c-9229440d073a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-92pms"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.599309 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8vgs\" (UniqueName: \"kubernetes.io/projected/3c005691-aa02-475b-94fd-f8a16d0f4de5-kube-api-access-l8vgs\") pod \"package-server-manager-789f6589d5-gl4jw\" (UID: \"3c005691-aa02-475b-94fd-f8a16d0f4de5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gl4jw"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.602306 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-92pms"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.629150 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llzxz\" (UniqueName: \"kubernetes.io/projected/e9417c6d-34fd-465b-b780-b88ee938f824-kube-api-access-llzxz\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.648124 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhpl9\" (UniqueName: \"kubernetes.io/projected/49540bde-3719-4e5d-acf8-ee877d99f581-kube-api-access-lhpl9\") pod \"machine-config-controller-84d6567774-4ztb9\" (UID: \"49540bde-3719-4e5d-acf8-ee877d99f581\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4ztb9"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.654178 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9"
Mar 19 16:41:42 crc kubenswrapper[4918]: E0319 16:41:42.654780 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:43.154766312 +0000 UTC m=+115.276965560 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.675908 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7729m\" (UniqueName: \"kubernetes.io/projected/27c10200-5439-42ac-adfe-f5ab8de8b93f-kube-api-access-7729m\") pod \"olm-operator-6b444d44fb-bsb9j\" (UID: \"27c10200-5439-42ac-adfe-f5ab8de8b93f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bsb9j"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.678781 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgnqk\" (UniqueName: \"kubernetes.io/projected/f5b832a4-7fec-4d2f-a400-1e890bb551b4-kube-api-access-sgnqk\") pod \"marketplace-operator-79b997595-k4qc7\" (UID: \"f5b832a4-7fec-4d2f-a400-1e890bb551b4\") " pod="openshift-marketplace/marketplace-operator-79b997595-k4qc7"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.700543 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wvklt"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.718210 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8cxp\" (UniqueName: \"kubernetes.io/projected/e623c8c8-da7e-4af5-aae6-95bf127c57d7-kube-api-access-w8cxp\") pod \"kube-storage-version-migrator-operator-b67b599dd-mbzd2\" (UID: \"e623c8c8-da7e-4af5-aae6-95bf127c57d7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbzd2"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.729444 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fea5d2be-19d3-4636-b55f-1254a484f36a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-24fgt\" (UID: \"fea5d2be-19d3-4636-b55f-1254a484f36a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24fgt"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.755755 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 16:41:42 crc kubenswrapper[4918]: E0319 16:41:42.756228 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:43.256210555 +0000 UTC m=+115.378409793 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.758460 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlv2z\" (UniqueName: \"kubernetes.io/projected/ca6c2a92-4376-4b9b-9c73-c29ee0d09082-kube-api-access-hlv2z\") pod \"router-default-5444994796-4rm5n\" (UID: \"ca6c2a92-4376-4b9b-9c73-c29ee0d09082\") " pod="openshift-ingress/router-default-5444994796-4rm5n"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.764311 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhfk8\" (UniqueName: \"kubernetes.io/projected/9ab8ff58-9592-4b8c-8ca4-9b08b092e785-kube-api-access-hhfk8\") pod \"ingress-canary-fcbfm\" (UID: \"9ab8ff58-9592-4b8c-8ca4-9b08b092e785\") " pod="openshift-ingress-canary/ingress-canary-fcbfm"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.766831 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-4rm5n"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.775932 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbzd2"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.790663 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxrx4\" (UniqueName: \"kubernetes.io/projected/ccf81a8c-f372-4162-81a1-a3df96e88bf1-kube-api-access-mxrx4\") pod \"multus-admission-controller-857f4d67dd-w2fmz\" (UID: \"ccf81a8c-f372-4162-81a1-a3df96e88bf1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-w2fmz"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.839032 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7hntq"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.841813 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bsb9j"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.854724 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gqb2\" (UniqueName: \"kubernetes.io/projected/e7c2e54b-e056-4df4-b565-45f068f9f8da-kube-api-access-6gqb2\") pod \"machine-config-operator-74547568cd-9ft8l\" (UID: \"e7c2e54b-e056-4df4-b565-45f068f9f8da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9ft8l"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.857871 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9"
Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.857998 4918 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z55k4\" (UniqueName: \"kubernetes.io/projected/7de76665-62ca-42ca-94ad-537d7789d1a1-kube-api-access-z55k4\") pod \"route-controller-manager-6576b87f9c-tvdlw\" (UID: \"7de76665-62ca-42ca-94ad-537d7789d1a1\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw" Mar 19 16:41:42 crc kubenswrapper[4918]: E0319 16:41:42.858407 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:43.35838781 +0000 UTC m=+115.480587058 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.859490 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3964a146-319a-4095-aec8-9469ea55705d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-q8kcw\" (UID: \"3964a146-319a-4095-aec8-9469ea55705d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-q8kcw" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.861487 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-k4qc7" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.862289 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89j9d\" (UniqueName: \"kubernetes.io/projected/fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0-kube-api-access-89j9d\") pod \"collect-profiles-29565630-wz4vn\" (UID: \"fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565630-wz4vn" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.868085 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4ztb9" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.870023 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24fgt" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.879172 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-w2fmz" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.886275 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gl4jw" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.898163 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr7g2\" (UniqueName: \"kubernetes.io/projected/517f8bf9-4476-4ca5-a28d-09c375b891fd-kube-api-access-zr7g2\") pod \"machine-approver-56656f9798-n26b6\" (UID: \"517f8bf9-4476-4ca5-a28d-09c375b891fd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n26b6" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.910694 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fcbfm" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.924904 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565630-wz4vn" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.926132 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf8kz\" (UniqueName: \"kubernetes.io/projected/6ae59c41-bf25-4189-8a98-5e920f67a6ad-kube-api-access-xf8kz\") pod \"service-ca-9c57cc56f-clcn2\" (UID: \"6ae59c41-bf25-4189-8a98-5e920f67a6ad\") " pod="openshift-service-ca/service-ca-9c57cc56f-clcn2" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.930986 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-clcn2" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.944834 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v266d\" (UniqueName: \"kubernetes.io/projected/3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f-kube-api-access-v266d\") pod \"csi-hostpathplugin-jl5dc\" (UID: \"3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f\") " pod="hostpath-provisioner/csi-hostpathplugin-jl5dc" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.962327 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:42 crc kubenswrapper[4918]: E0319 16:41:42.962901 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-19 16:41:43.462880172 +0000 UTC m=+115.585079420 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.971636 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8xgh\" (UniqueName: \"kubernetes.io/projected/141bb3c3-ea35-4df5-8951-2c6423d792cc-kube-api-access-g8xgh\") pod \"dns-default-jb6n7\" (UID: \"141bb3c3-ea35-4df5-8951-2c6423d792cc\") " pod="openshift-dns/dns-default-jb6n7" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.972052 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.978667 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n26b6" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.978672 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-pz8dd"] Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.987029 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-jl5dc" Mar 19 16:41:42 crc kubenswrapper[4918]: I0319 16:41:42.988053 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-td7k5"] Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:42.994778 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjj2g\" (UniqueName: \"kubernetes.io/projected/3c343416-13c0-4513-a217-c885a3490f13-kube-api-access-cjj2g\") pod \"machine-config-server-nfhcd\" (UID: \"3c343416-13c0-4513-a217-c885a3490f13\") " pod="openshift-machine-config-operator/machine-config-server-nfhcd" Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:42.995468 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbl8z"] Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:42.997360 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jb6n7" Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.002857 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pddg\" (UniqueName: \"kubernetes.io/projected/6756a1df-386b-4ee8-954b-bb4ae3829e58-kube-api-access-8pddg\") pod \"cni-sysctl-allowlist-ds-cvm4l\" (UID: \"6756a1df-386b-4ee8-954b-bb4ae3829e58\") " pod="openshift-multus/cni-sysctl-allowlist-ds-cvm4l" Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.070345 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:43 crc kubenswrapper[4918]: E0319 16:41:43.072106 4918 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:43.572091704 +0000 UTC m=+115.694290952 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.084665 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-q8kcw" Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.120358 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9ft8l" Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.135416 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h9xcq"] Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.144601 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-bvkm6"] Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.150204 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb"] Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.156156 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-84tp9"] Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.173462 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:43 crc kubenswrapper[4918]: E0319 16:41:43.174093 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:43.674062653 +0000 UTC m=+115.796261901 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.178776 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f8mlp"] Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.245268 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-nfhcd" Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.268788 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-td7k5" event={"ID":"5189c318-e4b1-4dd9-9a6d-284425d319cf","Type":"ContainerStarted","Data":"2bf9d91ff354780f10ca10b51e4c46dd4d967e6d76a889c72819f102e01d42da"} Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.275444 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:43 crc kubenswrapper[4918]: E0319 16:41:43.275890 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:43.775875156 +0000 UTC m=+115.898074404 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.288613 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-cvm4l" Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.293016 4918 generic.go:334] "Generic (PLEG): container finished" podID="42099723-6874-4d2d-a1ee-e7fd8db3f66c" containerID="fff8a6f05349f36412e1ce8698e1f782c0d09761d306a02b863896f0cf9f6046" exitCode=0 Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.293149 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8shmz" event={"ID":"42099723-6874-4d2d-a1ee-e7fd8db3f66c","Type":"ContainerDied","Data":"fff8a6f05349f36412e1ce8698e1f782c0d09761d306a02b863896f0cf9f6046"} Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.293181 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8shmz" event={"ID":"42099723-6874-4d2d-a1ee-e7fd8db3f66c","Type":"ContainerStarted","Data":"6da2bdea863eac68b47bc3ae9c891745ad81157b0e500d6b04834061ca046538"} Mar 19 16:41:43 crc kubenswrapper[4918]: W0319 16:41:43.296141 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda440b991_4ffb_4d2e_aa90_fa5e731d9cff.slice/crio-8496291a210d8b5579396db2d0ba9dc5a2954de78041cc87efeb0afae8940b16 WatchSource:0}: Error finding container 
8496291a210d8b5579396db2d0ba9dc5a2954de78041cc87efeb0afae8940b16: Status 404 returned error can't find the container with id 8496291a210d8b5579396db2d0ba9dc5a2954de78041cc87efeb0afae8940b16 Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.305538 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vtw7s"] Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.305912 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mvf4s" event={"ID":"26c329d7-2138-4f3b-81cd-4b8c0a595a27","Type":"ContainerStarted","Data":"782b4f943de3895a6fcac7fc30ed2963467226e74a66120c0e329d185d492e02"} Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.305981 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mvf4s" event={"ID":"26c329d7-2138-4f3b-81cd-4b8c0a595a27","Type":"ContainerStarted","Data":"cfc02d476f99b2212cc6dd19492cd3bfbc5fa452fa824d711de59435eb021b2e"} Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.337632 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7lqkl"] Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.366737 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-gbfq6" event={"ID":"4f9fa2d8-df01-417a-9170-d1e8288e4111","Type":"ContainerStarted","Data":"b40dc74a937ce6ff558786166a7033188675080a0324248132d38d5755069656"} Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.366793 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-gbfq6" event={"ID":"4f9fa2d8-df01-417a-9170-d1e8288e4111","Type":"ContainerStarted","Data":"fd44986cb3a47f766b597ac8db5c1c600ea1ef05ba33fd4fcca1edd9c61876d9"} Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 
16:41:43.381944 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mnwrv"] Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.382086 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:43 crc kubenswrapper[4918]: E0319 16:41:43.382912 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:43.882884023 +0000 UTC m=+116.005083271 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:43 crc kubenswrapper[4918]: W0319 16:41:43.386083 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc490d1f_5ebc_43f0_bb00_7c7c0355a8cf.slice/crio-a9b0314ee8cae05111899e7eb568707b7de40930337b0d5f1bdc6e9ac415adb8 WatchSource:0}: Error finding container a9b0314ee8cae05111899e7eb568707b7de40930337b0d5f1bdc6e9ac415adb8: Status 404 returned error can't find the container with id a9b0314ee8cae05111899e7eb568707b7de40930337b0d5f1bdc6e9ac415adb8 Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.390035 4918 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7zczz"] Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.394025 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pmg4f"] Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.453931 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" event={"ID":"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9","Type":"ContainerStarted","Data":"4a6b0ddf1a2044e78eb49272d4c1072f0e76af7a65b29d51bbfce6532c965fba"} Mar 19 16:41:43 crc kubenswrapper[4918]: W0319 16:41:43.470280 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e8d955e_01e0_4fe0_a713_20f4e83f8cca.slice/crio-ec9622c043590398793620c5c55d0458277e468e32e5d5863882b5b2f491e0d9 WatchSource:0}: Error finding container ec9622c043590398793620c5c55d0458277e468e32e5d5863882b5b2f491e0d9: Status 404 returned error can't find the container with id ec9622c043590398793620c5c55d0458277e468e32e5d5863882b5b2f491e0d9 Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.485025 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:43 crc kubenswrapper[4918]: E0319 16:41:43.501018 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:44.000963359 +0000 UTC m=+116.123162607 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.508519 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-4rm5n" event={"ID":"ca6c2a92-4376-4b9b-9c73-c29ee0d09082","Type":"ContainerStarted","Data":"1b9124b4d97e1cc4105bca745c2710ff8619adaf182bed7e4a03a9a7e49fe229"} Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.529783 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9fk82" event={"ID":"22e4c597-7103-43b5-a54b-4a0cf131a749","Type":"ContainerStarted","Data":"8cac0230a33e0c774b518c443146519d88827c18e9652243a95d5ae73c7377bd"} Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.530300 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9fk82" event={"ID":"22e4c597-7103-43b5-a54b-4a0cf131a749","Type":"ContainerStarted","Data":"d3a5120d08e2dc63752cd61350685ff7de1042bc70e7a3747fcf4431e53aa2aa"} Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.532105 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-pz4gm" event={"ID":"3d68da4e-0dc1-4835-99f3-a5703db9288e","Type":"ContainerStarted","Data":"0f02971c48ceab080f8bdee8b76e88944c6c17096156e45497fa4fff3f18808e"} Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.532173 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/downloads-7954f5f757-pz4gm" event={"ID":"3d68da4e-0dc1-4835-99f3-a5703db9288e","Type":"ContainerStarted","Data":"574664858c4a0bdaed0918a9096d35623318edc00e84394097e64e036bf075a2"} Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.532707 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-pz4gm" Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.534159 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-pz8dd" event={"ID":"d9b7f6a4-5987-4b92-b063-2ddf9ad42074","Type":"ContainerStarted","Data":"67362873a6aed0eaa62cf00db789b95e341389f55d48a8ab732a4234ad3c14c6"} Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.550212 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbzd2"] Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.552372 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wvklt"] Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.557605 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-92pms"] Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.560011 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hkhgj"] Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.576654 4918 patch_prober.go:28] interesting pod/downloads-7954f5f757-pz4gm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.576735 4918 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-pz4gm" podUID="3d68da4e-0dc1-4835-99f3-a5703db9288e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.587792 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:43 crc kubenswrapper[4918]: E0319 16:41:43.588262 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:44.088239722 +0000 UTC m=+116.210438970 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.639075 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7hntq"] Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.641740 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5c6nk"] Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.666607 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24fgt"] Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.689099 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:43 crc kubenswrapper[4918]: E0319 16:41:43.691198 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:44.191183579 +0000 UTC m=+116.313382827 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:43 crc kubenswrapper[4918]: W0319 16:41:43.724590 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6756a1df_386b_4ee8_954b_bb4ae3829e58.slice/crio-21a278c8bef63c30020ecc29e2fc332eded175452bc54840e988f3e75da4b49d WatchSource:0}: Error finding container 21a278c8bef63c30020ecc29e2fc332eded175452bc54840e988f3e75da4b49d: Status 404 returned error can't find the container with id 21a278c8bef63c30020ecc29e2fc332eded175452bc54840e988f3e75da4b49d Mar 19 16:41:43 crc kubenswrapper[4918]: W0319 16:41:43.741094 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode623c8c8_da7e_4af5_aae6_95bf127c57d7.slice/crio-4d8db72c649ddfce0589901bd4b65e344206e5714e1d10a2bf4f6a358c4f1d1f WatchSource:0}: Error finding container 4d8db72c649ddfce0589901bd4b65e344206e5714e1d10a2bf4f6a358c4f1d1f: Status 404 returned error can't find the container with id 4d8db72c649ddfce0589901bd4b65e344206e5714e1d10a2bf4f6a358c4f1d1f Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.757129 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bsb9j"] Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.760822 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k4qc7"] Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.760892 4918 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-clcn2"] Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.768488 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gl4jw"] Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.778184 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-w2fmz"] Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.789787 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:43 crc kubenswrapper[4918]: E0319 16:41:43.790386 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:44.290360633 +0000 UTC m=+116.412559881 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:43 crc kubenswrapper[4918]: W0319 16:41:43.827273 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c005691_aa02_475b_94fd_f8a16d0f4de5.slice/crio-d3a3b9334161384f296a4e8b294bab9edeafd806bc6d0c8fed325b707fb4ab81 WatchSource:0}: Error finding container d3a3b9334161384f296a4e8b294bab9edeafd806bc6d0c8fed325b707fb4ab81: Status 404 returned error can't find the container with id d3a3b9334161384f296a4e8b294bab9edeafd806bc6d0c8fed325b707fb4ab81 Mar 19 16:41:43 crc kubenswrapper[4918]: W0319 16:41:43.887952 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ae59c41_bf25_4189_8a98_5e920f67a6ad.slice/crio-41ad143aba139f8c7b5a5de5a8d2ec5ad6f157cf949097404edb8e323a889d51 WatchSource:0}: Error finding container 41ad143aba139f8c7b5a5de5a8d2ec5ad6f157cf949097404edb8e323a889d51: Status 404 returned error can't find the container with id 41ad143aba139f8c7b5a5de5a8d2ec5ad6f157cf949097404edb8e323a889d51 Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.892754 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:43 crc 
kubenswrapper[4918]: E0319 16:41:43.893245 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:44.393228799 +0000 UTC m=+116.515428047 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.934327 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fcbfm"] Mar 19 16:41:43 crc kubenswrapper[4918]: I0319 16:41:43.986030 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9fk82" podStartSLOduration=46.986001994 podStartE2EDuration="46.986001994s" podCreationTimestamp="2026-03-19 16:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:43.948572496 +0000 UTC m=+116.070771744" watchObservedRunningTime="2026-03-19 16:41:43.986001994 +0000 UTC m=+116.108201242" Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.015951 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 
16:41:44 crc kubenswrapper[4918]: E0319 16:41:44.016334 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:44.516315532 +0000 UTC m=+116.638514780 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.025749 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-q8kcw"] Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.030989 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jb6n7"] Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.033147 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565630-wz4vn"] Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.035496 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw"] Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.117790 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:44 crc kubenswrapper[4918]: E0319 16:41:44.118235 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:44.618218788 +0000 UTC m=+116.740418046 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.123920 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jl5dc"] Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.137093 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4ztb9"] Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.145230 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-9ft8l"] Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.219077 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:44 crc kubenswrapper[4918]: E0319 16:41:44.219435 4918 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:44.719398893 +0000 UTC m=+116.841598151 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.220050 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:44 crc kubenswrapper[4918]: E0319 16:41:44.220403 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:44.720388052 +0000 UTC m=+116.842587290 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.221008 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-gbfq6" podStartSLOduration=48.22098698 podStartE2EDuration="48.22098698s" podCreationTimestamp="2026-03-19 16:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:44.179535832 +0000 UTC m=+116.301735080" watchObservedRunningTime="2026-03-19 16:41:44.22098698 +0000 UTC m=+116.343186218" Mar 19 16:41:44 crc kubenswrapper[4918]: W0319 16:41:44.226739 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa71e1c7_4bbd_4a5d_b07c_c9f232bc4ae0.slice/crio-a9897dabd415fe600f130cd516fe6ea7a0b7059d1bb6b99b7926a24742725fa2 WatchSource:0}: Error finding container a9897dabd415fe600f130cd516fe6ea7a0b7059d1bb6b99b7926a24742725fa2: Status 404 returned error can't find the container with id a9897dabd415fe600f130cd516fe6ea7a0b7059d1bb6b99b7926a24742725fa2 Mar 19 16:41:44 crc kubenswrapper[4918]: W0319 16:41:44.240834 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod141bb3c3_ea35_4df5_8951_2c6423d792cc.slice/crio-44a8bf709d6006f582c7fa2a1d7f8cc966567670cced2f51a42ff88c778c5367 WatchSource:0}: Error finding container 
44a8bf709d6006f582c7fa2a1d7f8cc966567670cced2f51a42ff88c778c5367: Status 404 returned error can't find the container with id 44a8bf709d6006f582c7fa2a1d7f8cc966567670cced2f51a42ff88c778c5367 Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.320733 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:44 crc kubenswrapper[4918]: E0319 16:41:44.320925 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:44.820896206 +0000 UTC m=+116.943095444 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.321086 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.321148 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69770981-c309-4aa4-ba5a-29bf78372aae-metrics-certs\") pod \"network-metrics-daemon-qcgd2\" (UID: \"69770981-c309-4aa4-ba5a-29bf78372aae\") " pod="openshift-multus/network-metrics-daemon-qcgd2" Mar 19 16:41:44 crc kubenswrapper[4918]: E0319 16:41:44.321668 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:44.821654889 +0000 UTC m=+116.943854137 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.329015 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69770981-c309-4aa4-ba5a-29bf78372aae-metrics-certs\") pod \"network-metrics-daemon-qcgd2\" (UID: \"69770981-c309-4aa4-ba5a-29bf78372aae\") " pod="openshift-multus/network-metrics-daemon-qcgd2" Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.337688 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-pz4gm" podStartSLOduration=48.337667593 podStartE2EDuration="48.337667593s" podCreationTimestamp="2026-03-19 16:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:44.335615122 +0000 UTC m=+116.457814370" 
watchObservedRunningTime="2026-03-19 16:41:44.337667593 +0000 UTC m=+116.459866841" Mar 19 16:41:44 crc kubenswrapper[4918]: W0319 16:41:44.339457 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7c2e54b_e056_4df4_b565_45f068f9f8da.slice/crio-b965c2fc4df3a33b4616b802e4c7616f88a5e32082fed79c58f441fe92c5df4b WatchSource:0}: Error finding container b965c2fc4df3a33b4616b802e4c7616f88a5e32082fed79c58f441fe92c5df4b: Status 404 returned error can't find the container with id b965c2fc4df3a33b4616b802e4c7616f88a5e32082fed79c58f441fe92c5df4b Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.422683 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:44 crc kubenswrapper[4918]: E0319 16:41:44.422813 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:44.922789802 +0000 UTC m=+117.044989050 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.423663 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:44 crc kubenswrapper[4918]: E0319 16:41:44.424070 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:44.9240575 +0000 UTC m=+117.046256748 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.527223 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:44 crc kubenswrapper[4918]: E0319 16:41:44.527360 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:45.027326427 +0000 UTC m=+117.149525665 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.527466 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:44 crc kubenswrapper[4918]: E0319 16:41:44.528000 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:45.027986286 +0000 UTC m=+117.150185534 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.535834 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcgd2" Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.558020 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbzd2" event={"ID":"e623c8c8-da7e-4af5-aae6-95bf127c57d7","Type":"ContainerStarted","Data":"4d8db72c649ddfce0589901bd4b65e344206e5714e1d10a2bf4f6a358c4f1d1f"} Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.572836 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-q8kcw" event={"ID":"3964a146-319a-4095-aec8-9469ea55705d","Type":"ContainerStarted","Data":"8b01305721cfaf6977663953a312ab658dfbb9ee238b2394fcd9451313d292d7"} Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.629394 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k4qc7" event={"ID":"f5b832a4-7fec-4d2f-a400-1e890bb551b4","Type":"ContainerStarted","Data":"a5948eeb7e6d17622dbae1d313e27afd9cdb3cf2cf44c3641ef49c328e1daacd"} Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.629471 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hkhgj" event={"ID":"f0a8c78b-296b-4baa-91f2-bb9696fdc134","Type":"ContainerStarted","Data":"350d7ad706f0cf7d8ea3a15c15cf865c2b355e293bf23c8d5765bce6ca8a66ff"} Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.629501 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hkhgj" event={"ID":"f0a8c78b-296b-4baa-91f2-bb9696fdc134","Type":"ContainerStarted","Data":"7a9bed7d5ebf5de8e33ffdd895e04f17b7c7f42baddd88119f5a6ff84ee17742"} Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.630958 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:44 crc kubenswrapper[4918]: E0319 16:41:44.631264 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:45.131237612 +0000 UTC m=+117.253436920 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.641439 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-84tp9" event={"ID":"ef184530-a1ee-415c-a683-0588bf7f3ffb","Type":"ContainerStarted","Data":"f6c82dd2a3bd4cd56dce3d2098e7534abf327b7be4eba2b7f633821e85e2248e"} Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.642800 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-84tp9" Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.648592 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" event={"ID":"f2bedb86-27a0-40a4-a97c-10f1f287fc00","Type":"ContainerStarted","Data":"3623c52e878750369de659500fb960611b4f32d47d5192591566f93c42ffd008"} Mar 19 16:41:44 crc 
kubenswrapper[4918]: I0319 16:41:44.650777 4918 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-84tp9 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.650826 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-84tp9" podUID="ef184530-a1ee-415c-a683-0588bf7f3ffb" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.659829 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9ft8l" event={"ID":"e7c2e54b-e056-4df4-b565-45f068f9f8da","Type":"ContainerStarted","Data":"b965c2fc4df3a33b4616b802e4c7616f88a5e32082fed79c58f441fe92c5df4b"} Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.667380 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n26b6" event={"ID":"517f8bf9-4476-4ca5-a28d-09c375b891fd","Type":"ContainerStarted","Data":"184036ed64978800c2b232fe037ea8b2c330aba815943c513dacb6518468b1b7"} Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.667438 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n26b6" event={"ID":"517f8bf9-4476-4ca5-a28d-09c375b891fd","Type":"ContainerStarted","Data":"555a7e8d2765384bed5fc684781a01b46f66517bfafd8d46e266af0df23fa7cb"} Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.670545 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hkhgj" 
podStartSLOduration=48.670428492 podStartE2EDuration="48.670428492s" podCreationTimestamp="2026-03-19 16:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:44.66867757 +0000 UTC m=+116.790876828" watchObservedRunningTime="2026-03-19 16:41:44.670428492 +0000 UTC m=+116.792627740" Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.672335 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbl8z" event={"ID":"117e7cf2-e68f-423d-a312-f1d63c3b815b","Type":"ContainerStarted","Data":"bb089f09700fe6961655e987761bab423a0d7008879b6c68ba601195a5ae55c9"} Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.705114 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-84tp9" podStartSLOduration=47.705084618 podStartE2EDuration="47.705084618s" podCreationTimestamp="2026-03-19 16:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:44.701500341 +0000 UTC m=+116.823699589" watchObservedRunningTime="2026-03-19 16:41:44.705084618 +0000 UTC m=+116.827283856" Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.721015 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bsb9j" event={"ID":"27c10200-5439-42ac-adfe-f5ab8de8b93f","Type":"ContainerStarted","Data":"9b56c81170af43a0f4d5577930ff1b3abcf6d9168de64dc818977c30cc8fcee8"} Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.721234 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bsb9j" Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.722600 4918 patch_prober.go:28] interesting 
pod/olm-operator-6b444d44fb-bsb9j container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.722656 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bsb9j" podUID="27c10200-5439-42ac-adfe-f5ab8de8b93f" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.730115 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24fgt" event={"ID":"fea5d2be-19d3-4636-b55f-1254a484f36a","Type":"ContainerStarted","Data":"5c50e3997000d62903ef7b7d576709256c88af121cea5cda63cd7ad4897a0b8f"} Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.731675 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-clcn2" event={"ID":"6ae59c41-bf25-4189-8a98-5e920f67a6ad","Type":"ContainerStarted","Data":"41ad143aba139f8c7b5a5de5a8d2ec5ad6f157cf949097404edb8e323a889d51"} Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.737549 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:44 crc kubenswrapper[4918]: E0319 16:41:44.739253 4918 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:45.239235988 +0000 UTC m=+117.361435236 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.763450 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-pz8dd" event={"ID":"d9b7f6a4-5987-4b92-b063-2ddf9ad42074","Type":"ContainerStarted","Data":"23aba47af7908f228f45bb55415255e2099deee5313b52599ff80cda50a39e59"} Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.774337 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bsb9j" podStartSLOduration=47.774303686 podStartE2EDuration="47.774303686s" podCreationTimestamp="2026-03-19 16:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:44.755286284 +0000 UTC m=+116.877485532" watchObservedRunningTime="2026-03-19 16:41:44.774303686 +0000 UTC m=+116.896502934" Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.778452 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-w2fmz" event={"ID":"ccf81a8c-f372-4162-81a1-a3df96e88bf1","Type":"ContainerStarted","Data":"677d859471af04b9f8da69555a05be98f91c6d24a4c238de84d84693d6d53df4"} Mar 19 16:41:44 crc 
kubenswrapper[4918]: I0319 16:41:44.791658 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4ztb9" event={"ID":"49540bde-3719-4e5d-acf8-ee877d99f581","Type":"ContainerStarted","Data":"0f0a81086099240e9829ba5fd4e9ffbeba58ef1099d28138fb39e55d34230467"} Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.795737 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-bvkm6" event={"ID":"a440b991-4ffb-4d2e-aa90-fa5e731d9cff","Type":"ContainerStarted","Data":"8496291a210d8b5579396db2d0ba9dc5a2954de78041cc87efeb0afae8940b16"} Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.806602 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fcbfm" event={"ID":"9ab8ff58-9592-4b8c-8ca4-9b08b092e785","Type":"ContainerStarted","Data":"b79768259bbced8c941d950b90f41224072fa29d07391f4077386c71bc725ce8"} Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.817354 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-bvkm6" podStartSLOduration=47.81733884 podStartE2EDuration="47.81733884s" podCreationTimestamp="2026-03-19 16:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:44.815798015 +0000 UTC m=+116.937997263" watchObservedRunningTime="2026-03-19 16:41:44.81733884 +0000 UTC m=+116.939538088" Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.836069 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-nfhcd" event={"ID":"3c343416-13c0-4513-a217-c885a3490f13","Type":"ContainerStarted","Data":"a721a7832dc0860643c8a2e9e0dc71393dbb90b4b75c76c8e0171d865f7a8d06"} Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.838714 4918 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:44 crc kubenswrapper[4918]: E0319 16:41:44.842160 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:45.342125474 +0000 UTC m=+117.464324772 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.852261 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-5c6nk" event={"ID":"8848d8b1-3fa3-4d27-b9d7-803e2a884bfc","Type":"ContainerStarted","Data":"35a0e8bcacaf666d31665d556980521509b4c75ba0cf7e45614487c5e4b9c03d"} Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.853001 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-5c6nk" Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.853916 4918 patch_prober.go:28] interesting pod/console-operator-58897d9998-5c6nk container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/readyz\": dial tcp 10.217.0.24:8443: 
connect: connection refused" start-of-body= Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.853957 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-5c6nk" podUID="8848d8b1-3fa3-4d27-b9d7-803e2a884bfc" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/readyz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.921256 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mvf4s" event={"ID":"26c329d7-2138-4f3b-81cd-4b8c0a595a27","Type":"ContainerStarted","Data":"f3ff4456bf8f1633f4562ba9959ac07b68b98c0d012d398db3b9b7f6c0f1f348"} Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.937711 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-4rm5n" event={"ID":"ca6c2a92-4376-4b9b-9c73-c29ee0d09082","Type":"ContainerStarted","Data":"f5ff0d68fe6421a5632753b6c1b2d3811a3f10e6717030be343b9606c63762d9"} Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.946401 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:44 crc kubenswrapper[4918]: E0319 16:41:44.946788 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:45.446775331 +0000 UTC m=+117.568974579 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.960689 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wvklt" event={"ID":"7a6ce6f1-2aa5-44d7-b3f2-3b062ee500c0","Type":"ContainerStarted","Data":"1aa2d2bc22899ba58a700823791555c67cb0c29ffc9b9b42e2849f6c598e16ca"} Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.982360 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-5c6nk" podStartSLOduration=48.982339484 podStartE2EDuration="48.982339484s" podCreationTimestamp="2026-03-19 16:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:44.906082447 +0000 UTC m=+117.028281695" watchObservedRunningTime="2026-03-19 16:41:44.982339484 +0000 UTC m=+117.104538732" Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.982974 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-mvf4s" podStartSLOduration=47.982967342 podStartE2EDuration="47.982967342s" podCreationTimestamp="2026-03-19 16:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:44.972118291 +0000 UTC m=+117.094317529" watchObservedRunningTime="2026-03-19 16:41:44.982967342 +0000 UTC m=+117.105166600" Mar 19 16:41:44 crc 
kubenswrapper[4918]: I0319 16:41:44.989255 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vtw7s" event={"ID":"dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf","Type":"ContainerStarted","Data":"dff8b7d77af83cd42e7ed38b98972571549826cc6d361c5dd353ebd583c65c87"} Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.989330 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vtw7s" event={"ID":"dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf","Type":"ContainerStarted","Data":"a9b0314ee8cae05111899e7eb568707b7de40930337b0d5f1bdc6e9ac415adb8"} Mar 19 16:41:44 crc kubenswrapper[4918]: I0319 16:41:44.990659 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vtw7s" Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.034179 4918 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-vtw7s container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" start-of-body= Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.034265 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vtw7s" podUID="dc490d1f-5ebc-43f0-bb00-7c7c0355a8cf" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.035413 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8shmz" event={"ID":"42099723-6874-4d2d-a1ee-e7fd8db3f66c","Type":"ContainerStarted","Data":"f420def6a81289db3518b3a69905622b4fff189a5b4431f4cfe9fe299820303d"} Mar 19 16:41:45 crc 
kubenswrapper[4918]: I0319 16:41:45.035458 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8shmz" Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.052640 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:45 crc kubenswrapper[4918]: E0319 16:41:45.054061 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:45.554038506 +0000 UTC m=+117.676237754 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.057672 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gl4jw" event={"ID":"3c005691-aa02-475b-94fd-f8a16d0f4de5","Type":"ContainerStarted","Data":"d3a3b9334161384f296a4e8b294bab9edeafd806bc6d0c8fed325b707fb4ab81"} Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.077724 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jb6n7" 
event={"ID":"141bb3c3-ea35-4df5-8951-2c6423d792cc","Type":"ContainerStarted","Data":"44a8bf709d6006f582c7fa2a1d7f8cc966567670cced2f51a42ff88c778c5367"} Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.090328 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vtw7s" podStartSLOduration=48.09031211 podStartE2EDuration="48.09031211s" podCreationTimestamp="2026-03-19 16:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:45.083329543 +0000 UTC m=+117.205528791" watchObservedRunningTime="2026-03-19 16:41:45.09031211 +0000 UTC m=+117.212511358" Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.091263 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-4rm5n" podStartSLOduration=48.091259608 podStartE2EDuration="48.091259608s" podCreationTimestamp="2026-03-19 16:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:45.029081487 +0000 UTC m=+117.151280745" watchObservedRunningTime="2026-03-19 16:41:45.091259608 +0000 UTC m=+117.213458856" Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.091990 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7hntq" event={"ID":"7d02d944-f817-4258-b5e9-70dae96b646d","Type":"ContainerStarted","Data":"dd3d9812140130986a6552c4c0f3b5bfc42464be50a8d1b6f6a2630331e3055a"} Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.118572 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-92pms" 
event={"ID":"ef12d86d-5629-4a7c-948c-9229440d073a","Type":"ContainerStarted","Data":"23bb4f5b380e985b215d753ecc1ed24041b250630b24eef23b852f799aef8ef1"} Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.118631 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-92pms" event={"ID":"ef12d86d-5629-4a7c-948c-9229440d073a","Type":"ContainerStarted","Data":"27a4673be17f4b46e7127b397583a7b022a470e8b4c11de720d2f09280f0f4f3"} Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.133118 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8shmz" podStartSLOduration=49.133097026 podStartE2EDuration="49.133097026s" podCreationTimestamp="2026-03-19 16:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:45.12815344 +0000 UTC m=+117.250352698" watchObservedRunningTime="2026-03-19 16:41:45.133097026 +0000 UTC m=+117.255296274" Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.155467 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:45 crc kubenswrapper[4918]: E0319 16:41:45.155822 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:45.655809399 +0000 UTC m=+117.778008647 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.193711 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7lqkl" event={"ID":"9ff332c4-5e3a-4d2d-a694-870559724211","Type":"ContainerStarted","Data":"2d169e0b428ce0be8e69e6a88d8b3f6db86448f2b01b4d96a61a0d55be0317e1"} Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.249102 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" event={"ID":"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9","Type":"ContainerStarted","Data":"bc165c90848f03f70dbd0cb1781fa0c040a267febffa2e4bf4ccc8b06c69470f"} Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.250133 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.256973 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:45 crc kubenswrapper[4918]: E0319 16:41:45.258488 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-03-19 16:41:45.758462346 +0000 UTC m=+117.880661594 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.259863 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-92pms" podStartSLOduration=48.259825957 podStartE2EDuration="48.259825957s" podCreationTimestamp="2026-03-19 16:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:45.153242542 +0000 UTC m=+117.275441790" watchObservedRunningTime="2026-03-19 16:41:45.259825957 +0000 UTC m=+117.382025205" Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.261253 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qcgd2"] Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.287791 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" podStartSLOduration=49.287772754 podStartE2EDuration="49.287772754s" podCreationTimestamp="2026-03-19 16:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:45.286100155 +0000 UTC m=+117.408299403" watchObservedRunningTime="2026-03-19 16:41:45.287772754 +0000 UTC m=+117.409972002" Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 
16:41:45.292821 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-td7k5" event={"ID":"5189c318-e4b1-4dd9-9a6d-284425d319cf","Type":"ContainerStarted","Data":"83292303409edfc96f98ff86345fedf468cab0b5a7f15bd0c77162802186718e"} Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.320083 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f8mlp" event={"ID":"fab39d74-56e2-462a-858d-a255438e06ef","Type":"ContainerStarted","Data":"e4671a2c475f8da8cdf856923d84b479edb66ab4ca670f9275845efe84ee62cc"} Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.320317 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f8mlp" Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.321128 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-td7k5" podStartSLOduration=49.321105761 podStartE2EDuration="49.321105761s" podCreationTimestamp="2026-03-19 16:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:45.32007498 +0000 UTC m=+117.442274238" watchObservedRunningTime="2026-03-19 16:41:45.321105761 +0000 UTC m=+117.443305009" Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.337287 4918 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-f8mlp container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.337366 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f8mlp" 
podUID="fab39d74-56e2-462a-858d-a255438e06ef" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.337570 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7zczz" event={"ID":"4ba57cc0-b212-4e46-a3b6-98fba822c17d","Type":"ContainerStarted","Data":"43026a2f5668cc48dcf44262f34a609a1b7be7f031796d6b2d8d7d481a481f01"} Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.352196 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f8mlp" podStartSLOduration=48.35217705 podStartE2EDuration="48.35217705s" podCreationTimestamp="2026-03-19 16:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:45.349710007 +0000 UTC m=+117.471909255" watchObservedRunningTime="2026-03-19 16:41:45.35217705 +0000 UTC m=+117.474376298" Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.353069 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" event={"ID":"9e8d955e-01e0-4fe0-a713-20f4e83f8cca","Type":"ContainerStarted","Data":"ec9622c043590398793620c5c55d0458277e468e32e5d5863882b5b2f491e0d9"} Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.358383 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:45 crc kubenswrapper[4918]: E0319 16:41:45.384745 4918 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:45.884719153 +0000 UTC m=+118.006918401 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.398059 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-cvm4l" event={"ID":"6756a1df-386b-4ee8-954b-bb4ae3829e58","Type":"ContainerStarted","Data":"21a278c8bef63c30020ecc29e2fc332eded175452bc54840e988f3e75da4b49d"} Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.402771 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-cvm4l" Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.415552 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565630-wz4vn" event={"ID":"fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0","Type":"ContainerStarted","Data":"a9897dabd415fe600f130cd516fe6ea7a0b7059d1bb6b99b7926a24742725fa2"} Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.423111 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-84tp9"] Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.433466 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jl5dc" 
event={"ID":"3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f","Type":"ContainerStarted","Data":"b84e021d71405d7768267cba02d62b0e2d0841c48e62235f1d9a18601cd09a73"} Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.447614 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw"] Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.447705 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pmg4f" event={"ID":"bb114c91-d63c-4b6e-927f-cb68c2dcf04f","Type":"ContainerStarted","Data":"23f5cbd11fb395fb5495c2320d46a43b0907ac4879a325cb596b58453b178de4"} Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.448542 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw" event={"ID":"7de76665-62ca-42ca-94ad-537d7789d1a1","Type":"ContainerStarted","Data":"ee8d8d074113fcb382e551ee68a15450cdb148bc818a9686e41d5a91ea3d188a"} Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.450001 4918 patch_prober.go:28] interesting pod/downloads-7954f5f757-pz4gm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.450051 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-pz4gm" podUID="3d68da4e-0dc1-4835-99f3-a5703db9288e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.469013 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:45 crc kubenswrapper[4918]: E0319 16:41:45.470606 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:45.970588524 +0000 UTC m=+118.092787762 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.510183 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7zczz" podStartSLOduration=48.510146726 podStartE2EDuration="48.510146726s" podCreationTimestamp="2026-03-19 16:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:45.414439783 +0000 UTC m=+117.536639061" watchObservedRunningTime="2026-03-19 16:41:45.510146726 +0000 UTC m=+117.632345974" Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.511423 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-cvm4l" podStartSLOduration=6.511416204 podStartE2EDuration="6.511416204s" podCreationTimestamp="2026-03-19 16:41:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:45.438263428 +0000 UTC m=+117.560462676" watchObservedRunningTime="2026-03-19 16:41:45.511416204 +0000 UTC m=+117.633615452" Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.526510 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pmg4f" podStartSLOduration=49.526486639 podStartE2EDuration="49.526486639s" podCreationTimestamp="2026-03-19 16:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:45.503761137 +0000 UTC m=+117.625960385" watchObservedRunningTime="2026-03-19 16:41:45.526486639 +0000 UTC m=+117.648685887" Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.571638 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:45 crc kubenswrapper[4918]: E0319 16:41:45.579216 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:46.079180489 +0000 UTC m=+118.201379737 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.598844 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-cvm4l" Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.675229 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:45 crc kubenswrapper[4918]: E0319 16:41:45.675721 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:46.175700006 +0000 UTC m=+118.297899254 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.770418 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-4rm5n" Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.774089 4918 patch_prober.go:28] interesting pod/router-default-5444994796-4rm5n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 16:41:45 crc kubenswrapper[4918]: [-]has-synced failed: reason withheld Mar 19 16:41:45 crc kubenswrapper[4918]: [+]process-running ok Mar 19 16:41:45 crc kubenswrapper[4918]: healthz check failed Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.774131 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4rm5n" podUID="ca6c2a92-4376-4b9b-9c73-c29ee0d09082" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.777637 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:45 crc kubenswrapper[4918]: E0319 16:41:45.778069 4918 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:46.278042465 +0000 UTC m=+118.400241713 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.802802 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.878649 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:45 crc kubenswrapper[4918]: E0319 16:41:45.879214 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:46.379192978 +0000 UTC m=+118.501392226 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:45 crc kubenswrapper[4918]: I0319 16:41:45.981066 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:45 crc kubenswrapper[4918]: E0319 16:41:45.981537 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:46.481512957 +0000 UTC m=+118.603712205 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.082969 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:46 crc kubenswrapper[4918]: E0319 16:41:46.083853 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:46.583821485 +0000 UTC m=+118.706020723 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.189355 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:46 crc kubenswrapper[4918]: E0319 16:41:46.189879 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:46.689864854 +0000 UTC m=+118.812064102 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.293355 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:46 crc kubenswrapper[4918]: E0319 16:41:46.293774 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:46.793756118 +0000 UTC m=+118.915955366 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.394704 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:46 crc kubenswrapper[4918]: E0319 16:41:46.395447 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:46.895433738 +0000 UTC m=+119.017632986 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.476091 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-cvm4l"] Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.497269 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:46 crc kubenswrapper[4918]: E0319 16:41:46.497949 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:46.997931822 +0000 UTC m=+119.120131070 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.511501 4918 generic.go:334] "Generic (PLEG): container finished" podID="9e8d955e-01e0-4fe0-a713-20f4e83f8cca" containerID="bd8d1164b7cc53a7ddb57aa8ed9c9663ce94656d591634c1532b7544b1391bac" exitCode=0 Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.512085 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" event={"ID":"9e8d955e-01e0-4fe0-a713-20f4e83f8cca","Type":"ContainerDied","Data":"bd8d1164b7cc53a7ddb57aa8ed9c9663ce94656d591634c1532b7544b1391bac"} Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.552666 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bsb9j" event={"ID":"27c10200-5439-42ac-adfe-f5ab8de8b93f","Type":"ContainerStarted","Data":"f33e702e16cfde69db39770a65a0171744165a11b9ef1dd41fca1c49b2ead33c"} Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.577757 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wvklt" event={"ID":"7a6ce6f1-2aa5-44d7-b3f2-3b062ee500c0","Type":"ContainerStarted","Data":"4b76d0f2519aeb95f9fc2ff95e1e41d77e5fce164aa52aed692fdd29f98a0f02"} Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.577807 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wvklt" 
event={"ID":"7a6ce6f1-2aa5-44d7-b3f2-3b062ee500c0","Type":"ContainerStarted","Data":"36b3f221367355e64bc1589d72597ffac6c60f8387d9513f89d260091aee0e0d"} Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.578087 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bsb9j" Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.604419 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:46 crc kubenswrapper[4918]: E0319 16:41:46.604903 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:47.104883628 +0000 UTC m=+119.227082876 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.629128 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7hntq" event={"ID":"7d02d944-f817-4258-b5e9-70dae96b646d","Type":"ContainerStarted","Data":"c3f400f35861e76ea0cb08c28f08dded4f731f9901e9b0aec0dc7264190be8b7"} Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.629185 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-clcn2" event={"ID":"6ae59c41-bf25-4189-8a98-5e920f67a6ad","Type":"ContainerStarted","Data":"2701f63c218641edacf371d287be8ffa2829f343d16ea0ed39912d91af214e4c"} Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.634826 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-nfhcd" event={"ID":"3c343416-13c0-4513-a217-c885a3490f13","Type":"ContainerStarted","Data":"0fdbb98bc948eb70ee34cbbadcb1c44eeabe3585d515f14fe4c90d386180c5f4"} Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.656824 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-84tp9" event={"ID":"ef184530-a1ee-415c-a683-0588bf7f3ffb","Type":"ContainerStarted","Data":"9ef7f33de7b163d091b1331cc9ee0e5d911d4d0eef9e2dfd0ac1b4e9925c085d"} Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.679119 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n26b6" event={"ID":"517f8bf9-4476-4ca5-a28d-09c375b891fd","Type":"ContainerStarted","Data":"108e5c944aee8b5986392652c1b639a194315ee0324e2b3106eb0ad29fc720bc"} Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.684936 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-84tp9" Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.705054 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:46 crc kubenswrapper[4918]: E0319 16:41:46.707066 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:47.207048282 +0000 UTC m=+119.329247530 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.710870 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wvklt" podStartSLOduration=49.710830663 podStartE2EDuration="49.710830663s" podCreationTimestamp="2026-03-19 16:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:46.654123655 +0000 UTC m=+118.776322903" watchObservedRunningTime="2026-03-19 16:41:46.710830663 +0000 UTC m=+118.833029911" Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.712285 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-clcn2" podStartSLOduration=49.712276256 podStartE2EDuration="49.712276256s" podCreationTimestamp="2026-03-19 16:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:46.693764008 +0000 UTC m=+118.815963276" watchObservedRunningTime="2026-03-19 16:41:46.712276256 +0000 UTC m=+118.834475504" Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.783045 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24fgt" event={"ID":"fea5d2be-19d3-4636-b55f-1254a484f36a","Type":"ContainerStarted","Data":"7492cfafe2be62c7f0d392c711c126ce6bd35b0b1b73b1a3771653a5196256f3"} Mar 19 
16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.787378 4918 patch_prober.go:28] interesting pod/router-default-5444994796-4rm5n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 16:41:46 crc kubenswrapper[4918]: [-]has-synced failed: reason withheld Mar 19 16:41:46 crc kubenswrapper[4918]: [+]process-running ok Mar 19 16:41:46 crc kubenswrapper[4918]: healthz check failed Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.787481 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4rm5n" podUID="ca6c2a92-4376-4b9b-9c73-c29ee0d09082" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.805512 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k4qc7" event={"ID":"f5b832a4-7fec-4d2f-a400-1e890bb551b4","Type":"ContainerStarted","Data":"3ad2812d254603ca69fe3bee0077005d9620b6a9600d3253e5c2fcb15a4bf12c"} Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.809858 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-k4qc7" Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.811034 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:46 crc kubenswrapper[4918]: E0319 16:41:46.812258 4918 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:47.312243575 +0000 UTC m=+119.434442823 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.844853 4918 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-k4qc7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.845446 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-k4qc7" podUID="f5b832a4-7fec-4d2f-a400-1e890bb551b4" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.863772 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-w2fmz" event={"ID":"ccf81a8c-f372-4162-81a1-a3df96e88bf1","Type":"ContainerStarted","Data":"d541df395ae3018970ae94ce36688669379b49065a49558bed0e02938bd68edd"} Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.914756 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:46 crc kubenswrapper[4918]: E0319 16:41:46.916214 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:47.416198712 +0000 UTC m=+119.538397950 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.940288 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pmg4f" event={"ID":"bb114c91-d63c-4b6e-927f-cb68c2dcf04f","Type":"ContainerStarted","Data":"903cf41f1245ef90de16f2e7f1ea67b46b36ddb23b8865f19e42113b427dd0dc"} Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.944277 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7hntq" podStartSLOduration=49.944264793 podStartE2EDuration="49.944264793s" podCreationTimestamp="2026-03-19 16:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:46.887601905 +0000 UTC m=+119.009801153" watchObservedRunningTime="2026-03-19 16:41:46.944264793 +0000 UTC 
m=+119.066464041" Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.971726 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-nfhcd" podStartSLOduration=7.9717088149999995 podStartE2EDuration="7.971708815s" podCreationTimestamp="2026-03-19 16:41:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:46.944810758 +0000 UTC m=+119.067010006" watchObservedRunningTime="2026-03-19 16:41:46.971708815 +0000 UTC m=+119.093908053" Mar 19 16:41:46 crc kubenswrapper[4918]: I0319 16:41:46.980070 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4ztb9" event={"ID":"49540bde-3719-4e5d-acf8-ee877d99f581","Type":"ContainerStarted","Data":"885f78964b9223e37af17c56772dfea9ffd334437788cea39a0a8eb79b6cdb78"} Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.013689 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fcbfm" event={"ID":"9ab8ff58-9592-4b8c-8ca4-9b08b092e785","Type":"ContainerStarted","Data":"7f67e1fb13188e1406fdd56df7a4dd3d86baf784fc4d4c572cd848c65e4aaa26"} Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.017400 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.019277 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-cvm4l" 
event={"ID":"6756a1df-386b-4ee8-954b-bb4ae3829e58","Type":"ContainerStarted","Data":"d36a50357ef3c79837144854db44d0048962d2c36307e739aea1213751b9da93"} Mar 19 16:41:47 crc kubenswrapper[4918]: E0319 16:41:47.019964 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:47.519949033 +0000 UTC m=+119.642148281 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.028780 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-w2fmz" podStartSLOduration=50.028737652 podStartE2EDuration="50.028737652s" podCreationTimestamp="2026-03-19 16:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:46.973452367 +0000 UTC m=+119.095651615" watchObservedRunningTime="2026-03-19 16:41:47.028737652 +0000 UTC m=+119.150936910" Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.070368 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qcgd2" event={"ID":"69770981-c309-4aa4-ba5a-29bf78372aae","Type":"ContainerStarted","Data":"e6eac7b347892ab8dae61e7e280dfed09597add5c8efc1d8620f317767d648f2"} Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.086283 4918 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-etcd-operator/etcd-operator-b45778765-bvkm6" event={"ID":"a440b991-4ffb-4d2e-aa90-fa5e731d9cff","Type":"ContainerStarted","Data":"40e3c32f7df5a460660fe1cb48b573d8cd7fbc55fbb41771d56a67e17bd525ba"} Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.097647 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbzd2" event={"ID":"e623c8c8-da7e-4af5-aae6-95bf127c57d7","Type":"ContainerStarted","Data":"4c9198f3aa7c48513e01a073f06eabe685254cc922f79406443d20490ad48bbe"} Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.108400 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n26b6" podStartSLOduration=51.10837271 podStartE2EDuration="51.10837271s" podCreationTimestamp="2026-03-19 16:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:47.060925525 +0000 UTC m=+119.183124773" watchObservedRunningTime="2026-03-19 16:41:47.10837271 +0000 UTC m=+119.230571958" Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.108972 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-24fgt" podStartSLOduration=50.108966518 podStartE2EDuration="50.108966518s" podCreationTimestamp="2026-03-19 16:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:47.103911348 +0000 UTC m=+119.226110606" watchObservedRunningTime="2026-03-19 16:41:47.108966518 +0000 UTC m=+119.231165766" Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.121319 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:47 crc kubenswrapper[4918]: E0319 16:41:47.121556 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:47.621511599 +0000 UTC m=+119.743710847 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.121698 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:47 crc kubenswrapper[4918]: E0319 16:41:47.123295 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:47.623283891 +0000 UTC m=+119.745483139 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.146028 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-k4qc7" podStartSLOduration=50.145985383 podStartE2EDuration="50.145985383s" podCreationTimestamp="2026-03-19 16:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:47.145497089 +0000 UTC m=+119.267696337" watchObservedRunningTime="2026-03-19 16:41:47.145985383 +0000 UTC m=+119.268184631" Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.166658 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9ft8l" event={"ID":"e7c2e54b-e056-4df4-b565-45f068f9f8da","Type":"ContainerStarted","Data":"aa082624d1ff7881c212857d32d8ef0c5603c1ed3b52dbe87b07eb9d54ea87d0"} Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.175493 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbl8z" event={"ID":"117e7cf2-e68f-423d-a312-f1d63c3b815b","Type":"ContainerStarted","Data":"d73c73cd2b98cbbb834b80edafb8e43b07c274a0a889dc69d9376ce90255c7dd"} Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.175558 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbl8z" 
event={"ID":"117e7cf2-e68f-423d-a312-f1d63c3b815b","Type":"ContainerStarted","Data":"e8442ccf90278ff385280d7c8ef2e600886e6bd0525070901dfaf0e2e1233575"} Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.225098 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:47 crc kubenswrapper[4918]: E0319 16:41:47.225369 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:47.725339222 +0000 UTC m=+119.847538470 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.229843 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7zczz" event={"ID":"4ba57cc0-b212-4e46-a3b6-98fba822c17d","Type":"ContainerStarted","Data":"3bab51653eced2aadbc102df5dc8498a7ec2de57ce248bcfed392a8152a93634"} Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.248453 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbzd2" 
podStartSLOduration=50.248431675 podStartE2EDuration="50.248431675s" podCreationTimestamp="2026-03-19 16:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:47.197983522 +0000 UTC m=+119.320182770" watchObservedRunningTime="2026-03-19 16:41:47.248431675 +0000 UTC m=+119.370630923" Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.249219 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-fcbfm" podStartSLOduration=8.249214539 podStartE2EDuration="8.249214539s" podCreationTimestamp="2026-03-19 16:41:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:47.242592402 +0000 UTC m=+119.364791650" watchObservedRunningTime="2026-03-19 16:41:47.249214539 +0000 UTC m=+119.371413777" Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.264639 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-q8kcw" event={"ID":"3964a146-319a-4095-aec8-9469ea55705d","Type":"ContainerStarted","Data":"ea4d702d71e60d3682faf6d80b1419d32810d91c27bee0f5dc5963d5c5e9c173"} Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.264713 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-q8kcw" event={"ID":"3964a146-319a-4095-aec8-9469ea55705d","Type":"ContainerStarted","Data":"7f67af5c69bbd420f0e0cb6127138ca1d4563a8b45370b9618ff0717fe1cacf6"} Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.274824 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbl8z" podStartSLOduration=51.274806676 podStartE2EDuration="51.274806676s" podCreationTimestamp="2026-03-19 16:40:56 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:47.272321492 +0000 UTC m=+119.394520740" watchObservedRunningTime="2026-03-19 16:41:47.274806676 +0000 UTC m=+119.397005924" Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.279010 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-pz8dd" event={"ID":"d9b7f6a4-5987-4b92-b063-2ddf9ad42074","Type":"ContainerStarted","Data":"4815b61cbe78aad6d745cca6650b960ebf8eee8326474d7126f4896368509f90"} Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.299758 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw" event={"ID":"7de76665-62ca-42ca-94ad-537d7789d1a1","Type":"ContainerStarted","Data":"70711dbb4567275dab6503c8ff228db982c8baef74cf6d7e628c688dcce65bc3"} Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.299923 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw" podUID="7de76665-62ca-42ca-94ad-537d7789d1a1" containerName="route-controller-manager" containerID="cri-o://70711dbb4567275dab6503c8ff228db982c8baef74cf6d7e628c688dcce65bc3" gracePeriod=30 Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.300662 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw" Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.307976 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7lqkl" event={"ID":"9ff332c4-5e3a-4d2d-a694-870559724211","Type":"ContainerStarted","Data":"2fe9935c826f03054842e8a48197b7fe565ad1cc497b68b5bdde95dc9b07f8ca"} Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.323927 4918 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9ft8l" podStartSLOduration=50.323903079 podStartE2EDuration="50.323903079s" podCreationTimestamp="2026-03-19 16:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:47.32326097 +0000 UTC m=+119.445460218" watchObservedRunningTime="2026-03-19 16:41:47.323903079 +0000 UTC m=+119.446102327" Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.328318 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.329205 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jb6n7" event={"ID":"141bb3c3-ea35-4df5-8951-2c6423d792cc","Type":"ContainerStarted","Data":"41baf72cef886661f0277c4f69ae4b39ad4c05a62f84b5a4939e22b0ba69e279"} Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.329749 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-jb6n7" Mar 19 16:41:47 crc kubenswrapper[4918]: E0319 16:41:47.329880 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:47.829862515 +0000 UTC m=+119.952061843 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.352893 4918 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-tvdlw container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": read tcp 10.217.0.2:43806->10.217.0.16:8443: read: connection reset by peer" start-of-body= Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.352959 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw" podUID="7de76665-62ca-42ca-94ad-537d7789d1a1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": read tcp 10.217.0.2:43806->10.217.0.16:8443: read: connection reset by peer" Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.381457 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw" podStartSLOduration=50.381432612 podStartE2EDuration="50.381432612s" podCreationTimestamp="2026-03-19 16:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:47.377928358 +0000 UTC m=+119.500127606" watchObservedRunningTime="2026-03-19 16:41:47.381432612 +0000 UTC m=+119.503631860" Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.391668 4918 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f8mlp" event={"ID":"fab39d74-56e2-462a-858d-a255438e06ef","Type":"ContainerStarted","Data":"cb4837cb5b5fbcb5824571dd448b9dd50de4f8af8ce7aeca43dcad2b564a57df"} Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.409505 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f8mlp" Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.412532 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.430366 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:47 crc kubenswrapper[4918]: E0319 16:41:47.432369 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:47.932345589 +0000 UTC m=+120.054544837 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.437830 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7lqkl" podStartSLOduration=50.437780269 podStartE2EDuration="50.437780269s" podCreationTimestamp="2026-03-19 16:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:47.433573834 +0000 UTC m=+119.555773082" watchObservedRunningTime="2026-03-19 16:41:47.437780269 +0000 UTC m=+119.559979517" Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.439817 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gl4jw" event={"ID":"3c005691-aa02-475b-94fd-f8a16d0f4de5","Type":"ContainerStarted","Data":"4b55c6af6d9d780531d4bf7dac75c17bf806c3bd725a4da4ee5cc86e5a6b4d35"} Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.439864 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gl4jw" event={"ID":"3c005691-aa02-475b-94fd-f8a16d0f4de5","Type":"ContainerStarted","Data":"d574a74541d68475b43fada80e8bf0659e499d04a96a3676110ec72dff5ef152"} Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.440322 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gl4jw" Mar 19 16:41:47 crc 
kubenswrapper[4918]: I0319 16:41:47.462192 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-5c6nk" event={"ID":"8848d8b1-3fa3-4d27-b9d7-803e2a884bfc","Type":"ContainerStarted","Data":"ae7a93b388972542508651fcf30ad8b810d2d98299d4c710a24b0a4edc25b369"} Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.483285 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-5c6nk" Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.523139 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565630-wz4vn" event={"ID":"fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0","Type":"ContainerStarted","Data":"d050a608d4cfc3d08f3a991c152c00b6609359ff283e54770c428e3a203f6df1"} Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.532890 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:47 crc kubenswrapper[4918]: E0319 16:41:47.535152 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:48.035129571 +0000 UTC m=+120.157328819 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.550207 4918 generic.go:334] "Generic (PLEG): container finished" podID="f2bedb86-27a0-40a4-a97c-10f1f287fc00" containerID="c6a740859350743c56c37b3a60bd3af77b5f2deedd21ed46dc85b404fab1b118" exitCode=0 Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.552384 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" event={"ID":"f2bedb86-27a0-40a4-a97c-10f1f287fc00","Type":"ContainerDied","Data":"c6a740859350743c56c37b3a60bd3af77b5f2deedd21ed46dc85b404fab1b118"} Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.601081 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8shmz" Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.601203 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vtw7s" Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.642125 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:47 crc kubenswrapper[4918]: E0319 16:41:47.643451 4918 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:48.143432206 +0000 UTC m=+120.265631454 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.744346 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:47 crc kubenswrapper[4918]: E0319 16:41:47.744889 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:48.244871099 +0000 UTC m=+120.367070347 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.786306 4918 patch_prober.go:28] interesting pod/router-default-5444994796-4rm5n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 16:41:47 crc kubenswrapper[4918]: [-]has-synced failed: reason withheld Mar 19 16:41:47 crc kubenswrapper[4918]: [+]process-running ok Mar 19 16:41:47 crc kubenswrapper[4918]: healthz check failed Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.786379 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4rm5n" podUID="ca6c2a92-4376-4b9b-9c73-c29ee0d09082" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.846684 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:47 crc kubenswrapper[4918]: E0319 16:41:47.847168 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-19 16:41:48.347145716 +0000 UTC m=+120.469344964 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.919926 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-pz8dd" podStartSLOduration=50.919899109 podStartE2EDuration="50.919899109s" podCreationTimestamp="2026-03-19 16:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:47.709644636 +0000 UTC m=+119.831843884" watchObservedRunningTime="2026-03-19 16:41:47.919899109 +0000 UTC m=+120.042098357" Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.948579 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:47 crc kubenswrapper[4918]: E0319 16:41:47.949011 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:48.448997491 +0000 UTC m=+120.571196739 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.986738 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-q8kcw" podStartSLOduration=50.973022032 podStartE2EDuration="50.973022032s" podCreationTimestamp="2026-03-19 16:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:47.922577199 +0000 UTC m=+120.044776447" watchObservedRunningTime="2026-03-19 16:41:47.973022032 +0000 UTC m=+120.095221280" Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.988229 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6hdb4"] Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.989356 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6hdb4" Mar 19 16:41:47 crc kubenswrapper[4918]: I0319 16:41:47.994751 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.028872 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6hdb4"] Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.041473 4918 ???:1] "http: TLS handshake error from 192.168.126.11:54284: no serving certificate available for the kubelet" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.055391 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.055660 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s2tf\" (UniqueName: \"kubernetes.io/projected/ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4-kube-api-access-5s2tf\") pod \"community-operators-6hdb4\" (UID: \"ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4\") " pod="openshift-marketplace/community-operators-6hdb4" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.055756 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4-catalog-content\") pod \"community-operators-6hdb4\" (UID: \"ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4\") " pod="openshift-marketplace/community-operators-6hdb4" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.055789 4918 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4-utilities\") pod \"community-operators-6hdb4\" (UID: \"ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4\") " pod="openshift-marketplace/community-operators-6hdb4" Mar 19 16:41:48 crc kubenswrapper[4918]: E0319 16:41:48.055897 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:48.555880394 +0000 UTC m=+120.678079642 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.123948 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s2zbj"] Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.148168 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s2zbj" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.160070 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.160251 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4-catalog-content\") pod \"community-operators-6hdb4\" (UID: \"ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4\") " pod="openshift-marketplace/community-operators-6hdb4" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.160301 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4-utilities\") pod \"community-operators-6hdb4\" (UID: \"ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4\") " pod="openshift-marketplace/community-operators-6hdb4" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.160336 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s2tf\" (UniqueName: \"kubernetes.io/projected/ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4-kube-api-access-5s2tf\") pod \"community-operators-6hdb4\" (UID: \"ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4\") " pod="openshift-marketplace/community-operators-6hdb4" Mar 19 16:41:48 crc kubenswrapper[4918]: E0319 16:41:48.161088 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-19 16:41:48.661075777 +0000 UTC m=+120.783275025 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.162200 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4-catalog-content\") pod \"community-operators-6hdb4\" (UID: \"ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4\") " pod="openshift-marketplace/community-operators-6hdb4" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.162437 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4-utilities\") pod \"community-operators-6hdb4\" (UID: \"ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4\") " pod="openshift-marketplace/community-operators-6hdb4" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.177671 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.192724 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s2zbj"] Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.255764 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s2tf\" (UniqueName: \"kubernetes.io/projected/ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4-kube-api-access-5s2tf\") pod \"community-operators-6hdb4\" 
(UID: \"ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4\") " pod="openshift-marketplace/community-operators-6hdb4" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.261157 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.261402 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66b9142f-4eaf-41a0-9b13-dae083686eec-catalog-content\") pod \"certified-operators-s2zbj\" (UID: \"66b9142f-4eaf-41a0-9b13-dae083686eec\") " pod="openshift-marketplace/certified-operators-s2zbj" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.261429 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66b9142f-4eaf-41a0-9b13-dae083686eec-utilities\") pod \"certified-operators-s2zbj\" (UID: \"66b9142f-4eaf-41a0-9b13-dae083686eec\") " pod="openshift-marketplace/certified-operators-s2zbj" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.261469 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8zbj\" (UniqueName: \"kubernetes.io/projected/66b9142f-4eaf-41a0-9b13-dae083686eec-kube-api-access-t8zbj\") pod \"certified-operators-s2zbj\" (UID: \"66b9142f-4eaf-41a0-9b13-dae083686eec\") " pod="openshift-marketplace/certified-operators-s2zbj" Mar 19 16:41:48 crc kubenswrapper[4918]: E0319 16:41:48.261643 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:48.761626814 +0000 UTC m=+120.883826062 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.298821 4918 ???:1] "http: TLS handshake error from 192.168.126.11:54292: no serving certificate available for the kubelet" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.321825 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29565630-wz4vn" podStartSLOduration=52.321802655 podStartE2EDuration="52.321802655s" podCreationTimestamp="2026-03-19 16:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:48.318378783 +0000 UTC m=+120.440578031" watchObservedRunningTime="2026-03-19 16:41:48.321802655 +0000 UTC m=+120.444001903" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.323144 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gl4jw" podStartSLOduration=51.323138034 podStartE2EDuration="51.323138034s" podCreationTimestamp="2026-03-19 16:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:48.2780666 +0000 UTC m=+120.400265848" watchObservedRunningTime="2026-03-19 16:41:48.323138034 +0000 UTC 
m=+120.445337282" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.328924 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gmjc7"] Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.329817 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gmjc7" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.344600 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6hdb4" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.359214 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gmjc7"] Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.366459 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.366660 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66b9142f-4eaf-41a0-9b13-dae083686eec-catalog-content\") pod \"certified-operators-s2zbj\" (UID: \"66b9142f-4eaf-41a0-9b13-dae083686eec\") " pod="openshift-marketplace/certified-operators-s2zbj" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.366722 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66b9142f-4eaf-41a0-9b13-dae083686eec-utilities\") pod \"certified-operators-s2zbj\" (UID: \"66b9142f-4eaf-41a0-9b13-dae083686eec\") " pod="openshift-marketplace/certified-operators-s2zbj" Mar 19 16:41:48 
crc kubenswrapper[4918]: I0319 16:41:48.366794 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8zbj\" (UniqueName: \"kubernetes.io/projected/66b9142f-4eaf-41a0-9b13-dae083686eec-kube-api-access-t8zbj\") pod \"certified-operators-s2zbj\" (UID: \"66b9142f-4eaf-41a0-9b13-dae083686eec\") " pod="openshift-marketplace/certified-operators-s2zbj" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.367598 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66b9142f-4eaf-41a0-9b13-dae083686eec-catalog-content\") pod \"certified-operators-s2zbj\" (UID: \"66b9142f-4eaf-41a0-9b13-dae083686eec\") " pod="openshift-marketplace/certified-operators-s2zbj" Mar 19 16:41:48 crc kubenswrapper[4918]: E0319 16:41:48.367934 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:48.86791865 +0000 UTC m=+120.990117898 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.370106 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66b9142f-4eaf-41a0-9b13-dae083686eec-utilities\") pod \"certified-operators-s2zbj\" (UID: \"66b9142f-4eaf-41a0-9b13-dae083686eec\") " pod="openshift-marketplace/certified-operators-s2zbj" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.398342 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8zbj\" (UniqueName: \"kubernetes.io/projected/66b9142f-4eaf-41a0-9b13-dae083686eec-kube-api-access-t8zbj\") pod \"certified-operators-s2zbj\" (UID: \"66b9142f-4eaf-41a0-9b13-dae083686eec\") " pod="openshift-marketplace/certified-operators-s2zbj" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.447626 4918 ???:1] "http: TLS handshake error from 192.168.126.11:54308: no serving certificate available for the kubelet" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.474038 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.474283 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/42c986e2-96e9-4f2c-9b13-8cf09b8d0480-catalog-content\") pod \"community-operators-gmjc7\" (UID: \"42c986e2-96e9-4f2c-9b13-8cf09b8d0480\") " pod="openshift-marketplace/community-operators-gmjc7" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.474330 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42c986e2-96e9-4f2c-9b13-8cf09b8d0480-utilities\") pod \"community-operators-gmjc7\" (UID: \"42c986e2-96e9-4f2c-9b13-8cf09b8d0480\") " pod="openshift-marketplace/community-operators-gmjc7" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.474412 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljjpr\" (UniqueName: \"kubernetes.io/projected/42c986e2-96e9-4f2c-9b13-8cf09b8d0480-kube-api-access-ljjpr\") pod \"community-operators-gmjc7\" (UID: \"42c986e2-96e9-4f2c-9b13-8cf09b8d0480\") " pod="openshift-marketplace/community-operators-gmjc7" Mar 19 16:41:48 crc kubenswrapper[4918]: E0319 16:41:48.474495 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:48.974476804 +0000 UTC m=+121.096676052 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.475132 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.486943 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s2zbj" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.505963 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jb6n7" podStartSLOduration=9.505939094 podStartE2EDuration="9.505939094s" podCreationTimestamp="2026-03-19 16:41:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:48.473203716 +0000 UTC m=+120.595402964" watchObservedRunningTime="2026-03-19 16:41:48.505939094 +0000 UTC m=+120.628138342" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.544783 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dh9m5"] Mar 19 16:41:48 crc kubenswrapper[4918]: E0319 16:41:48.545413 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7de76665-62ca-42ca-94ad-537d7789d1a1" containerName="route-controller-manager" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.545426 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="7de76665-62ca-42ca-94ad-537d7789d1a1" 
containerName="route-controller-manager" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.545548 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="7de76665-62ca-42ca-94ad-537d7789d1a1" containerName="route-controller-manager" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.546272 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dh9m5" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.573219 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dh9m5"] Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.579091 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7de76665-62ca-42ca-94ad-537d7789d1a1-config\") pod \"7de76665-62ca-42ca-94ad-537d7789d1a1\" (UID: \"7de76665-62ca-42ca-94ad-537d7789d1a1\") " Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.579286 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7de76665-62ca-42ca-94ad-537d7789d1a1-serving-cert\") pod \"7de76665-62ca-42ca-94ad-537d7789d1a1\" (UID: \"7de76665-62ca-42ca-94ad-537d7789d1a1\") " Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.579339 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7de76665-62ca-42ca-94ad-537d7789d1a1-client-ca\") pod \"7de76665-62ca-42ca-94ad-537d7789d1a1\" (UID: \"7de76665-62ca-42ca-94ad-537d7789d1a1\") " Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.579416 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z55k4\" (UniqueName: \"kubernetes.io/projected/7de76665-62ca-42ca-94ad-537d7789d1a1-kube-api-access-z55k4\") pod \"7de76665-62ca-42ca-94ad-537d7789d1a1\" (UID: 
\"7de76665-62ca-42ca-94ad-537d7789d1a1\") " Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.579692 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42c986e2-96e9-4f2c-9b13-8cf09b8d0480-utilities\") pod \"community-operators-gmjc7\" (UID: \"42c986e2-96e9-4f2c-9b13-8cf09b8d0480\") " pod="openshift-marketplace/community-operators-gmjc7" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.579728 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.579784 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljjpr\" (UniqueName: \"kubernetes.io/projected/42c986e2-96e9-4f2c-9b13-8cf09b8d0480-kube-api-access-ljjpr\") pod \"community-operators-gmjc7\" (UID: \"42c986e2-96e9-4f2c-9b13-8cf09b8d0480\") " pod="openshift-marketplace/community-operators-gmjc7" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.579840 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42c986e2-96e9-4f2c-9b13-8cf09b8d0480-catalog-content\") pod \"community-operators-gmjc7\" (UID: \"42c986e2-96e9-4f2c-9b13-8cf09b8d0480\") " pod="openshift-marketplace/community-operators-gmjc7" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.580748 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42c986e2-96e9-4f2c-9b13-8cf09b8d0480-catalog-content\") pod \"community-operators-gmjc7\" (UID: 
\"42c986e2-96e9-4f2c-9b13-8cf09b8d0480\") " pod="openshift-marketplace/community-operators-gmjc7" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.580824 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42c986e2-96e9-4f2c-9b13-8cf09b8d0480-utilities\") pod \"community-operators-gmjc7\" (UID: \"42c986e2-96e9-4f2c-9b13-8cf09b8d0480\") " pod="openshift-marketplace/community-operators-gmjc7" Mar 19 16:41:48 crc kubenswrapper[4918]: E0319 16:41:48.581035 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:49.081021437 +0000 UTC m=+121.203220685 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.592950 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7de76665-62ca-42ca-94ad-537d7789d1a1-client-ca" (OuterVolumeSpecName: "client-ca") pod "7de76665-62ca-42ca-94ad-537d7789d1a1" (UID: "7de76665-62ca-42ca-94ad-537d7789d1a1"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.622429 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7de76665-62ca-42ca-94ad-537d7789d1a1-kube-api-access-z55k4" (OuterVolumeSpecName: "kube-api-access-z55k4") pod "7de76665-62ca-42ca-94ad-537d7789d1a1" (UID: "7de76665-62ca-42ca-94ad-537d7789d1a1"). InnerVolumeSpecName "kube-api-access-z55k4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.629387 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7de76665-62ca-42ca-94ad-537d7789d1a1-config" (OuterVolumeSpecName: "config") pod "7de76665-62ca-42ca-94ad-537d7789d1a1" (UID: "7de76665-62ca-42ca-94ad-537d7789d1a1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.673508 4918 ???:1] "http: TLS handshake error from 192.168.126.11:54458: no serving certificate available for the kubelet" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.679682 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljjpr\" (UniqueName: \"kubernetes.io/projected/42c986e2-96e9-4f2c-9b13-8cf09b8d0480-kube-api-access-ljjpr\") pod \"community-operators-gmjc7\" (UID: \"42c986e2-96e9-4f2c-9b13-8cf09b8d0480\") " pod="openshift-marketplace/community-operators-gmjc7" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.680480 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gmjc7" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.680661 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.681105 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnjw7\" (UniqueName: \"kubernetes.io/projected/83708522-86b5-47d3-9f69-3bb7a645bb39-kube-api-access-dnjw7\") pod \"certified-operators-dh9m5\" (UID: \"83708522-86b5-47d3-9f69-3bb7a645bb39\") " pod="openshift-marketplace/certified-operators-dh9m5" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.681139 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83708522-86b5-47d3-9f69-3bb7a645bb39-utilities\") pod \"certified-operators-dh9m5\" (UID: \"83708522-86b5-47d3-9f69-3bb7a645bb39\") " pod="openshift-marketplace/certified-operators-dh9m5" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.681172 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83708522-86b5-47d3-9f69-3bb7a645bb39-catalog-content\") pod \"certified-operators-dh9m5\" (UID: \"83708522-86b5-47d3-9f69-3bb7a645bb39\") " pod="openshift-marketplace/certified-operators-dh9m5" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.681248 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z55k4\" (UniqueName: \"kubernetes.io/projected/7de76665-62ca-42ca-94ad-537d7789d1a1-kube-api-access-z55k4\") on node \"crc\" DevicePath 
\"\"" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.681266 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7de76665-62ca-42ca-94ad-537d7789d1a1-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.694847 4918 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7de76665-62ca-42ca-94ad-537d7789d1a1-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.692679 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7de76665-62ca-42ca-94ad-537d7789d1a1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7de76665-62ca-42ca-94ad-537d7789d1a1" (UID: "7de76665-62ca-42ca-94ad-537d7789d1a1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:48 crc kubenswrapper[4918]: E0319 16:41:48.704288 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:49.194864316 +0000 UTC m=+121.317063564 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.770098 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-w2fmz" event={"ID":"ccf81a8c-f372-4162-81a1-a3df96e88bf1","Type":"ContainerStarted","Data":"077eb445bb9d7ca071051b26dd9009328f77767f73038df2d7591568b662ff1f"} Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.785127 4918 ???:1] "http: TLS handshake error from 192.168.126.11:54470: no serving certificate available for the kubelet" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.803856 4918 patch_prober.go:28] interesting pod/router-default-5444994796-4rm5n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 16:41:48 crc kubenswrapper[4918]: [-]has-synced failed: reason withheld Mar 19 16:41:48 crc kubenswrapper[4918]: [+]process-running ok Mar 19 16:41:48 crc kubenswrapper[4918]: healthz check failed Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.803935 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4rm5n" podUID="ca6c2a92-4376-4b9b-9c73-c29ee0d09082" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.806459 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.812819 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnjw7\" (UniqueName: \"kubernetes.io/projected/83708522-86b5-47d3-9f69-3bb7a645bb39-kube-api-access-dnjw7\") pod \"certified-operators-dh9m5\" (UID: \"83708522-86b5-47d3-9f69-3bb7a645bb39\") " pod="openshift-marketplace/certified-operators-dh9m5" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.812943 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83708522-86b5-47d3-9f69-3bb7a645bb39-utilities\") pod \"certified-operators-dh9m5\" (UID: \"83708522-86b5-47d3-9f69-3bb7a645bb39\") " pod="openshift-marketplace/certified-operators-dh9m5" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.813014 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83708522-86b5-47d3-9f69-3bb7a645bb39-catalog-content\") pod \"certified-operators-dh9m5\" (UID: \"83708522-86b5-47d3-9f69-3bb7a645bb39\") " pod="openshift-marketplace/certified-operators-dh9m5" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.813201 4918 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7de76665-62ca-42ca-94ad-537d7789d1a1-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.814268 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83708522-86b5-47d3-9f69-3bb7a645bb39-catalog-content\") pod \"certified-operators-dh9m5\" (UID: 
\"83708522-86b5-47d3-9f69-3bb7a645bb39\") " pod="openshift-marketplace/certified-operators-dh9m5" Mar 19 16:41:48 crc kubenswrapper[4918]: E0319 16:41:48.814678 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:49.314659242 +0000 UTC m=+121.436858490 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.815321 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83708522-86b5-47d3-9f69-3bb7a645bb39-utilities\") pod \"certified-operators-dh9m5\" (UID: \"83708522-86b5-47d3-9f69-3bb7a645bb39\") " pod="openshift-marketplace/certified-operators-dh9m5" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.865269 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2"] Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.915600 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:48 crc kubenswrapper[4918]: E0319 16:41:48.917171 4918 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:49.417147565 +0000 UTC m=+121.539346813 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.934170 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnjw7\" (UniqueName: \"kubernetes.io/projected/83708522-86b5-47d3-9f69-3bb7a645bb39-kube-api-access-dnjw7\") pod \"certified-operators-dh9m5\" (UID: \"83708522-86b5-47d3-9f69-3bb7a645bb39\") " pod="openshift-marketplace/certified-operators-dh9m5" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.938184 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" event={"ID":"9e8d955e-01e0-4fe0-a713-20f4e83f8cca","Type":"ContainerStarted","Data":"e5b6baec369b52d5c9c0297e79963938ba690a7086aa764a83ae9370b3e79449"} Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.938257 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" event={"ID":"9e8d955e-01e0-4fe0-a713-20f4e83f8cca","Type":"ContainerStarted","Data":"c33bb04e2e511ccf5d2f8c3cbafe2d86da2edd36718c06782b503ec39947c05d"} Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.967999 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dh9m5" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.968512 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.969959 4918 ???:1] "http: TLS handshake error from 192.168.126.11:54480: no serving certificate available for the kubelet" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.970321 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2"] Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.999471 4918 generic.go:334] "Generic (PLEG): container finished" podID="7de76665-62ca-42ca-94ad-537d7789d1a1" containerID="70711dbb4567275dab6503c8ff228db982c8baef74cf6d7e628c688dcce65bc3" exitCode=0 Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.999591 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw" event={"ID":"7de76665-62ca-42ca-94ad-537d7789d1a1","Type":"ContainerDied","Data":"70711dbb4567275dab6503c8ff228db982c8baef74cf6d7e628c688dcce65bc3"} Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.999630 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw" event={"ID":"7de76665-62ca-42ca-94ad-537d7789d1a1","Type":"ContainerDied","Data":"ee8d8d074113fcb382e551ee68a15450cdb148bc818a9686e41d5a91ea3d188a"} Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.999663 4918 scope.go:117] "RemoveContainer" containerID="70711dbb4567275dab6503c8ff228db982c8baef74cf6d7e628c688dcce65bc3" Mar 19 16:41:48 crc kubenswrapper[4918]: I0319 16:41:48.999798 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw" Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.019016 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:49 crc kubenswrapper[4918]: E0319 16:41:49.019378 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:49.519364891 +0000 UTC m=+121.641564139 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.022630 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6hdb4"] Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.032897 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9ft8l" event={"ID":"e7c2e54b-e056-4df4-b565-45f068f9f8da","Type":"ContainerStarted","Data":"0069c41a7f357b626a74dc7b2332e356daaf466bd30563d9d498402e246602d4"} Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.069486 4918 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw"] Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.071724 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qcgd2" event={"ID":"69770981-c309-4aa4-ba5a-29bf78372aae","Type":"ContainerStarted","Data":"4f314dfba7ece7e39a6e15b56c38a01507cb84a10163651c809df9deb936a823"} Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.071773 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qcgd2" event={"ID":"69770981-c309-4aa4-ba5a-29bf78372aae","Type":"ContainerStarted","Data":"8762db8a15e35be605abe4a5207c62008e0dc605e237992cd651333972402dca"} Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.102619 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvdlw"] Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.122014 4918 scope.go:117] "RemoveContainer" containerID="70711dbb4567275dab6503c8ff228db982c8baef74cf6d7e628c688dcce65bc3" Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.122908 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.123190 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10d16695-54c5-4302-9365-7f9f057d2cb5-client-ca\") pod \"route-controller-manager-58bbccc84d-dxks2\" (UID: \"10d16695-54c5-4302-9365-7f9f057d2cb5\") " pod="openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2" Mar 19 16:41:49 crc 
kubenswrapper[4918]: I0319 16:41:49.123348 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xz4k\" (UniqueName: \"kubernetes.io/projected/10d16695-54c5-4302-9365-7f9f057d2cb5-kube-api-access-9xz4k\") pod \"route-controller-manager-58bbccc84d-dxks2\" (UID: \"10d16695-54c5-4302-9365-7f9f057d2cb5\") " pod="openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2" Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.123388 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10d16695-54c5-4302-9365-7f9f057d2cb5-serving-cert\") pod \"route-controller-manager-58bbccc84d-dxks2\" (UID: \"10d16695-54c5-4302-9365-7f9f057d2cb5\") " pod="openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2" Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.123412 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10d16695-54c5-4302-9365-7f9f057d2cb5-config\") pod \"route-controller-manager-58bbccc84d-dxks2\" (UID: \"10d16695-54c5-4302-9365-7f9f057d2cb5\") " pod="openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2" Mar 19 16:41:49 crc kubenswrapper[4918]: E0319 16:41:49.123509 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:49.623494652 +0000 UTC m=+121.745693900 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:49 crc kubenswrapper[4918]: E0319 16:41:49.139422 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70711dbb4567275dab6503c8ff228db982c8baef74cf6d7e628c688dcce65bc3\": container with ID starting with 70711dbb4567275dab6503c8ff228db982c8baef74cf6d7e628c688dcce65bc3 not found: ID does not exist" containerID="70711dbb4567275dab6503c8ff228db982c8baef74cf6d7e628c688dcce65bc3" Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.139472 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70711dbb4567275dab6503c8ff228db982c8baef74cf6d7e628c688dcce65bc3"} err="failed to get container status \"70711dbb4567275dab6503c8ff228db982c8baef74cf6d7e628c688dcce65bc3\": rpc error: code = NotFound desc = could not find container \"70711dbb4567275dab6503c8ff228db982c8baef74cf6d7e628c688dcce65bc3\": container with ID starting with 70711dbb4567275dab6503c8ff228db982c8baef74cf6d7e628c688dcce65bc3 not found: ID does not exist" Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.141350 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jl5dc" event={"ID":"3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f","Type":"ContainerStarted","Data":"06b61acc27cfed4b4c443ba664bb25b000af42c8d275b16f4349186c8f4f148a"} Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.180892 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4ztb9" event={"ID":"49540bde-3719-4e5d-acf8-ee877d99f581","Type":"ContainerStarted","Data":"6fc8eb73ebd10724fe89a8f63bc7428790a1eadb56d9d3983fea697780f949fb"} Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.193313 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-84tp9" podUID="ef184530-a1ee-415c-a683-0588bf7f3ffb" containerName="controller-manager" containerID="cri-o://9ef7f33de7b163d091b1331cc9ee0e5d911d4d0eef9e2dfd0ac1b4e9925c085d" gracePeriod=30 Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.194448 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jb6n7" event={"ID":"141bb3c3-ea35-4df5-8951-2c6423d792cc","Type":"ContainerStarted","Data":"d60af52d34ab15ac33c4064b5ba6f92df720a6688f3385c110d53f6cfc64267c"} Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.194877 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-cvm4l" podUID="6756a1df-386b-4ee8-954b-bb4ae3829e58" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://d36a50357ef3c79837144854db44d0048962d2c36307e739aea1213751b9da93" gracePeriod=30 Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.196270 4918 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-k4qc7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.196322 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-k4qc7" podUID="f5b832a4-7fec-4d2f-a400-1e890bb551b4" containerName="marketplace-operator" probeResult="failure" output="Get 
\"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.225791 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" podStartSLOduration=53.225769479 podStartE2EDuration="53.225769479s" podCreationTimestamp="2026-03-19 16:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:49.178187551 +0000 UTC m=+121.300386799" watchObservedRunningTime="2026-03-19 16:41:49.225769479 +0000 UTC m=+121.347968727" Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.226684 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qcgd2" podStartSLOduration=53.226677957 podStartE2EDuration="53.226677957s" podCreationTimestamp="2026-03-19 16:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:49.221641807 +0000 UTC m=+121.343841055" watchObservedRunningTime="2026-03-19 16:41:49.226677957 +0000 UTC m=+121.348877205" Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.226781 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xz4k\" (UniqueName: \"kubernetes.io/projected/10d16695-54c5-4302-9365-7f9f057d2cb5-kube-api-access-9xz4k\") pod \"route-controller-manager-58bbccc84d-dxks2\" (UID: \"10d16695-54c5-4302-9365-7f9f057d2cb5\") " pod="openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2" Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.226867 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10d16695-54c5-4302-9365-7f9f057d2cb5-serving-cert\") pod 
\"route-controller-manager-58bbccc84d-dxks2\" (UID: \"10d16695-54c5-4302-9365-7f9f057d2cb5\") " pod="openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2" Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.226916 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10d16695-54c5-4302-9365-7f9f057d2cb5-config\") pod \"route-controller-manager-58bbccc84d-dxks2\" (UID: \"10d16695-54c5-4302-9365-7f9f057d2cb5\") " pod="openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2" Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.226948 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.226967 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10d16695-54c5-4302-9365-7f9f057d2cb5-client-ca\") pod \"route-controller-manager-58bbccc84d-dxks2\" (UID: \"10d16695-54c5-4302-9365-7f9f057d2cb5\") " pod="openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2" Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.229015 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10d16695-54c5-4302-9365-7f9f057d2cb5-config\") pod \"route-controller-manager-58bbccc84d-dxks2\" (UID: \"10d16695-54c5-4302-9365-7f9f057d2cb5\") " pod="openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2" Mar 19 16:41:49 crc kubenswrapper[4918]: E0319 16:41:49.229491 4918 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:49.72947769 +0000 UTC m=+121.851676938 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.230179 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10d16695-54c5-4302-9365-7f9f057d2cb5-client-ca\") pod \"route-controller-manager-58bbccc84d-dxks2\" (UID: \"10d16695-54c5-4302-9365-7f9f057d2cb5\") " pod="openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2" Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.243904 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10d16695-54c5-4302-9365-7f9f057d2cb5-serving-cert\") pod \"route-controller-manager-58bbccc84d-dxks2\" (UID: \"10d16695-54c5-4302-9365-7f9f057d2cb5\") " pod="openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2" Mar 19 16:41:49 crc kubenswrapper[4918]: E0319 16:41:49.250641 4918 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83708522_86b5_47d3_9f69_3bb7a645bb39.slice\": RecentStats: unable to find data in memory cache]" Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.253036 4918 
???:1] "http: TLS handshake error from 192.168.126.11:54488: no serving certificate available for the kubelet" Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.309228 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" podStartSLOduration=52.309198949 podStartE2EDuration="52.309198949s" podCreationTimestamp="2026-03-19 16:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:49.295095902 +0000 UTC m=+121.417295140" watchObservedRunningTime="2026-03-19 16:41:49.309198949 +0000 UTC m=+121.431398197" Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.328157 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:49 crc kubenswrapper[4918]: E0319 16:41:49.328364 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:49.828339206 +0000 UTC m=+121.950538444 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.330638 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:49 crc kubenswrapper[4918]: E0319 16:41:49.334103 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:49.834085986 +0000 UTC m=+121.956285234 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.334360 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xz4k\" (UniqueName: \"kubernetes.io/projected/10d16695-54c5-4302-9365-7f9f057d2cb5-kube-api-access-9xz4k\") pod \"route-controller-manager-58bbccc84d-dxks2\" (UID: \"10d16695-54c5-4302-9365-7f9f057d2cb5\") " pod="openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2" Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.409195 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4ztb9" podStartSLOduration=52.409161107 podStartE2EDuration="52.409161107s" podCreationTimestamp="2026-03-19 16:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:49.403102968 +0000 UTC m=+121.525302216" watchObservedRunningTime="2026-03-19 16:41:49.409161107 +0000 UTC m=+121.531360355" Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.433597 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:49 crc kubenswrapper[4918]: E0319 16:41:49.441639 4918 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:49.941603188 +0000 UTC m=+122.063802436 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.541672 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:49 crc kubenswrapper[4918]: E0319 16:41:49.542127 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:50.042109442 +0000 UTC m=+122.164308690 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.623811 4918 ???:1] "http: TLS handshake error from 192.168.126.11:54500: no serving certificate available for the kubelet" Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.633975 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2" Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.644130 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:49 crc kubenswrapper[4918]: E0319 16:41:49.644555 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:50.144509623 +0000 UTC m=+122.266708871 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.649429 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s2zbj"] Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.704180 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gmjc7"] Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.745397 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:49 crc kubenswrapper[4918]: E0319 16:41:49.745851 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:50.245837023 +0000 UTC m=+122.368036271 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:49 crc kubenswrapper[4918]: W0319 16:41:49.759096 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42c986e2_96e9_4f2c_9b13_8cf09b8d0480.slice/crio-47b5d37bad3be84f8ef9d7c9d7badb34880879d7dd5e2dae1834d15d00ad3fe8 WatchSource:0}: Error finding container 47b5d37bad3be84f8ef9d7c9d7badb34880879d7dd5e2dae1834d15d00ad3fe8: Status 404 returned error can't find the container with id 47b5d37bad3be84f8ef9d7c9d7badb34880879d7dd5e2dae1834d15d00ad3fe8 Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.781569 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dh9m5"] Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.781776 4918 patch_prober.go:28] interesting pod/router-default-5444994796-4rm5n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 16:41:49 crc kubenswrapper[4918]: [-]has-synced failed: reason withheld Mar 19 16:41:49 crc kubenswrapper[4918]: [+]process-running ok Mar 19 16:41:49 crc kubenswrapper[4918]: healthz check failed Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.781822 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4rm5n" podUID="ca6c2a92-4376-4b9b-9c73-c29ee0d09082" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" 
Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.846658 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:49 crc kubenswrapper[4918]: E0319 16:41:49.847035 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:50.347020087 +0000 UTC m=+122.469219335 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.931870 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5nsgc"] Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.933436 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5nsgc" Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.937235 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.955150 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.956390 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5nsgc"] Mar 19 16:41:49 crc kubenswrapper[4918]: E0319 16:41:49.973290 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:50.473272024 +0000 UTC m=+122.595471272 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:49 crc kubenswrapper[4918]: I0319 16:41:49.981684 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-84tp9" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.056017 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7fj9\" (UniqueName: \"kubernetes.io/projected/ef184530-a1ee-415c-a683-0588bf7f3ffb-kube-api-access-r7fj9\") pod \"ef184530-a1ee-415c-a683-0588bf7f3ffb\" (UID: \"ef184530-a1ee-415c-a683-0588bf7f3ffb\") " Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.056113 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef184530-a1ee-415c-a683-0588bf7f3ffb-proxy-ca-bundles\") pod \"ef184530-a1ee-415c-a683-0588bf7f3ffb\" (UID: \"ef184530-a1ee-415c-a683-0588bf7f3ffb\") " Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.056178 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef184530-a1ee-415c-a683-0588bf7f3ffb-client-ca\") pod \"ef184530-a1ee-415c-a683-0588bf7f3ffb\" (UID: \"ef184530-a1ee-415c-a683-0588bf7f3ffb\") " Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.056198 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef184530-a1ee-415c-a683-0588bf7f3ffb-serving-cert\") pod \"ef184530-a1ee-415c-a683-0588bf7f3ffb\" (UID: \"ef184530-a1ee-415c-a683-0588bf7f3ffb\") " Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.056336 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.056386 4918 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef184530-a1ee-415c-a683-0588bf7f3ffb-config\") pod \"ef184530-a1ee-415c-a683-0588bf7f3ffb\" (UID: \"ef184530-a1ee-415c-a683-0588bf7f3ffb\") " Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.056558 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8824182c-653f-4719-87ac-38d3c9c44f12-utilities\") pod \"redhat-marketplace-5nsgc\" (UID: \"8824182c-653f-4719-87ac-38d3c9c44f12\") " pod="openshift-marketplace/redhat-marketplace-5nsgc" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.056614 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8824182c-653f-4719-87ac-38d3c9c44f12-catalog-content\") pod \"redhat-marketplace-5nsgc\" (UID: \"8824182c-653f-4719-87ac-38d3c9c44f12\") " pod="openshift-marketplace/redhat-marketplace-5nsgc" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.056674 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmx82\" (UniqueName: \"kubernetes.io/projected/8824182c-653f-4719-87ac-38d3c9c44f12-kube-api-access-rmx82\") pod \"redhat-marketplace-5nsgc\" (UID: \"8824182c-653f-4719-87ac-38d3c9c44f12\") " pod="openshift-marketplace/redhat-marketplace-5nsgc" Mar 19 16:41:50 crc kubenswrapper[4918]: E0319 16:41:50.059889 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:50.559856327 +0000 UTC m=+122.682055565 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.065394 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef184530-a1ee-415c-a683-0588bf7f3ffb-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ef184530-a1ee-415c-a683-0588bf7f3ffb" (UID: "ef184530-a1ee-415c-a683-0588bf7f3ffb"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.065405 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef184530-a1ee-415c-a683-0588bf7f3ffb-client-ca" (OuterVolumeSpecName: "client-ca") pod "ef184530-a1ee-415c-a683-0588bf7f3ffb" (UID: "ef184530-a1ee-415c-a683-0588bf7f3ffb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.065798 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef184530-a1ee-415c-a683-0588bf7f3ffb-config" (OuterVolumeSpecName: "config") pod "ef184530-a1ee-415c-a683-0588bf7f3ffb" (UID: "ef184530-a1ee-415c-a683-0588bf7f3ffb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.074015 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef184530-a1ee-415c-a683-0588bf7f3ffb-kube-api-access-r7fj9" (OuterVolumeSpecName: "kube-api-access-r7fj9") pod "ef184530-a1ee-415c-a683-0588bf7f3ffb" (UID: "ef184530-a1ee-415c-a683-0588bf7f3ffb"). InnerVolumeSpecName "kube-api-access-r7fj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.074678 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef184530-a1ee-415c-a683-0588bf7f3ffb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ef184530-a1ee-415c-a683-0588bf7f3ffb" (UID: "ef184530-a1ee-415c-a683-0588bf7f3ffb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.097316 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2"] Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.157748 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.157810 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmx82\" (UniqueName: \"kubernetes.io/projected/8824182c-653f-4719-87ac-38d3c9c44f12-kube-api-access-rmx82\") pod \"redhat-marketplace-5nsgc\" (UID: \"8824182c-653f-4719-87ac-38d3c9c44f12\") " 
pod="openshift-marketplace/redhat-marketplace-5nsgc" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.157862 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8824182c-653f-4719-87ac-38d3c9c44f12-utilities\") pod \"redhat-marketplace-5nsgc\" (UID: \"8824182c-653f-4719-87ac-38d3c9c44f12\") " pod="openshift-marketplace/redhat-marketplace-5nsgc" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.157914 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8824182c-653f-4719-87ac-38d3c9c44f12-catalog-content\") pod \"redhat-marketplace-5nsgc\" (UID: \"8824182c-653f-4719-87ac-38d3c9c44f12\") " pod="openshift-marketplace/redhat-marketplace-5nsgc" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.157956 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef184530-a1ee-415c-a683-0588bf7f3ffb-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.157969 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7fj9\" (UniqueName: \"kubernetes.io/projected/ef184530-a1ee-415c-a683-0588bf7f3ffb-kube-api-access-r7fj9\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.157978 4918 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef184530-a1ee-415c-a683-0588bf7f3ffb-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.157987 4918 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef184530-a1ee-415c-a683-0588bf7f3ffb-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.157996 4918 reconciler_common.go:293] "Volume 
detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef184530-a1ee-415c-a683-0588bf7f3ffb-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:50 crc kubenswrapper[4918]: E0319 16:41:50.158183 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:50.658159566 +0000 UTC m=+122.780358814 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.158398 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8824182c-653f-4719-87ac-38d3c9c44f12-catalog-content\") pod \"redhat-marketplace-5nsgc\" (UID: \"8824182c-653f-4719-87ac-38d3c9c44f12\") " pod="openshift-marketplace/redhat-marketplace-5nsgc" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.160295 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8824182c-653f-4719-87ac-38d3c9c44f12-utilities\") pod \"redhat-marketplace-5nsgc\" (UID: \"8824182c-653f-4719-87ac-38d3c9c44f12\") " pod="openshift-marketplace/redhat-marketplace-5nsgc" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.177411 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmx82\" (UniqueName: 
\"kubernetes.io/projected/8824182c-653f-4719-87ac-38d3c9c44f12-kube-api-access-rmx82\") pod \"redhat-marketplace-5nsgc\" (UID: \"8824182c-653f-4719-87ac-38d3c9c44f12\") " pod="openshift-marketplace/redhat-marketplace-5nsgc" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.202920 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jl5dc" event={"ID":"3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f","Type":"ContainerStarted","Data":"cfedce66c147eea4406770390631e409454a9653335ddc65fdbef774de5261c6"} Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.205333 4918 generic.go:334] "Generic (PLEG): container finished" podID="42c986e2-96e9-4f2c-9b13-8cf09b8d0480" containerID="cb7cbd1eebe92aa1eed60d4a64625db22f9a495da4e5b2f4d7c41bbd6a9549e1" exitCode=0 Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.205400 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmjc7" event={"ID":"42c986e2-96e9-4f2c-9b13-8cf09b8d0480","Type":"ContainerDied","Data":"cb7cbd1eebe92aa1eed60d4a64625db22f9a495da4e5b2f4d7c41bbd6a9549e1"} Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.205427 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmjc7" event={"ID":"42c986e2-96e9-4f2c-9b13-8cf09b8d0480","Type":"ContainerStarted","Data":"47b5d37bad3be84f8ef9d7c9d7badb34880879d7dd5e2dae1834d15d00ad3fe8"} Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.208318 4918 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.211277 4918 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.215947 4918 generic.go:334] "Generic (PLEG): container finished" 
podID="ef184530-a1ee-415c-a683-0588bf7f3ffb" containerID="9ef7f33de7b163d091b1331cc9ee0e5d911d4d0eef9e2dfd0ac1b4e9925c085d" exitCode=0 Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.216024 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-84tp9" event={"ID":"ef184530-a1ee-415c-a683-0588bf7f3ffb","Type":"ContainerDied","Data":"9ef7f33de7b163d091b1331cc9ee0e5d911d4d0eef9e2dfd0ac1b4e9925c085d"} Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.216056 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-84tp9" event={"ID":"ef184530-a1ee-415c-a683-0588bf7f3ffb","Type":"ContainerDied","Data":"f6c82dd2a3bd4cd56dce3d2098e7534abf327b7be4eba2b7f633821e85e2248e"} Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.216078 4918 scope.go:117] "RemoveContainer" containerID="9ef7f33de7b163d091b1331cc9ee0e5d911d4d0eef9e2dfd0ac1b4e9925c085d" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.216207 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-84tp9" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.220272 4918 generic.go:334] "Generic (PLEG): container finished" podID="83708522-86b5-47d3-9f69-3bb7a645bb39" containerID="d932a36415ff851986d239d5a987296f51839e1196171437055ae52e33d17f66" exitCode=0 Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.220499 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dh9m5" event={"ID":"83708522-86b5-47d3-9f69-3bb7a645bb39","Type":"ContainerDied","Data":"d932a36415ff851986d239d5a987296f51839e1196171437055ae52e33d17f66"} Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.220653 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dh9m5" event={"ID":"83708522-86b5-47d3-9f69-3bb7a645bb39","Type":"ContainerStarted","Data":"aec72e2612d36f04345b85c23b98810b0e27a47906429b9cf82861b6f1a3fb79"} Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.234055 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" event={"ID":"f2bedb86-27a0-40a4-a97c-10f1f287fc00","Type":"ContainerStarted","Data":"ae3139048d26cd8cfd0a9fc88c0a822e7b7a804778d4dda01e817d62e42907a4"} Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.236669 4918 generic.go:334] "Generic (PLEG): container finished" podID="ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4" containerID="a1184f0c2c04e5fc171b53c53d76e9a3db95a14c25fa1a687179efcec6de688e" exitCode=0 Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.236802 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6hdb4" event={"ID":"ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4","Type":"ContainerDied","Data":"a1184f0c2c04e5fc171b53c53d76e9a3db95a14c25fa1a687179efcec6de688e"} Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.236895 4918 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-6hdb4" event={"ID":"ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4","Type":"ContainerStarted","Data":"0528cfd58a2100758dab6613ae9f5506be96ade583ebefc128dda6f2ee0d8e93"} Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.238578 4918 generic.go:334] "Generic (PLEG): container finished" podID="66b9142f-4eaf-41a0-9b13-dae083686eec" containerID="86a1ea060fa744e5670c416a1a11e9ad78ab1d157a8d3f65711eac9b49e105de" exitCode=0 Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.239631 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2zbj" event={"ID":"66b9142f-4eaf-41a0-9b13-dae083686eec","Type":"ContainerDied","Data":"86a1ea060fa744e5670c416a1a11e9ad78ab1d157a8d3f65711eac9b49e105de"} Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.239827 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2zbj" event={"ID":"66b9142f-4eaf-41a0-9b13-dae083686eec","Type":"ContainerStarted","Data":"e065d697832d51a2809ef9242e62343d931ddbd8c3b365284b8a396811867840"} Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.252754 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-k4qc7" Mar 19 16:41:50 crc kubenswrapper[4918]: W0319 16:41:50.266610 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10d16695_54c5_4302_9365_7f9f057d2cb5.slice/crio-fa0ef346e09bfd23cbc623fd846ecaaa17e4de01b1cbffea4998fd26a3a98edf WatchSource:0}: Error finding container fa0ef346e09bfd23cbc623fd846ecaaa17e4de01b1cbffea4998fd26a3a98edf: Status 404 returned error can't find the container with id fa0ef346e09bfd23cbc623fd846ecaaa17e4de01b1cbffea4998fd26a3a98edf Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.270253 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:50 crc kubenswrapper[4918]: E0319 16:41:50.270708 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:50.770681927 +0000 UTC m=+122.892881175 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.271073 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:50 crc kubenswrapper[4918]: E0319 16:41:50.272706 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:50.772695326 +0000 UTC m=+122.894894574 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.291058 4918 scope.go:117] "RemoveContainer" containerID="9ef7f33de7b163d091b1331cc9ee0e5d911d4d0eef9e2dfd0ac1b4e9925c085d" Mar 19 16:41:50 crc kubenswrapper[4918]: E0319 16:41:50.294273 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ef7f33de7b163d091b1331cc9ee0e5d911d4d0eef9e2dfd0ac1b4e9925c085d\": container with ID starting with 9ef7f33de7b163d091b1331cc9ee0e5d911d4d0eef9e2dfd0ac1b4e9925c085d not found: ID does not exist" containerID="9ef7f33de7b163d091b1331cc9ee0e5d911d4d0eef9e2dfd0ac1b4e9925c085d" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.294347 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ef7f33de7b163d091b1331cc9ee0e5d911d4d0eef9e2dfd0ac1b4e9925c085d"} err="failed to get container status \"9ef7f33de7b163d091b1331cc9ee0e5d911d4d0eef9e2dfd0ac1b4e9925c085d\": rpc error: code = NotFound desc = could not find container \"9ef7f33de7b163d091b1331cc9ee0e5d911d4d0eef9e2dfd0ac1b4e9925c085d\": container with ID starting with 9ef7f33de7b163d091b1331cc9ee0e5d911d4d0eef9e2dfd0ac1b4e9925c085d not found: ID does not exist" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.311654 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t4mc2"] Mar 19 16:41:50 crc kubenswrapper[4918]: E0319 16:41:50.311876 4918 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ef184530-a1ee-415c-a683-0588bf7f3ffb" containerName="controller-manager" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.311890 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef184530-a1ee-415c-a683-0588bf7f3ffb" containerName="controller-manager" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.312019 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef184530-a1ee-415c-a683-0588bf7f3ffb" containerName="controller-manager" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.314693 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t4mc2" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.336771 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5nsgc" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.363882 4918 ???:1] "http: TLS handshake error from 192.168.126.11:54514: no serving certificate available for the kubelet" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.380074 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.380417 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c06d493-b3ec-42b0-9050-48e45aa277fe-catalog-content\") pod \"redhat-marketplace-t4mc2\" (UID: \"6c06d493-b3ec-42b0-9050-48e45aa277fe\") " pod="openshift-marketplace/redhat-marketplace-t4mc2" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.380575 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-p5hqs\" (UniqueName: \"kubernetes.io/projected/6c06d493-b3ec-42b0-9050-48e45aa277fe-kube-api-access-p5hqs\") pod \"redhat-marketplace-t4mc2\" (UID: \"6c06d493-b3ec-42b0-9050-48e45aa277fe\") " pod="openshift-marketplace/redhat-marketplace-t4mc2" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.380712 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c06d493-b3ec-42b0-9050-48e45aa277fe-utilities\") pod \"redhat-marketplace-t4mc2\" (UID: \"6c06d493-b3ec-42b0-9050-48e45aa277fe\") " pod="openshift-marketplace/redhat-marketplace-t4mc2" Mar 19 16:41:50 crc kubenswrapper[4918]: E0319 16:41:50.381772 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:50.881744964 +0000 UTC m=+123.003944212 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.406999 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t4mc2"] Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.427415 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-84tp9"] Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.427651 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-84tp9"] Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.482743 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c06d493-b3ec-42b0-9050-48e45aa277fe-catalog-content\") pod \"redhat-marketplace-t4mc2\" (UID: \"6c06d493-b3ec-42b0-9050-48e45aa277fe\") " pod="openshift-marketplace/redhat-marketplace-t4mc2" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.482810 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5hqs\" (UniqueName: \"kubernetes.io/projected/6c06d493-b3ec-42b0-9050-48e45aa277fe-kube-api-access-p5hqs\") pod \"redhat-marketplace-t4mc2\" (UID: \"6c06d493-b3ec-42b0-9050-48e45aa277fe\") " pod="openshift-marketplace/redhat-marketplace-t4mc2" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.482858 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6c06d493-b3ec-42b0-9050-48e45aa277fe-utilities\") pod \"redhat-marketplace-t4mc2\" (UID: \"6c06d493-b3ec-42b0-9050-48e45aa277fe\") " pod="openshift-marketplace/redhat-marketplace-t4mc2" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.482884 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:50 crc kubenswrapper[4918]: E0319 16:41:50.483341 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:50.983326191 +0000 UTC m=+123.105525439 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.483882 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c06d493-b3ec-42b0-9050-48e45aa277fe-catalog-content\") pod \"redhat-marketplace-t4mc2\" (UID: \"6c06d493-b3ec-42b0-9050-48e45aa277fe\") " pod="openshift-marketplace/redhat-marketplace-t4mc2" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.484115 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c06d493-b3ec-42b0-9050-48e45aa277fe-utilities\") pod \"redhat-marketplace-t4mc2\" (UID: \"6c06d493-b3ec-42b0-9050-48e45aa277fe\") " pod="openshift-marketplace/redhat-marketplace-t4mc2" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.515460 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5hqs\" (UniqueName: \"kubernetes.io/projected/6c06d493-b3ec-42b0-9050-48e45aa277fe-kube-api-access-p5hqs\") pod \"redhat-marketplace-t4mc2\" (UID: \"6c06d493-b3ec-42b0-9050-48e45aa277fe\") " pod="openshift-marketplace/redhat-marketplace-t4mc2" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.586281 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:50 crc kubenswrapper[4918]: E0319 16:41:50.586707 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:51.08669039 +0000 UTC m=+123.208889638 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.604115 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7de76665-62ca-42ca-94ad-537d7789d1a1" path="/var/lib/kubelet/pods/7de76665-62ca-42ca-94ad-537d7789d1a1/volumes" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.604901 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef184530-a1ee-415c-a683-0588bf7f3ffb" path="/var/lib/kubelet/pods/ef184530-a1ee-415c-a683-0588bf7f3ffb/volumes" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.621655 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5nsgc"] Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.654321 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t4mc2" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.688113 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:50 crc kubenswrapper[4918]: E0319 16:41:50.688501 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:51.188488413 +0000 UTC m=+123.310687661 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.777840 4918 patch_prober.go:28] interesting pod/router-default-5444994796-4rm5n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 16:41:50 crc kubenswrapper[4918]: [-]has-synced failed: reason withheld Mar 19 16:41:50 crc kubenswrapper[4918]: [+]process-running ok Mar 19 16:41:50 crc kubenswrapper[4918]: healthz check failed Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.777918 4918 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-4rm5n" podUID="ca6c2a92-4376-4b9b-9c73-c29ee0d09082" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.787628 4918 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-19T16:41:50.21131593Z","Handler":null,"Name":""} Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.790797 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:50 crc kubenswrapper[4918]: E0319 16:41:50.791021 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 16:41:51.290984116 +0000 UTC m=+123.413183364 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.791170 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:50 crc kubenswrapper[4918]: E0319 16:41:50.791671 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 16:41:51.291655237 +0000 UTC m=+123.413854475 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-khrx9" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.814356 4918 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.814398 4918 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.834441 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-77bdcb48b-dwq98"] Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.835198 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77bdcb48b-dwq98" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.837346 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.839306 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.840290 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.841248 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.841349 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.847265 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.849030 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.849570 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77bdcb48b-dwq98"] Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.892721 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 16:41:50 crc 
kubenswrapper[4918]: I0319 16:41:50.892988 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnzkv\" (UniqueName: \"kubernetes.io/projected/e58452ae-25b8-4d98-a9e1-073317c2b0f3-kube-api-access-rnzkv\") pod \"controller-manager-77bdcb48b-dwq98\" (UID: \"e58452ae-25b8-4d98-a9e1-073317c2b0f3\") " pod="openshift-controller-manager/controller-manager-77bdcb48b-dwq98" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.893026 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e58452ae-25b8-4d98-a9e1-073317c2b0f3-client-ca\") pod \"controller-manager-77bdcb48b-dwq98\" (UID: \"e58452ae-25b8-4d98-a9e1-073317c2b0f3\") " pod="openshift-controller-manager/controller-manager-77bdcb48b-dwq98" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.893313 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e58452ae-25b8-4d98-a9e1-073317c2b0f3-config\") pod \"controller-manager-77bdcb48b-dwq98\" (UID: \"e58452ae-25b8-4d98-a9e1-073317c2b0f3\") " pod="openshift-controller-manager/controller-manager-77bdcb48b-dwq98" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.893387 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e58452ae-25b8-4d98-a9e1-073317c2b0f3-serving-cert\") pod \"controller-manager-77bdcb48b-dwq98\" (UID: \"e58452ae-25b8-4d98-a9e1-073317c2b0f3\") " pod="openshift-controller-manager/controller-manager-77bdcb48b-dwq98" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.893456 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e58452ae-25b8-4d98-a9e1-073317c2b0f3-proxy-ca-bundles\") 
pod \"controller-manager-77bdcb48b-dwq98\" (UID: \"e58452ae-25b8-4d98-a9e1-073317c2b0f3\") " pod="openshift-controller-manager/controller-manager-77bdcb48b-dwq98" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.902779 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.937383 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t4mc2"] Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.996718 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e58452ae-25b8-4d98-a9e1-073317c2b0f3-config\") pod \"controller-manager-77bdcb48b-dwq98\" (UID: \"e58452ae-25b8-4d98-a9e1-073317c2b0f3\") " pod="openshift-controller-manager/controller-manager-77bdcb48b-dwq98" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.996786 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e58452ae-25b8-4d98-a9e1-073317c2b0f3-serving-cert\") pod \"controller-manager-77bdcb48b-dwq98\" (UID: \"e58452ae-25b8-4d98-a9e1-073317c2b0f3\") " pod="openshift-controller-manager/controller-manager-77bdcb48b-dwq98" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.996838 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e58452ae-25b8-4d98-a9e1-073317c2b0f3-proxy-ca-bundles\") pod \"controller-manager-77bdcb48b-dwq98\" (UID: 
\"e58452ae-25b8-4d98-a9e1-073317c2b0f3\") " pod="openshift-controller-manager/controller-manager-77bdcb48b-dwq98" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.996884 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnzkv\" (UniqueName: \"kubernetes.io/projected/e58452ae-25b8-4d98-a9e1-073317c2b0f3-kube-api-access-rnzkv\") pod \"controller-manager-77bdcb48b-dwq98\" (UID: \"e58452ae-25b8-4d98-a9e1-073317c2b0f3\") " pod="openshift-controller-manager/controller-manager-77bdcb48b-dwq98" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.996912 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e58452ae-25b8-4d98-a9e1-073317c2b0f3-client-ca\") pod \"controller-manager-77bdcb48b-dwq98\" (UID: \"e58452ae-25b8-4d98-a9e1-073317c2b0f3\") " pod="openshift-controller-manager/controller-manager-77bdcb48b-dwq98" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.996941 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.998725 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e58452ae-25b8-4d98-a9e1-073317c2b0f3-config\") pod \"controller-manager-77bdcb48b-dwq98\" (UID: \"e58452ae-25b8-4d98-a9e1-073317c2b0f3\") " pod="openshift-controller-manager/controller-manager-77bdcb48b-dwq98" Mar 19 16:41:50 crc kubenswrapper[4918]: I0319 16:41:50.999297 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/e58452ae-25b8-4d98-a9e1-073317c2b0f3-client-ca\") pod \"controller-manager-77bdcb48b-dwq98\" (UID: \"e58452ae-25b8-4d98-a9e1-073317c2b0f3\") " pod="openshift-controller-manager/controller-manager-77bdcb48b-dwq98" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.002239 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e58452ae-25b8-4d98-a9e1-073317c2b0f3-proxy-ca-bundles\") pod \"controller-manager-77bdcb48b-dwq98\" (UID: \"e58452ae-25b8-4d98-a9e1-073317c2b0f3\") " pod="openshift-controller-manager/controller-manager-77bdcb48b-dwq98" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.024080 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e58452ae-25b8-4d98-a9e1-073317c2b0f3-serving-cert\") pod \"controller-manager-77bdcb48b-dwq98\" (UID: \"e58452ae-25b8-4d98-a9e1-073317c2b0f3\") " pod="openshift-controller-manager/controller-manager-77bdcb48b-dwq98" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.027906 4918 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.027976 4918 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.037378 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnzkv\" (UniqueName: \"kubernetes.io/projected/e58452ae-25b8-4d98-a9e1-073317c2b0f3-kube-api-access-rnzkv\") pod \"controller-manager-77bdcb48b-dwq98\" (UID: \"e58452ae-25b8-4d98-a9e1-073317c2b0f3\") " pod="openshift-controller-manager/controller-manager-77bdcb48b-dwq98" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.088685 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-khrx9\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.116245 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7glsh"] Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.117555 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7glsh" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.122902 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.171661 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7glsh"] Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.200265 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a9fdbd8-ec5d-4aa7-8097-a081455a27fa-utilities\") pod \"redhat-operators-7glsh\" (UID: \"3a9fdbd8-ec5d-4aa7-8097-a081455a27fa\") " pod="openshift-marketplace/redhat-operators-7glsh" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.200773 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a9fdbd8-ec5d-4aa7-8097-a081455a27fa-catalog-content\") pod \"redhat-operators-7glsh\" (UID: \"3a9fdbd8-ec5d-4aa7-8097-a081455a27fa\") " pod="openshift-marketplace/redhat-operators-7glsh" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.200863 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nlcn\" (UniqueName: \"kubernetes.io/projected/3a9fdbd8-ec5d-4aa7-8097-a081455a27fa-kube-api-access-6nlcn\") pod \"redhat-operators-7glsh\" (UID: \"3a9fdbd8-ec5d-4aa7-8097-a081455a27fa\") " pod="openshift-marketplace/redhat-operators-7glsh" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.217559 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77bdcb48b-dwq98" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.287495 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jl5dc" event={"ID":"3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f","Type":"ContainerStarted","Data":"3f2472b770f2e586bae0a809b39fef8bd671adf3ab8160824eb7514d4123b8dd"} Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.287663 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jl5dc" event={"ID":"3376b5d5-0a1e-4370-bc45-ac3cf3c4e21f","Type":"ContainerStarted","Data":"18f573cd6f0ab21925d0832b086e88a80da4f004ac9aadea0e9c819bf29ffac7"} Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.290457 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2" event={"ID":"10d16695-54c5-4302-9365-7f9f057d2cb5","Type":"ContainerStarted","Data":"e79be5749a748cc7db23359a4221ac6a9097fcee78f378528c319b4bc797bd7b"} Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.291095 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2" event={"ID":"10d16695-54c5-4302-9365-7f9f057d2cb5","Type":"ContainerStarted","Data":"fa0ef346e09bfd23cbc623fd846ecaaa17e4de01b1cbffea4998fd26a3a98edf"} Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.291960 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.300930 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.305176 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a9fdbd8-ec5d-4aa7-8097-a081455a27fa-utilities\") pod \"redhat-operators-7glsh\" (UID: \"3a9fdbd8-ec5d-4aa7-8097-a081455a27fa\") " pod="openshift-marketplace/redhat-operators-7glsh" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.305282 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a9fdbd8-ec5d-4aa7-8097-a081455a27fa-catalog-content\") pod \"redhat-operators-7glsh\" (UID: \"3a9fdbd8-ec5d-4aa7-8097-a081455a27fa\") " pod="openshift-marketplace/redhat-operators-7glsh" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.305308 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nlcn\" (UniqueName: \"kubernetes.io/projected/3a9fdbd8-ec5d-4aa7-8097-a081455a27fa-kube-api-access-6nlcn\") pod \"redhat-operators-7glsh\" (UID: \"3a9fdbd8-ec5d-4aa7-8097-a081455a27fa\") " pod="openshift-marketplace/redhat-operators-7glsh" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.305996 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a9fdbd8-ec5d-4aa7-8097-a081455a27fa-catalog-content\") pod \"redhat-operators-7glsh\" (UID: \"3a9fdbd8-ec5d-4aa7-8097-a081455a27fa\") " pod="openshift-marketplace/redhat-operators-7glsh" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.306047 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a9fdbd8-ec5d-4aa7-8097-a081455a27fa-utilities\") pod \"redhat-operators-7glsh\" (UID: \"3a9fdbd8-ec5d-4aa7-8097-a081455a27fa\") " pod="openshift-marketplace/redhat-operators-7glsh" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.307189 4918 generic.go:334] "Generic (PLEG): container finished" podID="6c06d493-b3ec-42b0-9050-48e45aa277fe" 
containerID="6ed1d94dc41467d73a557dea5235d5c282818d91996f78f7f0929aaa40d80e4b" exitCode=0 Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.307438 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t4mc2" event={"ID":"6c06d493-b3ec-42b0-9050-48e45aa277fe","Type":"ContainerDied","Data":"6ed1d94dc41467d73a557dea5235d5c282818d91996f78f7f0929aaa40d80e4b"} Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.307488 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t4mc2" event={"ID":"6c06d493-b3ec-42b0-9050-48e45aa277fe","Type":"ContainerStarted","Data":"f5b2e5b34d3cc59bb0db0d4522700864c4d7c239252b148123c41d7255384bc0"} Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.316554 4918 generic.go:334] "Generic (PLEG): container finished" podID="8824182c-653f-4719-87ac-38d3c9c44f12" containerID="3d3d3944cac444fb95dad3f5c3e158e3a9ef47f44c33e2b09f788035ba359b30" exitCode=0 Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.316641 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5nsgc" event={"ID":"8824182c-653f-4719-87ac-38d3c9c44f12","Type":"ContainerDied","Data":"3d3d3944cac444fb95dad3f5c3e158e3a9ef47f44c33e2b09f788035ba359b30"} Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.316672 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5nsgc" event={"ID":"8824182c-653f-4719-87ac-38d3c9c44f12","Type":"ContainerStarted","Data":"ff8e4ab1a28d5742d0155fe5f5279973224bf30b08b6516a3bd9872c7136f06b"} Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.363469 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nlcn\" (UniqueName: \"kubernetes.io/projected/3a9fdbd8-ec5d-4aa7-8097-a081455a27fa-kube-api-access-6nlcn\") pod \"redhat-operators-7glsh\" (UID: \"3a9fdbd8-ec5d-4aa7-8097-a081455a27fa\") " 
pod="openshift-marketplace/redhat-operators-7glsh" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.366889 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-jl5dc" podStartSLOduration=12.366868412 podStartE2EDuration="12.366868412s" podCreationTimestamp="2026-03-19 16:41:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:51.339195513 +0000 UTC m=+123.461394771" watchObservedRunningTime="2026-03-19 16:41:51.366868412 +0000 UTC m=+123.489067660" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.384818 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2" podStartSLOduration=6.384793312 podStartE2EDuration="6.384793312s" podCreationTimestamp="2026-03-19 16:41:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:51.381224097 +0000 UTC m=+123.503423345" watchObservedRunningTime="2026-03-19 16:41:51.384793312 +0000 UTC m=+123.506992560" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.390487 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.399636 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.434592 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7glsh" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.519024 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rcmw9"] Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.520052 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rcmw9" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.543975 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rcmw9"] Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.614202 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3044e214-7f52-423c-98a6-03a05ed008a1-utilities\") pod \"redhat-operators-rcmw9\" (UID: \"3044e214-7f52-423c-98a6-03a05ed008a1\") " pod="openshift-marketplace/redhat-operators-rcmw9" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.614721 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxdh8\" (UniqueName: \"kubernetes.io/projected/3044e214-7f52-423c-98a6-03a05ed008a1-kube-api-access-dxdh8\") pod \"redhat-operators-rcmw9\" (UID: \"3044e214-7f52-423c-98a6-03a05ed008a1\") " pod="openshift-marketplace/redhat-operators-rcmw9" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.614814 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3044e214-7f52-423c-98a6-03a05ed008a1-catalog-content\") pod \"redhat-operators-rcmw9\" (UID: \"3044e214-7f52-423c-98a6-03a05ed008a1\") " pod="openshift-marketplace/redhat-operators-rcmw9" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.631706 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-77bdcb48b-dwq98"] Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.750891 4918 ???:1] "http: TLS handshake error from 192.168.126.11:54520: no serving certificate available for the kubelet" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.751475 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3044e214-7f52-423c-98a6-03a05ed008a1-utilities\") pod \"redhat-operators-rcmw9\" (UID: \"3044e214-7f52-423c-98a6-03a05ed008a1\") " pod="openshift-marketplace/redhat-operators-rcmw9" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.751556 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxdh8\" (UniqueName: \"kubernetes.io/projected/3044e214-7f52-423c-98a6-03a05ed008a1-kube-api-access-dxdh8\") pod \"redhat-operators-rcmw9\" (UID: \"3044e214-7f52-423c-98a6-03a05ed008a1\") " pod="openshift-marketplace/redhat-operators-rcmw9" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.751610 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3044e214-7f52-423c-98a6-03a05ed008a1-catalog-content\") pod \"redhat-operators-rcmw9\" (UID: \"3044e214-7f52-423c-98a6-03a05ed008a1\") " pod="openshift-marketplace/redhat-operators-rcmw9" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.763235 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-khrx9"] Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.774807 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3044e214-7f52-423c-98a6-03a05ed008a1-utilities\") pod \"redhat-operators-rcmw9\" (UID: \"3044e214-7f52-423c-98a6-03a05ed008a1\") " pod="openshift-marketplace/redhat-operators-rcmw9" Mar 19 16:41:51 crc 
kubenswrapper[4918]: I0319 16:41:51.774893 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3044e214-7f52-423c-98a6-03a05ed008a1-catalog-content\") pod \"redhat-operators-rcmw9\" (UID: \"3044e214-7f52-423c-98a6-03a05ed008a1\") " pod="openshift-marketplace/redhat-operators-rcmw9" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.778163 4918 patch_prober.go:28] interesting pod/router-default-5444994796-4rm5n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 16:41:51 crc kubenswrapper[4918]: [-]has-synced failed: reason withheld Mar 19 16:41:51 crc kubenswrapper[4918]: [+]process-running ok Mar 19 16:41:51 crc kubenswrapper[4918]: healthz check failed Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.778221 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4rm5n" podUID="ca6c2a92-4376-4b9b-9c73-c29ee0d09082" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.800806 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxdh8\" (UniqueName: \"kubernetes.io/projected/3044e214-7f52-423c-98a6-03a05ed008a1-kube-api-access-dxdh8\") pod \"redhat-operators-rcmw9\" (UID: \"3044e214-7f52-423c-98a6-03a05ed008a1\") " pod="openshift-marketplace/redhat-operators-rcmw9" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.878971 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rcmw9" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.916944 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7glsh"] Mar 19 16:41:51 crc kubenswrapper[4918]: W0319 16:41:51.970818 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a9fdbd8_ec5d_4aa7_8097_a081455a27fa.slice/crio-4286954389cf85ccaaa67dd0f87e2a031fb44302ea1354a78f1de188c1226df8 WatchSource:0}: Error finding container 4286954389cf85ccaaa67dd0f87e2a031fb44302ea1354a78f1de188c1226df8: Status 404 returned error can't find the container with id 4286954389cf85ccaaa67dd0f87e2a031fb44302ea1354a78f1de188c1226df8 Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.973383 4918 patch_prober.go:28] interesting pod/downloads-7954f5f757-pz4gm container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.973438 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-pz4gm" podUID="3d68da4e-0dc1-4835-99f3-a5703db9288e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.973702 4918 patch_prober.go:28] interesting pod/downloads-7954f5f757-pz4gm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.973732 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-pz4gm" 
podUID="3d68da4e-0dc1-4835-99f3-a5703db9288e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.988869 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.991825 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.995183 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 19 16:41:51 crc kubenswrapper[4918]: I0319 16:41:51.995883 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.003584 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.015764 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-td7k5" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.015813 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-td7k5" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.022596 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.022631 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.028600 4918 patch_prober.go:28] interesting pod/console-f9d7485db-td7k5 container/console 
namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.028646 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-td7k5" podUID="5189c318-e4b1-4dd9-9a6d-284425d319cf" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.030001 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.030021 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.038987 4918 patch_prober.go:28] interesting pod/apiserver-76f77b778f-mnwrv container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 19 16:41:52 crc kubenswrapper[4918]: [+]log ok Mar 19 16:41:52 crc kubenswrapper[4918]: [+]etcd ok Mar 19 16:41:52 crc kubenswrapper[4918]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 19 16:41:52 crc kubenswrapper[4918]: [+]poststarthook/generic-apiserver-start-informers ok Mar 19 16:41:52 crc kubenswrapper[4918]: [+]poststarthook/max-in-flight-filter ok Mar 19 16:41:52 crc kubenswrapper[4918]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 19 16:41:52 crc kubenswrapper[4918]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 19 16:41:52 crc kubenswrapper[4918]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 19 16:41:52 crc kubenswrapper[4918]: 
[-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 19 16:41:52 crc kubenswrapper[4918]: [+]poststarthook/project.openshift.io-projectcache ok Mar 19 16:41:52 crc kubenswrapper[4918]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 19 16:41:52 crc kubenswrapper[4918]: [+]poststarthook/openshift.io-startinformers ok Mar 19 16:41:52 crc kubenswrapper[4918]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 19 16:41:52 crc kubenswrapper[4918]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 19 16:41:52 crc kubenswrapper[4918]: livez check failed Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.039043 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" podUID="9e8d955e-01e0-4fe0-a713-20f4e83f8cca" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.041316 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.055935 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1344358-7a37-4064-8781-a46f8a2fcef1-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"f1344358-7a37-4064-8781-a46f8a2fcef1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.055995 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1344358-7a37-4064-8781-a46f8a2fcef1-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"f1344358-7a37-4064-8781-a46f8a2fcef1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 16:41:52 crc 
kubenswrapper[4918]: I0319 16:41:52.158513 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1344358-7a37-4064-8781-a46f8a2fcef1-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"f1344358-7a37-4064-8781-a46f8a2fcef1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.160363 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1344358-7a37-4064-8781-a46f8a2fcef1-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"f1344358-7a37-4064-8781-a46f8a2fcef1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.160473 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1344358-7a37-4064-8781-a46f8a2fcef1-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"f1344358-7a37-4064-8781-a46f8a2fcef1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.203700 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1344358-7a37-4064-8781-a46f8a2fcef1-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"f1344358-7a37-4064-8781-a46f8a2fcef1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.338646 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.362717 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7glsh" event={"ID":"3a9fdbd8-ec5d-4aa7-8097-a081455a27fa","Type":"ContainerStarted","Data":"4286954389cf85ccaaa67dd0f87e2a031fb44302ea1354a78f1de188c1226df8"} Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.367016 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" event={"ID":"e9417c6d-34fd-465b-b780-b88ee938f824","Type":"ContainerStarted","Data":"a4c65e089a7972ebe6325923ad2eef8005de36fab74abb8a5ce77f9506f3f8a8"} Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.367053 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" event={"ID":"e9417c6d-34fd-465b-b780-b88ee938f824","Type":"ContainerStarted","Data":"d5f6e5100f49d9af21f45374ef41a705d684176cd304ef438a32b718dd285378"} Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.367090 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.395125 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" podStartSLOduration=56.395102375 podStartE2EDuration="56.395102375s" podCreationTimestamp="2026-03-19 16:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:52.392446606 +0000 UTC m=+124.514645854" watchObservedRunningTime="2026-03-19 16:41:52.395102375 +0000 UTC m=+124.517301623" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.418931 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-77bdcb48b-dwq98" event={"ID":"e58452ae-25b8-4d98-a9e1-073317c2b0f3","Type":"ContainerStarted","Data":"eba824459c25f678452869559a3a0a7dc35335a81c079975a89ccff702265305"} Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.418980 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77bdcb48b-dwq98" event={"ID":"e58452ae-25b8-4d98-a9e1-073317c2b0f3","Type":"ContainerStarted","Data":"7133f861ef9d1ca6ea7024be3ed391df432cedb9d0a8c4d21a7b8e46df452efd"} Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.419895 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-77bdcb48b-dwq98" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.426939 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-77bdcb48b-dwq98" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.427227 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kspkb" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.444488 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-77bdcb48b-dwq98" podStartSLOduration=7.4444647360000005 podStartE2EDuration="7.444464736s" podCreationTimestamp="2026-03-19 16:41:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:52.442105607 +0000 UTC m=+124.564304855" watchObservedRunningTime="2026-03-19 16:41:52.444464736 +0000 UTC m=+124.566663984" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.499888 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rcmw9"] Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.706664 
4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.745542 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.746285 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.752852 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.752927 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.765230 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.779235 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-4rm5n" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.832991 4918 patch_prober.go:28] interesting pod/router-default-5444994796-4rm5n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 16:41:52 crc kubenswrapper[4918]: [-]has-synced failed: reason withheld Mar 19 16:41:52 crc kubenswrapper[4918]: [+]process-running ok Mar 19 16:41:52 crc kubenswrapper[4918]: healthz check failed Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.833067 4918 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-4rm5n" podUID="ca6c2a92-4376-4b9b-9c73-c29ee0d09082" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.889332 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67cbf560-f84a-45a7-8ab5-4ba310f9a4c9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"67cbf560-f84a-45a7-8ab5-4ba310f9a4c9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.889472 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/67cbf560-f84a-45a7-8ab5-4ba310f9a4c9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"67cbf560-f84a-45a7-8ab5-4ba310f9a4c9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.992773 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/67cbf560-f84a-45a7-8ab5-4ba310f9a4c9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"67cbf560-f84a-45a7-8ab5-4ba310f9a4c9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.992917 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67cbf560-f84a-45a7-8ab5-4ba310f9a4c9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"67cbf560-f84a-45a7-8ab5-4ba310f9a4c9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 16:41:52 crc kubenswrapper[4918]: I0319 16:41:52.995901 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/67cbf560-f84a-45a7-8ab5-4ba310f9a4c9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"67cbf560-f84a-45a7-8ab5-4ba310f9a4c9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 16:41:53 crc kubenswrapper[4918]: I0319 16:41:53.037600 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67cbf560-f84a-45a7-8ab5-4ba310f9a4c9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"67cbf560-f84a-45a7-8ab5-4ba310f9a4c9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 16:41:53 crc kubenswrapper[4918]: I0319 16:41:53.103369 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 19 16:41:53 crc kubenswrapper[4918]: I0319 16:41:53.152462 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 16:41:53 crc kubenswrapper[4918]: E0319 16:41:53.298368 4918 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d36a50357ef3c79837144854db44d0048962d2c36307e739aea1213751b9da93" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 16:41:53 crc kubenswrapper[4918]: E0319 16:41:53.304880 4918 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d36a50357ef3c79837144854db44d0048962d2c36307e739aea1213751b9da93" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 16:41:53 crc kubenswrapper[4918]: I0319 16:41:53.442181 4918 generic.go:334] "Generic (PLEG): container finished" podID="3044e214-7f52-423c-98a6-03a05ed008a1" 
containerID="f2e796adf0785bd5590b7388cfa305a7840be9acc6d030eec8f3958bfb276e6f" exitCode=0 Mar 19 16:41:53 crc kubenswrapper[4918]: I0319 16:41:53.442254 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcmw9" event={"ID":"3044e214-7f52-423c-98a6-03a05ed008a1","Type":"ContainerDied","Data":"f2e796adf0785bd5590b7388cfa305a7840be9acc6d030eec8f3958bfb276e6f"} Mar 19 16:41:53 crc kubenswrapper[4918]: I0319 16:41:53.442285 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcmw9" event={"ID":"3044e214-7f52-423c-98a6-03a05ed008a1","Type":"ContainerStarted","Data":"69f82f247a41372cd6a93e76e85b293b66b67bf3188b2b35325c53082118f29a"} Mar 19 16:41:53 crc kubenswrapper[4918]: I0319 16:41:53.446948 4918 generic.go:334] "Generic (PLEG): container finished" podID="fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0" containerID="d050a608d4cfc3d08f3a991c152c00b6609359ff283e54770c428e3a203f6df1" exitCode=0 Mar 19 16:41:53 crc kubenswrapper[4918]: I0319 16:41:53.447004 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565630-wz4vn" event={"ID":"fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0","Type":"ContainerDied","Data":"d050a608d4cfc3d08f3a991c152c00b6609359ff283e54770c428e3a203f6df1"} Mar 19 16:41:53 crc kubenswrapper[4918]: I0319 16:41:53.448673 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f1344358-7a37-4064-8781-a46f8a2fcef1","Type":"ContainerStarted","Data":"61e74ec62b761f7e3cb85594af1f5ecc03cffef43a7ac570d477256fa264d8a2"} Mar 19 16:41:53 crc kubenswrapper[4918]: I0319 16:41:53.453576 4918 generic.go:334] "Generic (PLEG): container finished" podID="3a9fdbd8-ec5d-4aa7-8097-a081455a27fa" containerID="25e01fc3acb7af8838d288d6613ef10f52abf0e5bdcb50f724e1f3ba08e25df9" exitCode=0 Mar 19 16:41:53 crc kubenswrapper[4918]: I0319 16:41:53.454329 4918 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7glsh" event={"ID":"3a9fdbd8-ec5d-4aa7-8097-a081455a27fa","Type":"ContainerDied","Data":"25e01fc3acb7af8838d288d6613ef10f52abf0e5bdcb50f724e1f3ba08e25df9"} Mar 19 16:41:53 crc kubenswrapper[4918]: E0319 16:41:53.616033 4918 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d36a50357ef3c79837144854db44d0048962d2c36307e739aea1213751b9da93" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 16:41:53 crc kubenswrapper[4918]: E0319 16:41:53.616487 4918 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-cvm4l" podUID="6756a1df-386b-4ee8-954b-bb4ae3829e58" containerName="kube-multus-additional-cni-plugins" Mar 19 16:41:53 crc kubenswrapper[4918]: I0319 16:41:53.797867 4918 patch_prober.go:28] interesting pod/router-default-5444994796-4rm5n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 16:41:53 crc kubenswrapper[4918]: [-]has-synced failed: reason withheld Mar 19 16:41:53 crc kubenswrapper[4918]: [+]process-running ok Mar 19 16:41:53 crc kubenswrapper[4918]: healthz check failed Mar 19 16:41:53 crc kubenswrapper[4918]: I0319 16:41:53.797932 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4rm5n" podUID="ca6c2a92-4376-4b9b-9c73-c29ee0d09082" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 16:41:53 crc kubenswrapper[4918]: I0319 16:41:53.971808 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 19 16:41:54 crc kubenswrapper[4918]: I0319 16:41:54.350750 4918 ???:1] "http: TLS handshake error from 192.168.126.11:54528: no serving certificate available for the kubelet" Mar 19 16:41:54 crc kubenswrapper[4918]: I0319 16:41:54.522737 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"67cbf560-f84a-45a7-8ab5-4ba310f9a4c9","Type":"ContainerStarted","Data":"099fa5a547d46cbfc79b050eb0a2941ce44d8a8a74f9409d86bdbd2eaa32eb22"} Mar 19 16:41:54 crc kubenswrapper[4918]: I0319 16:41:54.771913 4918 patch_prober.go:28] interesting pod/router-default-5444994796-4rm5n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 16:41:54 crc kubenswrapper[4918]: [-]has-synced failed: reason withheld Mar 19 16:41:54 crc kubenswrapper[4918]: [+]process-running ok Mar 19 16:41:54 crc kubenswrapper[4918]: healthz check failed Mar 19 16:41:54 crc kubenswrapper[4918]: I0319 16:41:54.772440 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4rm5n" podUID="ca6c2a92-4376-4b9b-9c73-c29ee0d09082" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 16:41:54 crc kubenswrapper[4918]: I0319 16:41:54.928963 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565630-wz4vn" Mar 19 16:41:55 crc kubenswrapper[4918]: I0319 16:41:55.042072 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0-config-volume\") pod \"fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0\" (UID: \"fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0\") " Mar 19 16:41:55 crc kubenswrapper[4918]: I0319 16:41:55.042235 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0-secret-volume\") pod \"fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0\" (UID: \"fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0\") " Mar 19 16:41:55 crc kubenswrapper[4918]: I0319 16:41:55.042265 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89j9d\" (UniqueName: \"kubernetes.io/projected/fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0-kube-api-access-89j9d\") pod \"fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0\" (UID: \"fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0\") " Mar 19 16:41:55 crc kubenswrapper[4918]: I0319 16:41:55.043312 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0-config-volume" (OuterVolumeSpecName: "config-volume") pod "fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0" (UID: "fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:41:55 crc kubenswrapper[4918]: I0319 16:41:55.052136 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0-kube-api-access-89j9d" (OuterVolumeSpecName: "kube-api-access-89j9d") pod "fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0" (UID: "fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0"). 
InnerVolumeSpecName "kube-api-access-89j9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:55 crc kubenswrapper[4918]: I0319 16:41:55.064385 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0" (UID: "fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:41:55 crc kubenswrapper[4918]: I0319 16:41:55.143799 4918 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:55 crc kubenswrapper[4918]: I0319 16:41:55.143850 4918 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:55 crc kubenswrapper[4918]: I0319 16:41:55.143863 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89j9d\" (UniqueName: \"kubernetes.io/projected/fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0-kube-api-access-89j9d\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:55 crc kubenswrapper[4918]: I0319 16:41:55.544016 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"67cbf560-f84a-45a7-8ab5-4ba310f9a4c9","Type":"ContainerStarted","Data":"6d4a51ad09825240fe8461d640aa4c6c561a5b91dbff5f52cabdb255f1281dfc"} Mar 19 16:41:55 crc kubenswrapper[4918]: I0319 16:41:55.547374 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565630-wz4vn" Mar 19 16:41:55 crc kubenswrapper[4918]: I0319 16:41:55.547363 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565630-wz4vn" event={"ID":"fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0","Type":"ContainerDied","Data":"a9897dabd415fe600f130cd516fe6ea7a0b7059d1bb6b99b7926a24742725fa2"} Mar 19 16:41:55 crc kubenswrapper[4918]: I0319 16:41:55.547425 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9897dabd415fe600f130cd516fe6ea7a0b7059d1bb6b99b7926a24742725fa2" Mar 19 16:41:55 crc kubenswrapper[4918]: I0319 16:41:55.557478 4918 generic.go:334] "Generic (PLEG): container finished" podID="f1344358-7a37-4064-8781-a46f8a2fcef1" containerID="c0e6b4bb720c2d630cc9ca9342064d655419531e6d135d0b108b88a9c53a0d98" exitCode=0 Mar 19 16:41:55 crc kubenswrapper[4918]: I0319 16:41:55.557557 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f1344358-7a37-4064-8781-a46f8a2fcef1","Type":"ContainerDied","Data":"c0e6b4bb720c2d630cc9ca9342064d655419531e6d135d0b108b88a9c53a0d98"} Mar 19 16:41:55 crc kubenswrapper[4918]: I0319 16:41:55.559856 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.559836415 podStartE2EDuration="3.559836415s" podCreationTimestamp="2026-03-19 16:41:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:55.557031532 +0000 UTC m=+127.679230780" watchObservedRunningTime="2026-03-19 16:41:55.559836415 +0000 UTC m=+127.682035663" Mar 19 16:41:55 crc kubenswrapper[4918]: I0319 16:41:55.770832 4918 patch_prober.go:28] interesting pod/router-default-5444994796-4rm5n container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 16:41:55 crc kubenswrapper[4918]: [-]has-synced failed: reason withheld Mar 19 16:41:55 crc kubenswrapper[4918]: [+]process-running ok Mar 19 16:41:55 crc kubenswrapper[4918]: healthz check failed Mar 19 16:41:55 crc kubenswrapper[4918]: I0319 16:41:55.770911 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4rm5n" podUID="ca6c2a92-4376-4b9b-9c73-c29ee0d09082" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 16:41:56 crc kubenswrapper[4918]: I0319 16:41:56.360533 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 16:41:56 crc kubenswrapper[4918]: I0319 16:41:56.360676 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 16:41:56 crc kubenswrapper[4918]: I0319 16:41:56.362857 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 19 16:41:56 crc kubenswrapper[4918]: I0319 16:41:56.363202 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 19 16:41:56 crc kubenswrapper[4918]: I0319 
16:41:56.372621 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 16:41:56 crc kubenswrapper[4918]: I0319 16:41:56.390865 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 16:41:56 crc kubenswrapper[4918]: I0319 16:41:56.462142 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 16:41:56 crc kubenswrapper[4918]: I0319 16:41:56.462265 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 16:41:56 crc kubenswrapper[4918]: I0319 16:41:56.465185 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 19 16:41:56 crc kubenswrapper[4918]: I0319 16:41:56.474977 4918 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 19 16:41:56 crc kubenswrapper[4918]: I0319 16:41:56.489432 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 16:41:56 crc kubenswrapper[4918]: I0319 16:41:56.491406 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 16:41:56 crc kubenswrapper[4918]: I0319 16:41:56.509502 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 16:41:56 crc kubenswrapper[4918]: I0319 16:41:56.524413 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 16:41:56 crc kubenswrapper[4918]: I0319 16:41:56.574952 4918 generic.go:334] "Generic (PLEG): container finished" podID="67cbf560-f84a-45a7-8ab5-4ba310f9a4c9" containerID="6d4a51ad09825240fe8461d640aa4c6c561a5b91dbff5f52cabdb255f1281dfc" exitCode=0 Mar 19 16:41:56 crc kubenswrapper[4918]: I0319 16:41:56.575058 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"67cbf560-f84a-45a7-8ab5-4ba310f9a4c9","Type":"ContainerDied","Data":"6d4a51ad09825240fe8461d640aa4c6c561a5b91dbff5f52cabdb255f1281dfc"} Mar 19 16:41:56 crc kubenswrapper[4918]: I0319 16:41:56.610010 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 16:41:56 crc kubenswrapper[4918]: I0319 16:41:56.770167 4918 patch_prober.go:28] interesting pod/router-default-5444994796-4rm5n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 16:41:56 crc kubenswrapper[4918]: [-]has-synced failed: reason withheld Mar 19 16:41:56 crc kubenswrapper[4918]: [+]process-running ok Mar 19 16:41:56 crc kubenswrapper[4918]: healthz check failed Mar 19 16:41:56 crc kubenswrapper[4918]: I0319 16:41:56.770255 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4rm5n" podUID="ca6c2a92-4376-4b9b-9c73-c29ee0d09082" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 16:41:56 crc kubenswrapper[4918]: I0319 16:41:56.991752 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 16:41:57 crc kubenswrapper[4918]: I0319 16:41:57.033234 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:57 crc kubenswrapper[4918]: I0319 16:41:57.055033 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-mnwrv" Mar 19 16:41:57 crc kubenswrapper[4918]: I0319 16:41:57.073553 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1344358-7a37-4064-8781-a46f8a2fcef1-kubelet-dir\") pod \"f1344358-7a37-4064-8781-a46f8a2fcef1\" (UID: \"f1344358-7a37-4064-8781-a46f8a2fcef1\") " Mar 19 16:41:57 crc kubenswrapper[4918]: I0319 16:41:57.073632 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1344358-7a37-4064-8781-a46f8a2fcef1-kube-api-access\") pod \"f1344358-7a37-4064-8781-a46f8a2fcef1\" (UID: \"f1344358-7a37-4064-8781-a46f8a2fcef1\") " Mar 19 16:41:57 crc kubenswrapper[4918]: I0319 16:41:57.075141 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1344358-7a37-4064-8781-a46f8a2fcef1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f1344358-7a37-4064-8781-a46f8a2fcef1" (UID: "f1344358-7a37-4064-8781-a46f8a2fcef1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:41:57 crc kubenswrapper[4918]: I0319 16:41:57.089170 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1344358-7a37-4064-8781-a46f8a2fcef1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f1344358-7a37-4064-8781-a46f8a2fcef1" (UID: "f1344358-7a37-4064-8781-a46f8a2fcef1"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:41:57 crc kubenswrapper[4918]: I0319 16:41:57.177619 4918 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1344358-7a37-4064-8781-a46f8a2fcef1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:57 crc kubenswrapper[4918]: I0319 16:41:57.177685 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1344358-7a37-4064-8781-a46f8a2fcef1-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 16:41:57 crc kubenswrapper[4918]: W0319 16:41:57.327289 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-8b81a57bc3bf836918c814d9d5334992cd65b2fddc0e29941c5c5b16ee36ea1e WatchSource:0}: Error finding container 8b81a57bc3bf836918c814d9d5334992cd65b2fddc0e29941c5c5b16ee36ea1e: Status 404 returned error can't find the container with id 8b81a57bc3bf836918c814d9d5334992cd65b2fddc0e29941c5c5b16ee36ea1e Mar 19 16:41:57 crc kubenswrapper[4918]: I0319 16:41:57.635892 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 16:41:57 crc kubenswrapper[4918]: I0319 16:41:57.635882 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f1344358-7a37-4064-8781-a46f8a2fcef1","Type":"ContainerDied","Data":"61e74ec62b761f7e3cb85594af1f5ecc03cffef43a7ac570d477256fa264d8a2"} Mar 19 16:41:57 crc kubenswrapper[4918]: I0319 16:41:57.636055 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61e74ec62b761f7e3cb85594af1f5ecc03cffef43a7ac570d477256fa264d8a2" Mar 19 16:41:57 crc kubenswrapper[4918]: I0319 16:41:57.641135 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"056e9f739a7332a8ca1f39178f6b77b9c58303a2ccca68a46882d7aca02b55a8"} Mar 19 16:41:57 crc kubenswrapper[4918]: I0319 16:41:57.652033 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9e6f12f2533a96ee87019a98640051fef450f71e360a2ba15411959e37eda047"} Mar 19 16:41:57 crc kubenswrapper[4918]: I0319 16:41:57.652099 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"7d095c4988355ebfb2c959ad256b3c9e997a417918775ab775cbb85feaf7eed9"} Mar 19 16:41:57 crc kubenswrapper[4918]: I0319 16:41:57.652396 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 16:41:57 crc kubenswrapper[4918]: I0319 16:41:57.667453 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8b81a57bc3bf836918c814d9d5334992cd65b2fddc0e29941c5c5b16ee36ea1e"} Mar 19 16:41:57 crc kubenswrapper[4918]: I0319 16:41:57.772297 4918 patch_prober.go:28] interesting pod/router-default-5444994796-4rm5n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 16:41:57 crc kubenswrapper[4918]: [-]has-synced failed: reason withheld Mar 19 16:41:57 crc kubenswrapper[4918]: [+]process-running ok Mar 19 16:41:57 crc kubenswrapper[4918]: healthz check failed Mar 19 16:41:57 crc kubenswrapper[4918]: I0319 16:41:57.772407 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4rm5n" podUID="ca6c2a92-4376-4b9b-9c73-c29ee0d09082" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 16:41:58 crc kubenswrapper[4918]: I0319 16:41:58.002851 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jb6n7" Mar 19 16:41:58 crc kubenswrapper[4918]: I0319 16:41:58.291753 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:41:58 crc kubenswrapper[4918]: I0319 16:41:58.616862 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 19 16:41:58 crc kubenswrapper[4918]: I0319 16:41:58.684499 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e9f3598f722deff39a361d126351eb4b726d2156f52cdbf43c17ccb3e58d7e62"} Mar 19 16:41:58 crc kubenswrapper[4918]: I0319 
16:41:58.689962 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2f1e15590e6199989ce2072316a9dc333479353137899992f1e3ae6b84c4a484"} Mar 19 16:41:58 crc kubenswrapper[4918]: I0319 16:41:58.718581 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=0.718555846 podStartE2EDuration="718.555846ms" podCreationTimestamp="2026-03-19 16:41:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:41:58.714545917 +0000 UTC m=+130.836745155" watchObservedRunningTime="2026-03-19 16:41:58.718555846 +0000 UTC m=+130.840755094" Mar 19 16:41:58 crc kubenswrapper[4918]: I0319 16:41:58.797027 4918 patch_prober.go:28] interesting pod/router-default-5444994796-4rm5n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 16:41:58 crc kubenswrapper[4918]: [-]has-synced failed: reason withheld Mar 19 16:41:58 crc kubenswrapper[4918]: [+]process-running ok Mar 19 16:41:58 crc kubenswrapper[4918]: healthz check failed Mar 19 16:41:58 crc kubenswrapper[4918]: I0319 16:41:58.797106 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4rm5n" podUID="ca6c2a92-4376-4b9b-9c73-c29ee0d09082" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 16:41:58 crc kubenswrapper[4918]: I0319 16:41:58.821004 4918 ???:1] "http: TLS handshake error from 192.168.126.11:37754: no serving certificate available for the kubelet" Mar 19 16:41:59 crc kubenswrapper[4918]: I0319 16:41:59.494928 4918 ???:1] "http: TLS handshake error from 
192.168.126.11:37768: no serving certificate available for the kubelet" Mar 19 16:41:59 crc kubenswrapper[4918]: I0319 16:41:59.775076 4918 patch_prober.go:28] interesting pod/router-default-5444994796-4rm5n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 16:41:59 crc kubenswrapper[4918]: [-]has-synced failed: reason withheld Mar 19 16:41:59 crc kubenswrapper[4918]: [+]process-running ok Mar 19 16:41:59 crc kubenswrapper[4918]: healthz check failed Mar 19 16:41:59 crc kubenswrapper[4918]: I0319 16:41:59.775189 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4rm5n" podUID="ca6c2a92-4376-4b9b-9c73-c29ee0d09082" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 16:42:00 crc kubenswrapper[4918]: I0319 16:42:00.133689 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565642-wjgcp"] Mar 19 16:42:00 crc kubenswrapper[4918]: E0319 16:42:00.134077 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0" containerName="collect-profiles" Mar 19 16:42:00 crc kubenswrapper[4918]: I0319 16:42:00.134100 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0" containerName="collect-profiles" Mar 19 16:42:00 crc kubenswrapper[4918]: E0319 16:42:00.134109 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1344358-7a37-4064-8781-a46f8a2fcef1" containerName="pruner" Mar 19 16:42:00 crc kubenswrapper[4918]: I0319 16:42:00.134117 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1344358-7a37-4064-8781-a46f8a2fcef1" containerName="pruner" Mar 19 16:42:00 crc kubenswrapper[4918]: I0319 16:42:00.134284 4918 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0" containerName="collect-profiles" Mar 19 16:42:00 crc kubenswrapper[4918]: I0319 16:42:00.134298 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1344358-7a37-4064-8781-a46f8a2fcef1" containerName="pruner" Mar 19 16:42:00 crc kubenswrapper[4918]: I0319 16:42:00.134884 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565642-wjgcp" Mar 19 16:42:00 crc kubenswrapper[4918]: I0319 16:42:00.191966 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 16:42:00 crc kubenswrapper[4918]: I0319 16:42:00.192362 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 16:42:00 crc kubenswrapper[4918]: I0319 16:42:00.192987 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 16:42:00 crc kubenswrapper[4918]: I0319 16:42:00.210839 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565642-wjgcp"] Mar 19 16:42:00 crc kubenswrapper[4918]: I0319 16:42:00.253797 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q6t2\" (UniqueName: \"kubernetes.io/projected/490c710f-78b8-41a4-b4bc-4eeffdde7a5d-kube-api-access-9q6t2\") pod \"auto-csr-approver-29565642-wjgcp\" (UID: \"490c710f-78b8-41a4-b4bc-4eeffdde7a5d\") " pod="openshift-infra/auto-csr-approver-29565642-wjgcp" Mar 19 16:42:00 crc kubenswrapper[4918]: I0319 16:42:00.355654 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q6t2\" (UniqueName: \"kubernetes.io/projected/490c710f-78b8-41a4-b4bc-4eeffdde7a5d-kube-api-access-9q6t2\") pod \"auto-csr-approver-29565642-wjgcp\" (UID: \"490c710f-78b8-41a4-b4bc-4eeffdde7a5d\") " 
pod="openshift-infra/auto-csr-approver-29565642-wjgcp" Mar 19 16:42:00 crc kubenswrapper[4918]: I0319 16:42:00.390727 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q6t2\" (UniqueName: \"kubernetes.io/projected/490c710f-78b8-41a4-b4bc-4eeffdde7a5d-kube-api-access-9q6t2\") pod \"auto-csr-approver-29565642-wjgcp\" (UID: \"490c710f-78b8-41a4-b4bc-4eeffdde7a5d\") " pod="openshift-infra/auto-csr-approver-29565642-wjgcp" Mar 19 16:42:00 crc kubenswrapper[4918]: I0319 16:42:00.495231 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565642-wjgcp" Mar 19 16:42:00 crc kubenswrapper[4918]: I0319 16:42:00.770974 4918 patch_prober.go:28] interesting pod/router-default-5444994796-4rm5n container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 16:42:00 crc kubenswrapper[4918]: [-]has-synced failed: reason withheld Mar 19 16:42:00 crc kubenswrapper[4918]: [+]process-running ok Mar 19 16:42:00 crc kubenswrapper[4918]: healthz check failed Mar 19 16:42:00 crc kubenswrapper[4918]: I0319 16:42:00.771052 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4rm5n" podUID="ca6c2a92-4376-4b9b-9c73-c29ee0d09082" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 16:42:01 crc kubenswrapper[4918]: I0319 16:42:01.774679 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-4rm5n" Mar 19 16:42:01 crc kubenswrapper[4918]: I0319 16:42:01.778555 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-4rm5n" Mar 19 16:42:01 crc kubenswrapper[4918]: I0319 16:42:01.987404 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-console/downloads-7954f5f757-pz4gm" Mar 19 16:42:02 crc kubenswrapper[4918]: I0319 16:42:02.011884 4918 patch_prober.go:28] interesting pod/console-f9d7485db-td7k5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 19 16:42:02 crc kubenswrapper[4918]: I0319 16:42:02.011945 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-td7k5" podUID="5189c318-e4b1-4dd9-9a6d-284425d319cf" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 19 16:42:03 crc kubenswrapper[4918]: E0319 16:42:03.300255 4918 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d36a50357ef3c79837144854db44d0048962d2c36307e739aea1213751b9da93" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 16:42:03 crc kubenswrapper[4918]: E0319 16:42:03.305537 4918 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d36a50357ef3c79837144854db44d0048962d2c36307e739aea1213751b9da93" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 16:42:03 crc kubenswrapper[4918]: E0319 16:42:03.315318 4918 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d36a50357ef3c79837144854db44d0048962d2c36307e739aea1213751b9da93" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 16:42:03 crc kubenswrapper[4918]: E0319 16:42:03.315381 4918 prober.go:104] 
"Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-cvm4l" podUID="6756a1df-386b-4ee8-954b-bb4ae3829e58" containerName="kube-multus-additional-cni-plugins" Mar 19 16:42:04 crc kubenswrapper[4918]: I0319 16:42:04.983390 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77bdcb48b-dwq98"] Mar 19 16:42:04 crc kubenswrapper[4918]: I0319 16:42:04.983659 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-77bdcb48b-dwq98" podUID="e58452ae-25b8-4d98-a9e1-073317c2b0f3" containerName="controller-manager" containerID="cri-o://eba824459c25f678452869559a3a0a7dc35335a81c079975a89ccff702265305" gracePeriod=30 Mar 19 16:42:05 crc kubenswrapper[4918]: I0319 16:42:05.006141 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2"] Mar 19 16:42:05 crc kubenswrapper[4918]: I0319 16:42:05.006444 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2" podUID="10d16695-54c5-4302-9365-7f9f057d2cb5" containerName="route-controller-manager" containerID="cri-o://e79be5749a748cc7db23359a4221ac6a9097fcee78f378528c319b4bc797bd7b" gracePeriod=30 Mar 19 16:42:05 crc kubenswrapper[4918]: I0319 16:42:05.798368 4918 generic.go:334] "Generic (PLEG): container finished" podID="e58452ae-25b8-4d98-a9e1-073317c2b0f3" containerID="eba824459c25f678452869559a3a0a7dc35335a81c079975a89ccff702265305" exitCode=0 Mar 19 16:42:05 crc kubenswrapper[4918]: I0319 16:42:05.798594 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77bdcb48b-dwq98" 
event={"ID":"e58452ae-25b8-4d98-a9e1-073317c2b0f3","Type":"ContainerDied","Data":"eba824459c25f678452869559a3a0a7dc35335a81c079975a89ccff702265305"} Mar 19 16:42:05 crc kubenswrapper[4918]: I0319 16:42:05.803019 4918 generic.go:334] "Generic (PLEG): container finished" podID="10d16695-54c5-4302-9365-7f9f057d2cb5" containerID="e79be5749a748cc7db23359a4221ac6a9097fcee78f378528c319b4bc797bd7b" exitCode=0 Mar 19 16:42:05 crc kubenswrapper[4918]: I0319 16:42:05.803073 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2" event={"ID":"10d16695-54c5-4302-9365-7f9f057d2cb5","Type":"ContainerDied","Data":"e79be5749a748cc7db23359a4221ac6a9097fcee78f378528c319b4bc797bd7b"} Mar 19 16:42:09 crc kubenswrapper[4918]: I0319 16:42:09.636428 4918 patch_prober.go:28] interesting pod/route-controller-manager-58bbccc84d-dxks2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.48:8443/healthz\": dial tcp 10.217.0.48:8443: connect: connection refused" start-of-body= Mar 19 16:42:09 crc kubenswrapper[4918]: I0319 16:42:09.636888 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2" podUID="10d16695-54c5-4302-9365-7f9f057d2cb5" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.48:8443/healthz\": dial tcp 10.217.0.48:8443: connect: connection refused" Mar 19 16:42:11 crc kubenswrapper[4918]: I0319 16:42:11.408361 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:42:11 crc kubenswrapper[4918]: I0319 16:42:11.600912 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 19 16:42:12 crc kubenswrapper[4918]: I0319 16:42:12.009963 
4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-td7k5" Mar 19 16:42:12 crc kubenswrapper[4918]: I0319 16:42:12.013756 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-td7k5" Mar 19 16:42:12 crc kubenswrapper[4918]: I0319 16:42:12.080131 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=1.080105579 podStartE2EDuration="1.080105579s" podCreationTimestamp="2026-03-19 16:42:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:42:12.049864289 +0000 UTC m=+144.172063537" watchObservedRunningTime="2026-03-19 16:42:12.080105579 +0000 UTC m=+144.202304827" Mar 19 16:42:12 crc kubenswrapper[4918]: I0319 16:42:12.218938 4918 patch_prober.go:28] interesting pod/controller-manager-77bdcb48b-dwq98 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 16:42:12 crc kubenswrapper[4918]: I0319 16:42:12.219019 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-77bdcb48b-dwq98" podUID="e58452ae-25b8-4d98-a9e1-073317c2b0f3" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 16:42:13 crc kubenswrapper[4918]: E0319 16:42:13.295436 4918 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , 
stderr: , exit code -1" containerID="d36a50357ef3c79837144854db44d0048962d2c36307e739aea1213751b9da93" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 16:42:13 crc kubenswrapper[4918]: E0319 16:42:13.298247 4918 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d36a50357ef3c79837144854db44d0048962d2c36307e739aea1213751b9da93" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 16:42:13 crc kubenswrapper[4918]: E0319 16:42:13.300373 4918 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d36a50357ef3c79837144854db44d0048962d2c36307e739aea1213751b9da93" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 16:42:13 crc kubenswrapper[4918]: E0319 16:42:13.300436 4918 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-cvm4l" podUID="6756a1df-386b-4ee8-954b-bb4ae3829e58" containerName="kube-multus-additional-cni-plugins" Mar 19 16:42:15 crc kubenswrapper[4918]: I0319 16:42:15.739267 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 16:42:15 crc kubenswrapper[4918]: I0319 16:42:15.811088 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67cbf560-f84a-45a7-8ab5-4ba310f9a4c9-kube-api-access\") pod \"67cbf560-f84a-45a7-8ab5-4ba310f9a4c9\" (UID: \"67cbf560-f84a-45a7-8ab5-4ba310f9a4c9\") " Mar 19 16:42:15 crc kubenswrapper[4918]: I0319 16:42:15.811508 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/67cbf560-f84a-45a7-8ab5-4ba310f9a4c9-kubelet-dir\") pod \"67cbf560-f84a-45a7-8ab5-4ba310f9a4c9\" (UID: \"67cbf560-f84a-45a7-8ab5-4ba310f9a4c9\") " Mar 19 16:42:15 crc kubenswrapper[4918]: I0319 16:42:15.811688 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67cbf560-f84a-45a7-8ab5-4ba310f9a4c9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "67cbf560-f84a-45a7-8ab5-4ba310f9a4c9" (UID: "67cbf560-f84a-45a7-8ab5-4ba310f9a4c9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:42:15 crc kubenswrapper[4918]: I0319 16:42:15.818821 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67cbf560-f84a-45a7-8ab5-4ba310f9a4c9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "67cbf560-f84a-45a7-8ab5-4ba310f9a4c9" (UID: "67cbf560-f84a-45a7-8ab5-4ba310f9a4c9"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:42:15 crc kubenswrapper[4918]: I0319 16:42:15.900549 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"67cbf560-f84a-45a7-8ab5-4ba310f9a4c9","Type":"ContainerDied","Data":"099fa5a547d46cbfc79b050eb0a2941ce44d8a8a74f9409d86bdbd2eaa32eb22"} Mar 19 16:42:15 crc kubenswrapper[4918]: I0319 16:42:15.900610 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="099fa5a547d46cbfc79b050eb0a2941ce44d8a8a74f9409d86bdbd2eaa32eb22" Mar 19 16:42:15 crc kubenswrapper[4918]: I0319 16:42:15.900590 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 16:42:15 crc kubenswrapper[4918]: I0319 16:42:15.915201 4918 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/67cbf560-f84a-45a7-8ab5-4ba310f9a4c9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 16:42:15 crc kubenswrapper[4918]: I0319 16:42:15.915251 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67cbf560-f84a-45a7-8ab5-4ba310f9a4c9-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 16:42:17 crc kubenswrapper[4918]: I0319 16:42:17.609857 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.019887 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77bdcb48b-dwq98" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.037446 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=1.037423811 podStartE2EDuration="1.037423811s" podCreationTimestamp="2026-03-19 16:42:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:42:18.035411205 +0000 UTC m=+150.157610503" watchObservedRunningTime="2026-03-19 16:42:18.037423811 +0000 UTC m=+150.159623069" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.071752 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7f7546f76c-c92tn"] Mar 19 16:42:18 crc kubenswrapper[4918]: E0319 16:42:18.072101 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e58452ae-25b8-4d98-a9e1-073317c2b0f3" containerName="controller-manager" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.072120 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="e58452ae-25b8-4d98-a9e1-073317c2b0f3" containerName="controller-manager" Mar 19 16:42:18 crc kubenswrapper[4918]: E0319 16:42:18.072139 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67cbf560-f84a-45a7-8ab5-4ba310f9a4c9" containerName="pruner" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.072147 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="67cbf560-f84a-45a7-8ab5-4ba310f9a4c9" containerName="pruner" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.072280 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="67cbf560-f84a-45a7-8ab5-4ba310f9a4c9" containerName="pruner" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.072299 4918 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e58452ae-25b8-4d98-a9e1-073317c2b0f3" containerName="controller-manager" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.072814 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f7546f76c-c92tn" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.086968 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f7546f76c-c92tn"] Mar 19 16:42:18 crc kubenswrapper[4918]: E0319 16:42:18.133373 4918 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 19 16:42:18 crc kubenswrapper[4918]: E0319 16:42:18.133647 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5s2tf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-6hdb4_openshift-marketplace(ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 16:42:18 crc kubenswrapper[4918]: E0319 16:42:18.134879 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-6hdb4" podUID="ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4" Mar 19 16:42:18 crc 
kubenswrapper[4918]: I0319 16:42:18.150052 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e58452ae-25b8-4d98-a9e1-073317c2b0f3-proxy-ca-bundles\") pod \"e58452ae-25b8-4d98-a9e1-073317c2b0f3\" (UID: \"e58452ae-25b8-4d98-a9e1-073317c2b0f3\") " Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.150131 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e58452ae-25b8-4d98-a9e1-073317c2b0f3-config\") pod \"e58452ae-25b8-4d98-a9e1-073317c2b0f3\" (UID: \"e58452ae-25b8-4d98-a9e1-073317c2b0f3\") " Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.150198 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e58452ae-25b8-4d98-a9e1-073317c2b0f3-serving-cert\") pod \"e58452ae-25b8-4d98-a9e1-073317c2b0f3\" (UID: \"e58452ae-25b8-4d98-a9e1-073317c2b0f3\") " Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.150217 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e58452ae-25b8-4d98-a9e1-073317c2b0f3-client-ca\") pod \"e58452ae-25b8-4d98-a9e1-073317c2b0f3\" (UID: \"e58452ae-25b8-4d98-a9e1-073317c2b0f3\") " Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.150243 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnzkv\" (UniqueName: \"kubernetes.io/projected/e58452ae-25b8-4d98-a9e1-073317c2b0f3-kube-api-access-rnzkv\") pod \"e58452ae-25b8-4d98-a9e1-073317c2b0f3\" (UID: \"e58452ae-25b8-4d98-a9e1-073317c2b0f3\") " Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.150503 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/480c4634-3020-4278-89d8-2ae564002765-serving-cert\") pod \"controller-manager-7f7546f76c-c92tn\" (UID: \"480c4634-3020-4278-89d8-2ae564002765\") " pod="openshift-controller-manager/controller-manager-7f7546f76c-c92tn" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.150566 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/480c4634-3020-4278-89d8-2ae564002765-config\") pod \"controller-manager-7f7546f76c-c92tn\" (UID: \"480c4634-3020-4278-89d8-2ae564002765\") " pod="openshift-controller-manager/controller-manager-7f7546f76c-c92tn" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.150607 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrl7k\" (UniqueName: \"kubernetes.io/projected/480c4634-3020-4278-89d8-2ae564002765-kube-api-access-nrl7k\") pod \"controller-manager-7f7546f76c-c92tn\" (UID: \"480c4634-3020-4278-89d8-2ae564002765\") " pod="openshift-controller-manager/controller-manager-7f7546f76c-c92tn" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.150720 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/480c4634-3020-4278-89d8-2ae564002765-client-ca\") pod \"controller-manager-7f7546f76c-c92tn\" (UID: \"480c4634-3020-4278-89d8-2ae564002765\") " pod="openshift-controller-manager/controller-manager-7f7546f76c-c92tn" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.150885 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/480c4634-3020-4278-89d8-2ae564002765-proxy-ca-bundles\") pod \"controller-manager-7f7546f76c-c92tn\" (UID: \"480c4634-3020-4278-89d8-2ae564002765\") " pod="openshift-controller-manager/controller-manager-7f7546f76c-c92tn" 
Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.151720 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e58452ae-25b8-4d98-a9e1-073317c2b0f3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e58452ae-25b8-4d98-a9e1-073317c2b0f3" (UID: "e58452ae-25b8-4d98-a9e1-073317c2b0f3"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.151873 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e58452ae-25b8-4d98-a9e1-073317c2b0f3-client-ca" (OuterVolumeSpecName: "client-ca") pod "e58452ae-25b8-4d98-a9e1-073317c2b0f3" (UID: "e58452ae-25b8-4d98-a9e1-073317c2b0f3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.151899 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e58452ae-25b8-4d98-a9e1-073317c2b0f3-config" (OuterVolumeSpecName: "config") pod "e58452ae-25b8-4d98-a9e1-073317c2b0f3" (UID: "e58452ae-25b8-4d98-a9e1-073317c2b0f3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.155467 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e58452ae-25b8-4d98-a9e1-073317c2b0f3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e58452ae-25b8-4d98-a9e1-073317c2b0f3" (UID: "e58452ae-25b8-4d98-a9e1-073317c2b0f3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.174691 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e58452ae-25b8-4d98-a9e1-073317c2b0f3-kube-api-access-rnzkv" (OuterVolumeSpecName: "kube-api-access-rnzkv") pod "e58452ae-25b8-4d98-a9e1-073317c2b0f3" (UID: "e58452ae-25b8-4d98-a9e1-073317c2b0f3"). InnerVolumeSpecName "kube-api-access-rnzkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.252247 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/480c4634-3020-4278-89d8-2ae564002765-client-ca\") pod \"controller-manager-7f7546f76c-c92tn\" (UID: \"480c4634-3020-4278-89d8-2ae564002765\") " pod="openshift-controller-manager/controller-manager-7f7546f76c-c92tn" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.252310 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/480c4634-3020-4278-89d8-2ae564002765-proxy-ca-bundles\") pod \"controller-manager-7f7546f76c-c92tn\" (UID: \"480c4634-3020-4278-89d8-2ae564002765\") " pod="openshift-controller-manager/controller-manager-7f7546f76c-c92tn" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.252353 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/480c4634-3020-4278-89d8-2ae564002765-serving-cert\") pod \"controller-manager-7f7546f76c-c92tn\" (UID: \"480c4634-3020-4278-89d8-2ae564002765\") " pod="openshift-controller-manager/controller-manager-7f7546f76c-c92tn" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.252375 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/480c4634-3020-4278-89d8-2ae564002765-config\") pod \"controller-manager-7f7546f76c-c92tn\" (UID: \"480c4634-3020-4278-89d8-2ae564002765\") " pod="openshift-controller-manager/controller-manager-7f7546f76c-c92tn" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.252400 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrl7k\" (UniqueName: \"kubernetes.io/projected/480c4634-3020-4278-89d8-2ae564002765-kube-api-access-nrl7k\") pod \"controller-manager-7f7546f76c-c92tn\" (UID: \"480c4634-3020-4278-89d8-2ae564002765\") " pod="openshift-controller-manager/controller-manager-7f7546f76c-c92tn" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.252452 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e58452ae-25b8-4d98-a9e1-073317c2b0f3-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.252461 4918 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e58452ae-25b8-4d98-a9e1-073317c2b0f3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.252487 4918 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e58452ae-25b8-4d98-a9e1-073317c2b0f3-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.252498 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnzkv\" (UniqueName: \"kubernetes.io/projected/e58452ae-25b8-4d98-a9e1-073317c2b0f3-kube-api-access-rnzkv\") on node \"crc\" DevicePath \"\"" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.252507 4918 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e58452ae-25b8-4d98-a9e1-073317c2b0f3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" 
Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.253840 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/480c4634-3020-4278-89d8-2ae564002765-client-ca\") pod \"controller-manager-7f7546f76c-c92tn\" (UID: \"480c4634-3020-4278-89d8-2ae564002765\") " pod="openshift-controller-manager/controller-manager-7f7546f76c-c92tn" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.254318 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/480c4634-3020-4278-89d8-2ae564002765-proxy-ca-bundles\") pod \"controller-manager-7f7546f76c-c92tn\" (UID: \"480c4634-3020-4278-89d8-2ae564002765\") " pod="openshift-controller-manager/controller-manager-7f7546f76c-c92tn" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.254490 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/480c4634-3020-4278-89d8-2ae564002765-config\") pod \"controller-manager-7f7546f76c-c92tn\" (UID: \"480c4634-3020-4278-89d8-2ae564002765\") " pod="openshift-controller-manager/controller-manager-7f7546f76c-c92tn" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.257214 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/480c4634-3020-4278-89d8-2ae564002765-serving-cert\") pod \"controller-manager-7f7546f76c-c92tn\" (UID: \"480c4634-3020-4278-89d8-2ae564002765\") " pod="openshift-controller-manager/controller-manager-7f7546f76c-c92tn" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.285149 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrl7k\" (UniqueName: \"kubernetes.io/projected/480c4634-3020-4278-89d8-2ae564002765-kube-api-access-nrl7k\") pod \"controller-manager-7f7546f76c-c92tn\" (UID: \"480c4634-3020-4278-89d8-2ae564002765\") " 
pod="openshift-controller-manager/controller-manager-7f7546f76c-c92tn" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.405853 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f7546f76c-c92tn" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.926127 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77bdcb48b-dwq98" event={"ID":"e58452ae-25b8-4d98-a9e1-073317c2b0f3","Type":"ContainerDied","Data":"7133f861ef9d1ca6ea7024be3ed391df432cedb9d0a8c4d21a7b8e46df452efd"} Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.926231 4918 scope.go:117] "RemoveContainer" containerID="eba824459c25f678452869559a3a0a7dc35335a81c079975a89ccff702265305" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.926776 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77bdcb48b-dwq98" Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.979654 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77bdcb48b-dwq98"] Mar 19 16:42:18 crc kubenswrapper[4918]: I0319 16:42:18.985715 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-77bdcb48b-dwq98"] Mar 19 16:42:19 crc kubenswrapper[4918]: I0319 16:42:19.942874 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-cvm4l_6756a1df-386b-4ee8-954b-bb4ae3829e58/kube-multus-additional-cni-plugins/0.log" Mar 19 16:42:19 crc kubenswrapper[4918]: I0319 16:42:19.943319 4918 generic.go:334] "Generic (PLEG): container finished" podID="6756a1df-386b-4ee8-954b-bb4ae3829e58" containerID="d36a50357ef3c79837144854db44d0048962d2c36307e739aea1213751b9da93" exitCode=137 Mar 19 16:42:19 crc kubenswrapper[4918]: I0319 16:42:19.943386 4918 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-cvm4l" event={"ID":"6756a1df-386b-4ee8-954b-bb4ae3829e58","Type":"ContainerDied","Data":"d36a50357ef3c79837144854db44d0048962d2c36307e739aea1213751b9da93"} Mar 19 16:42:20 crc kubenswrapper[4918]: I0319 16:42:20.002714 4918 ???:1] "http: TLS handshake error from 192.168.126.11:42616: no serving certificate available for the kubelet" Mar 19 16:42:20 crc kubenswrapper[4918]: I0319 16:42:20.596623 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e58452ae-25b8-4d98-a9e1-073317c2b0f3" path="/var/lib/kubelet/pods/e58452ae-25b8-4d98-a9e1-073317c2b0f3/volumes" Mar 19 16:42:20 crc kubenswrapper[4918]: I0319 16:42:20.635769 4918 patch_prober.go:28] interesting pod/route-controller-manager-58bbccc84d-dxks2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.48:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 16:42:20 crc kubenswrapper[4918]: I0319 16:42:20.635838 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2" podUID="10d16695-54c5-4302-9365-7f9f057d2cb5" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.48:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 16:42:22 crc kubenswrapper[4918]: I0319 16:42:22.892443 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gl4jw" Mar 19 16:42:23 crc kubenswrapper[4918]: E0319 16:42:22.999852 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-6hdb4" podUID="ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4" Mar 19 16:42:23 crc kubenswrapper[4918]: E0319 16:42:23.290019 4918 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d36a50357ef3c79837144854db44d0048962d2c36307e739aea1213751b9da93 is running failed: container process not found" containerID="d36a50357ef3c79837144854db44d0048962d2c36307e739aea1213751b9da93" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 16:42:23 crc kubenswrapper[4918]: E0319 16:42:23.290333 4918 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d36a50357ef3c79837144854db44d0048962d2c36307e739aea1213751b9da93 is running failed: container process not found" containerID="d36a50357ef3c79837144854db44d0048962d2c36307e739aea1213751b9da93" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 16:42:23 crc kubenswrapper[4918]: E0319 16:42:23.290825 4918 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d36a50357ef3c79837144854db44d0048962d2c36307e739aea1213751b9da93 is running failed: container process not found" containerID="d36a50357ef3c79837144854db44d0048962d2c36307e739aea1213751b9da93" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 16:42:23 crc kubenswrapper[4918]: E0319 16:42:23.290861 4918 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d36a50357ef3c79837144854db44d0048962d2c36307e739aea1213751b9da93 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-cvm4l" podUID="6756a1df-386b-4ee8-954b-bb4ae3829e58" 
containerName="kube-multus-additional-cni-plugins" Mar 19 16:42:24 crc kubenswrapper[4918]: E0319 16:42:24.447513 4918 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 19 16:42:24 crc kubenswrapper[4918]: E0319 16:42:24.448398 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p5hqs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},Resta
rtPolicy:nil,} start failed in pod redhat-marketplace-t4mc2_openshift-marketplace(6c06d493-b3ec-42b0-9050-48e45aa277fe): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 16:42:24 crc kubenswrapper[4918]: E0319 16:42:24.449656 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-t4mc2" podUID="6c06d493-b3ec-42b0-9050-48e45aa277fe" Mar 19 16:42:24 crc kubenswrapper[4918]: I0319 16:42:24.989908 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f7546f76c-c92tn"] Mar 19 16:42:26 crc kubenswrapper[4918]: E0319 16:42:26.356717 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-t4mc2" podUID="6c06d493-b3ec-42b0-9050-48e45aa277fe" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.443272 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.451491 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-cvm4l_6756a1df-386b-4ee8-954b-bb4ae3829e58/kube-multus-additional-cni-plugins/0.log" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.451582 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-cvm4l" Mar 19 16:42:26 crc kubenswrapper[4918]: E0319 16:42:26.454855 4918 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 19 16:42:26 crc kubenswrapper[4918]: E0319 16:42:26.455038 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rmx82,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Co
ntainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-5nsgc_openshift-marketplace(8824182c-653f-4719-87ac-38d3c9c44f12): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 16:42:26 crc kubenswrapper[4918]: E0319 16:42:26.456358 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-5nsgc" podUID="8824182c-653f-4719-87ac-38d3c9c44f12" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.501983 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h"] Mar 19 16:42:26 crc kubenswrapper[4918]: E0319 16:42:26.502195 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10d16695-54c5-4302-9365-7f9f057d2cb5" containerName="route-controller-manager" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.502208 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="10d16695-54c5-4302-9365-7f9f057d2cb5" containerName="route-controller-manager" Mar 19 16:42:26 crc kubenswrapper[4918]: E0319 16:42:26.502227 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6756a1df-386b-4ee8-954b-bb4ae3829e58" containerName="kube-multus-additional-cni-plugins" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.502232 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="6756a1df-386b-4ee8-954b-bb4ae3829e58" containerName="kube-multus-additional-cni-plugins" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.502336 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="6756a1df-386b-4ee8-954b-bb4ae3829e58" containerName="kube-multus-additional-cni-plugins" Mar 19 16:42:26 crc 
kubenswrapper[4918]: I0319 16:42:26.502348 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="10d16695-54c5-4302-9365-7f9f057d2cb5" containerName="route-controller-manager" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.502725 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.507944 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h"] Mar 19 16:42:26 crc kubenswrapper[4918]: E0319 16:42:26.511836 4918 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 19 16:42:26 crc kubenswrapper[4918]: E0319 16:42:26.512054 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dnjw7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-dh9m5_openshift-marketplace(83708522-86b5-47d3-9f69-3bb7a645bb39): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 16:42:26 crc kubenswrapper[4918]: E0319 16:42:26.513881 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-dh9m5" podUID="83708522-86b5-47d3-9f69-3bb7a645bb39" Mar 19 16:42:26 crc 
kubenswrapper[4918]: E0319 16:42:26.546506 4918 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 19 16:42:26 crc kubenswrapper[4918]: E0319 16:42:26.546723 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t8zbj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-s2zbj_openshift-marketplace(66b9142f-4eaf-41a0-9b13-dae083686eec): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 16:42:26 crc kubenswrapper[4918]: E0319 16:42:26.548085 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-s2zbj" podUID="66b9142f-4eaf-41a0-9b13-dae083686eec" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.585168 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.586737 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.589406 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.589695 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.590093 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10d16695-54c5-4302-9365-7f9f057d2cb5-client-ca\") pod \"10d16695-54c5-4302-9365-7f9f057d2cb5\" (UID: \"10d16695-54c5-4302-9365-7f9f057d2cb5\") " Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.590153 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10d16695-54c5-4302-9365-7f9f057d2cb5-config\") pod 
\"10d16695-54c5-4302-9365-7f9f057d2cb5\" (UID: \"10d16695-54c5-4302-9365-7f9f057d2cb5\") " Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.590189 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xz4k\" (UniqueName: \"kubernetes.io/projected/10d16695-54c5-4302-9365-7f9f057d2cb5-kube-api-access-9xz4k\") pod \"10d16695-54c5-4302-9365-7f9f057d2cb5\" (UID: \"10d16695-54c5-4302-9365-7f9f057d2cb5\") " Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.590222 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/6756a1df-386b-4ee8-954b-bb4ae3829e58-ready\") pod \"6756a1df-386b-4ee8-954b-bb4ae3829e58\" (UID: \"6756a1df-386b-4ee8-954b-bb4ae3829e58\") " Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.590241 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10d16695-54c5-4302-9365-7f9f057d2cb5-serving-cert\") pod \"10d16695-54c5-4302-9365-7f9f057d2cb5\" (UID: \"10d16695-54c5-4302-9365-7f9f057d2cb5\") " Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.590258 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6756a1df-386b-4ee8-954b-bb4ae3829e58-tuning-conf-dir\") pod \"6756a1df-386b-4ee8-954b-bb4ae3829e58\" (UID: \"6756a1df-386b-4ee8-954b-bb4ae3829e58\") " Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.590284 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6756a1df-386b-4ee8-954b-bb4ae3829e58-cni-sysctl-allowlist\") pod \"6756a1df-386b-4ee8-954b-bb4ae3829e58\" (UID: \"6756a1df-386b-4ee8-954b-bb4ae3829e58\") " Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.590321 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-8pddg\" (UniqueName: \"kubernetes.io/projected/6756a1df-386b-4ee8-954b-bb4ae3829e58-kube-api-access-8pddg\") pod \"6756a1df-386b-4ee8-954b-bb4ae3829e58\" (UID: \"6756a1df-386b-4ee8-954b-bb4ae3829e58\") " Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.590499 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9df7a3f3-81e4-4990-8e48-bee61f27c35e-client-ca\") pod \"route-controller-manager-68d4f88c86-2n62h\" (UID: \"9df7a3f3-81e4-4990-8e48-bee61f27c35e\") " pod="openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.590542 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9df7a3f3-81e4-4990-8e48-bee61f27c35e-config\") pod \"route-controller-manager-68d4f88c86-2n62h\" (UID: \"9df7a3f3-81e4-4990-8e48-bee61f27c35e\") " pod="openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.590570 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlxx5\" (UniqueName: \"kubernetes.io/projected/9df7a3f3-81e4-4990-8e48-bee61f27c35e-kube-api-access-tlxx5\") pod \"route-controller-manager-68d4f88c86-2n62h\" (UID: \"9df7a3f3-81e4-4990-8e48-bee61f27c35e\") " pod="openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.590593 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9df7a3f3-81e4-4990-8e48-bee61f27c35e-serving-cert\") pod \"route-controller-manager-68d4f88c86-2n62h\" (UID: \"9df7a3f3-81e4-4990-8e48-bee61f27c35e\") " 
pod="openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.591298 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6756a1df-386b-4ee8-954b-bb4ae3829e58-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "6756a1df-386b-4ee8-954b-bb4ae3829e58" (UID: "6756a1df-386b-4ee8-954b-bb4ae3829e58"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.591887 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10d16695-54c5-4302-9365-7f9f057d2cb5-client-ca" (OuterVolumeSpecName: "client-ca") pod "10d16695-54c5-4302-9365-7f9f057d2cb5" (UID: "10d16695-54c5-4302-9365-7f9f057d2cb5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.593064 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6756a1df-386b-4ee8-954b-bb4ae3829e58-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "6756a1df-386b-4ee8-954b-bb4ae3829e58" (UID: "6756a1df-386b-4ee8-954b-bb4ae3829e58"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.598049 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10d16695-54c5-4302-9365-7f9f057d2cb5-config" (OuterVolumeSpecName: "config") pod "10d16695-54c5-4302-9365-7f9f057d2cb5" (UID: "10d16695-54c5-4302-9365-7f9f057d2cb5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.598410 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6756a1df-386b-4ee8-954b-bb4ae3829e58-ready" (OuterVolumeSpecName: "ready") pod "6756a1df-386b-4ee8-954b-bb4ae3829e58" (UID: "6756a1df-386b-4ee8-954b-bb4ae3829e58"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.601483 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10d16695-54c5-4302-9365-7f9f057d2cb5-kube-api-access-9xz4k" (OuterVolumeSpecName: "kube-api-access-9xz4k") pod "10d16695-54c5-4302-9365-7f9f057d2cb5" (UID: "10d16695-54c5-4302-9365-7f9f057d2cb5"). InnerVolumeSpecName "kube-api-access-9xz4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.602687 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10d16695-54c5-4302-9365-7f9f057d2cb5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "10d16695-54c5-4302-9365-7f9f057d2cb5" (UID: "10d16695-54c5-4302-9365-7f9f057d2cb5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.605448 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6756a1df-386b-4ee8-954b-bb4ae3829e58-kube-api-access-8pddg" (OuterVolumeSpecName: "kube-api-access-8pddg") pod "6756a1df-386b-4ee8-954b-bb4ae3829e58" (UID: "6756a1df-386b-4ee8-954b-bb4ae3829e58"). InnerVolumeSpecName "kube-api-access-8pddg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.615959 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.691728 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3f8251e3-8622-4cc4-985f-dcb9354e714c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3f8251e3-8622-4cc4-985f-dcb9354e714c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.691841 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9df7a3f3-81e4-4990-8e48-bee61f27c35e-client-ca\") pod \"route-controller-manager-68d4f88c86-2n62h\" (UID: \"9df7a3f3-81e4-4990-8e48-bee61f27c35e\") " pod="openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.691904 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9df7a3f3-81e4-4990-8e48-bee61f27c35e-config\") pod \"route-controller-manager-68d4f88c86-2n62h\" (UID: \"9df7a3f3-81e4-4990-8e48-bee61f27c35e\") " pod="openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.691973 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlxx5\" (UniqueName: \"kubernetes.io/projected/9df7a3f3-81e4-4990-8e48-bee61f27c35e-kube-api-access-tlxx5\") pod \"route-controller-manager-68d4f88c86-2n62h\" (UID: \"9df7a3f3-81e4-4990-8e48-bee61f27c35e\") " pod="openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 
16:42:26.692047 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9df7a3f3-81e4-4990-8e48-bee61f27c35e-serving-cert\") pod \"route-controller-manager-68d4f88c86-2n62h\" (UID: \"9df7a3f3-81e4-4990-8e48-bee61f27c35e\") " pod="openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.692098 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f8251e3-8622-4cc4-985f-dcb9354e714c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3f8251e3-8622-4cc4-985f-dcb9354e714c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.692759 4918 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6756a1df-386b-4ee8-954b-bb4ae3829e58-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.692789 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pddg\" (UniqueName: \"kubernetes.io/projected/6756a1df-386b-4ee8-954b-bb4ae3829e58-kube-api-access-8pddg\") on node \"crc\" DevicePath \"\"" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.692874 4918 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10d16695-54c5-4302-9365-7f9f057d2cb5-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.692886 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10d16695-54c5-4302-9365-7f9f057d2cb5-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.692943 4918 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-9xz4k\" (UniqueName: \"kubernetes.io/projected/10d16695-54c5-4302-9365-7f9f057d2cb5-kube-api-access-9xz4k\") on node \"crc\" DevicePath \"\"" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.692957 4918 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/6756a1df-386b-4ee8-954b-bb4ae3829e58-ready\") on node \"crc\" DevicePath \"\"" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.692970 4918 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6756a1df-386b-4ee8-954b-bb4ae3829e58-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.692981 4918 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10d16695-54c5-4302-9365-7f9f057d2cb5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.694269 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9df7a3f3-81e4-4990-8e48-bee61f27c35e-client-ca\") pod \"route-controller-manager-68d4f88c86-2n62h\" (UID: \"9df7a3f3-81e4-4990-8e48-bee61f27c35e\") " pod="openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.695216 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9df7a3f3-81e4-4990-8e48-bee61f27c35e-config\") pod \"route-controller-manager-68d4f88c86-2n62h\" (UID: \"9df7a3f3-81e4-4990-8e48-bee61f27c35e\") " pod="openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.701755 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9df7a3f3-81e4-4990-8e48-bee61f27c35e-serving-cert\") pod \"route-controller-manager-68d4f88c86-2n62h\" (UID: \"9df7a3f3-81e4-4990-8e48-bee61f27c35e\") " pod="openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.711630 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlxx5\" (UniqueName: \"kubernetes.io/projected/9df7a3f3-81e4-4990-8e48-bee61f27c35e-kube-api-access-tlxx5\") pod \"route-controller-manager-68d4f88c86-2n62h\" (UID: \"9df7a3f3-81e4-4990-8e48-bee61f27c35e\") " pod="openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.794555 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3f8251e3-8622-4cc4-985f-dcb9354e714c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3f8251e3-8622-4cc4-985f-dcb9354e714c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.794947 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f8251e3-8622-4cc4-985f-dcb9354e714c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3f8251e3-8622-4cc4-985f-dcb9354e714c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.794708 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3f8251e3-8622-4cc4-985f-dcb9354e714c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3f8251e3-8622-4cc4-985f-dcb9354e714c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.810958 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f8251e3-8622-4cc4-985f-dcb9354e714c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3f8251e3-8622-4cc4-985f-dcb9354e714c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.826301 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.851810 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565642-wjgcp"] Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.901316 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f7546f76c-c92tn"] Mar 19 16:42:26 crc kubenswrapper[4918]: W0319 16:42:26.921780 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod480c4634_3020_4278_89d8_2ae564002765.slice/crio-c305e671bd0ca1854e251bac0c1f47e1272e617fd681881b4a530377cb72e569 WatchSource:0}: Error finding container c305e671bd0ca1854e251bac0c1f47e1272e617fd681881b4a530377cb72e569: Status 404 returned error can't find the container with id c305e671bd0ca1854e251bac0c1f47e1272e617fd681881b4a530377cb72e569 Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.967594 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.997634 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-cvm4l_6756a1df-386b-4ee8-954b-bb4ae3829e58/kube-multus-additional-cni-plugins/0.log" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.997726 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-cvm4l" event={"ID":"6756a1df-386b-4ee8-954b-bb4ae3829e58","Type":"ContainerDied","Data":"21a278c8bef63c30020ecc29e2fc332eded175452bc54840e988f3e75da4b49d"} Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.997787 4918 scope.go:117] "RemoveContainer" containerID="d36a50357ef3c79837144854db44d0048962d2c36307e739aea1213751b9da93" Mar 19 16:42:26 crc kubenswrapper[4918]: I0319 16:42:26.997916 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-cvm4l" Mar 19 16:42:27 crc kubenswrapper[4918]: I0319 16:42:27.003889 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565642-wjgcp" event={"ID":"490c710f-78b8-41a4-b4bc-4eeffdde7a5d","Type":"ContainerStarted","Data":"e884172454888b98c7d653deb0aa3a642ec39be0da96332842347a007bb9f79d"} Mar 19 16:42:27 crc kubenswrapper[4918]: I0319 16:42:27.007609 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2" event={"ID":"10d16695-54c5-4302-9365-7f9f057d2cb5","Type":"ContainerDied","Data":"fa0ef346e09bfd23cbc623fd846ecaaa17e4de01b1cbffea4998fd26a3a98edf"} Mar 19 16:42:27 crc kubenswrapper[4918]: I0319 16:42:27.007690 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2" Mar 19 16:42:27 crc kubenswrapper[4918]: I0319 16:42:27.016735 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7glsh" event={"ID":"3a9fdbd8-ec5d-4aa7-8097-a081455a27fa","Type":"ContainerStarted","Data":"cbcbd44531d348d0ec89b670f88d0757c51e794769ee83252578f03ef3c2ba3e"} Mar 19 16:42:27 crc kubenswrapper[4918]: I0319 16:42:27.020163 4918 generic.go:334] "Generic (PLEG): container finished" podID="42c986e2-96e9-4f2c-9b13-8cf09b8d0480" containerID="519b68f0b189220a45ad9731d08cfa2f15b7a468d6e2eeab197c19d6e10442ec" exitCode=0 Mar 19 16:42:27 crc kubenswrapper[4918]: I0319 16:42:27.020224 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmjc7" event={"ID":"42c986e2-96e9-4f2c-9b13-8cf09b8d0480","Type":"ContainerDied","Data":"519b68f0b189220a45ad9731d08cfa2f15b7a468d6e2eeab197c19d6e10442ec"} Mar 19 16:42:27 crc kubenswrapper[4918]: I0319 16:42:27.024242 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f7546f76c-c92tn" event={"ID":"480c4634-3020-4278-89d8-2ae564002765","Type":"ContainerStarted","Data":"c305e671bd0ca1854e251bac0c1f47e1272e617fd681881b4a530377cb72e569"} Mar 19 16:42:27 crc kubenswrapper[4918]: E0319 16:42:27.027900 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-5nsgc" podUID="8824182c-653f-4719-87ac-38d3c9c44f12" Mar 19 16:42:27 crc kubenswrapper[4918]: E0319 16:42:27.029386 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-s2zbj" podUID="66b9142f-4eaf-41a0-9b13-dae083686eec" Mar 19 16:42:27 crc kubenswrapper[4918]: I0319 16:42:27.030247 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcmw9" event={"ID":"3044e214-7f52-423c-98a6-03a05ed008a1","Type":"ContainerStarted","Data":"c87f4c1206fb07b4f1360b38faf71bb75a719a809aa3384559ea7e1d6b46d707"} Mar 19 16:42:27 crc kubenswrapper[4918]: E0319 16:42:27.030774 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-dh9m5" podUID="83708522-86b5-47d3-9f69-3bb7a645bb39" Mar 19 16:42:27 crc kubenswrapper[4918]: I0319 16:42:27.084484 4918 scope.go:117] "RemoveContainer" containerID="e79be5749a748cc7db23359a4221ac6a9097fcee78f378528c319b4bc797bd7b" Mar 19 16:42:27 crc kubenswrapper[4918]: I0319 16:42:27.191497 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-cvm4l"] Mar 19 16:42:27 crc kubenswrapper[4918]: I0319 16:42:27.196124 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-cvm4l"] Mar 19 16:42:27 crc kubenswrapper[4918]: I0319 16:42:27.201343 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2"] Mar 19 16:42:27 crc kubenswrapper[4918]: I0319 16:42:27.205180 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58bbccc84d-dxks2"] Mar 19 16:42:27 crc kubenswrapper[4918]: I0319 16:42:27.255935 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h"] Mar 19 16:42:27 crc kubenswrapper[4918]: I0319 16:42:27.425457 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 19 16:42:27 crc kubenswrapper[4918]: W0319 16:42:27.435852 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3f8251e3_8622_4cc4_985f_dcb9354e714c.slice/crio-dff73f5c9daf0d32ae541244dd063b3ed85877750fd1d3d3aec425113592b57c WatchSource:0}: Error finding container dff73f5c9daf0d32ae541244dd063b3ed85877750fd1d3d3aec425113592b57c: Status 404 returned error can't find the container with id dff73f5c9daf0d32ae541244dd063b3ed85877750fd1d3d3aec425113592b57c Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.046952 4918 generic.go:334] "Generic (PLEG): container finished" podID="3a9fdbd8-ec5d-4aa7-8097-a081455a27fa" containerID="cbcbd44531d348d0ec89b670f88d0757c51e794769ee83252578f03ef3c2ba3e" exitCode=0 Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.047027 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7glsh" event={"ID":"3a9fdbd8-ec5d-4aa7-8097-a081455a27fa","Type":"ContainerDied","Data":"cbcbd44531d348d0ec89b670f88d0757c51e794769ee83252578f03ef3c2ba3e"} Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.050670 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7f7546f76c-c92tn" podUID="480c4634-3020-4278-89d8-2ae564002765" containerName="controller-manager" containerID="cri-o://16e9737fb9a8697a054d09c1caed6b278ee5a9e4e04062c103c43a70d7b12d33" gracePeriod=30 Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.050606 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f7546f76c-c92tn" 
event={"ID":"480c4634-3020-4278-89d8-2ae564002765","Type":"ContainerStarted","Data":"16e9737fb9a8697a054d09c1caed6b278ee5a9e4e04062c103c43a70d7b12d33"}
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.051102 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7f7546f76c-c92tn"
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.056550 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7f7546f76c-c92tn"
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.061538 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h" event={"ID":"9df7a3f3-81e4-4990-8e48-bee61f27c35e","Type":"ContainerStarted","Data":"c3374f7c42008891e6de5a7503769be637630d7c3ceed6015229d62bc90fb5c6"}
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.061575 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h"
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.061589 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h" event={"ID":"9df7a3f3-81e4-4990-8e48-bee61f27c35e","Type":"ContainerStarted","Data":"fb141cea53c40fb06c54dbbb68676c4e9d200667df79ff98947ab1a16087a09c"}
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.065955 4918 generic.go:334] "Generic (PLEG): container finished" podID="3044e214-7f52-423c-98a6-03a05ed008a1" containerID="c87f4c1206fb07b4f1360b38faf71bb75a719a809aa3384559ea7e1d6b46d707" exitCode=0
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.066055 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcmw9" event={"ID":"3044e214-7f52-423c-98a6-03a05ed008a1","Type":"ContainerDied","Data":"c87f4c1206fb07b4f1360b38faf71bb75a719a809aa3384559ea7e1d6b46d707"}
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.071414 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3f8251e3-8622-4cc4-985f-dcb9354e714c","Type":"ContainerStarted","Data":"44c5299e3e544f89bce83d2eda27aca67ab3c3afba31284cae544c2a219d4c7c"}
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.071452 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3f8251e3-8622-4cc4-985f-dcb9354e714c","Type":"ContainerStarted","Data":"dff73f5c9daf0d32ae541244dd063b3ed85877750fd1d3d3aec425113592b57c"}
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.103988 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h"
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.131309 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7f7546f76c-c92tn" podStartSLOduration=24.131279677 podStartE2EDuration="24.131279677s" podCreationTimestamp="2026-03-19 16:42:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:42:28.102085436 +0000 UTC m=+160.224284694" watchObservedRunningTime="2026-03-19 16:42:28.131279677 +0000 UTC m=+160.253478925"
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.150299 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.150281575 podStartE2EDuration="2.150281575s" podCreationTimestamp="2026-03-19 16:42:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:42:28.149222075 +0000 UTC m=+160.271421313" watchObservedRunningTime="2026-03-19 16:42:28.150281575 +0000 UTC m=+160.272480823"
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.176877 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h" podStartSLOduration=3.176846653 podStartE2EDuration="3.176846653s" podCreationTimestamp="2026-03-19 16:42:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:42:28.173219932 +0000 UTC m=+160.295419180" watchObservedRunningTime="2026-03-19 16:42:28.176846653 +0000 UTC m=+160.299045921"
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.540398 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f7546f76c-c92tn"
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.555704 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/480c4634-3020-4278-89d8-2ae564002765-proxy-ca-bundles\") pod \"480c4634-3020-4278-89d8-2ae564002765\" (UID: \"480c4634-3020-4278-89d8-2ae564002765\") "
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.555764 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/480c4634-3020-4278-89d8-2ae564002765-client-ca\") pod \"480c4634-3020-4278-89d8-2ae564002765\" (UID: \"480c4634-3020-4278-89d8-2ae564002765\") "
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.555859 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrl7k\" (UniqueName: \"kubernetes.io/projected/480c4634-3020-4278-89d8-2ae564002765-kube-api-access-nrl7k\") pod \"480c4634-3020-4278-89d8-2ae564002765\" (UID: \"480c4634-3020-4278-89d8-2ae564002765\") "
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.556020 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/480c4634-3020-4278-89d8-2ae564002765-config\") pod \"480c4634-3020-4278-89d8-2ae564002765\" (UID: \"480c4634-3020-4278-89d8-2ae564002765\") "
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.556103 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/480c4634-3020-4278-89d8-2ae564002765-serving-cert\") pod \"480c4634-3020-4278-89d8-2ae564002765\" (UID: \"480c4634-3020-4278-89d8-2ae564002765\") "
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.557950 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/480c4634-3020-4278-89d8-2ae564002765-client-ca" (OuterVolumeSpecName: "client-ca") pod "480c4634-3020-4278-89d8-2ae564002765" (UID: "480c4634-3020-4278-89d8-2ae564002765"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.558244 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/480c4634-3020-4278-89d8-2ae564002765-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "480c4634-3020-4278-89d8-2ae564002765" (UID: "480c4634-3020-4278-89d8-2ae564002765"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.560483 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/480c4634-3020-4278-89d8-2ae564002765-config" (OuterVolumeSpecName: "config") pod "480c4634-3020-4278-89d8-2ae564002765" (UID: "480c4634-3020-4278-89d8-2ae564002765"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.568073 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/480c4634-3020-4278-89d8-2ae564002765-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "480c4634-3020-4278-89d8-2ae564002765" (UID: "480c4634-3020-4278-89d8-2ae564002765"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.569821 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/480c4634-3020-4278-89d8-2ae564002765-kube-api-access-nrl7k" (OuterVolumeSpecName: "kube-api-access-nrl7k") pod "480c4634-3020-4278-89d8-2ae564002765" (UID: "480c4634-3020-4278-89d8-2ae564002765"). InnerVolumeSpecName "kube-api-access-nrl7k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.602393 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10d16695-54c5-4302-9365-7f9f057d2cb5" path="/var/lib/kubelet/pods/10d16695-54c5-4302-9365-7f9f057d2cb5/volumes"
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.603316 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6756a1df-386b-4ee8-954b-bb4ae3829e58" path="/var/lib/kubelet/pods/6756a1df-386b-4ee8-954b-bb4ae3829e58/volumes"
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.658904 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrl7k\" (UniqueName: \"kubernetes.io/projected/480c4634-3020-4278-89d8-2ae564002765-kube-api-access-nrl7k\") on node \"crc\" DevicePath \"\""
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.658962 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/480c4634-3020-4278-89d8-2ae564002765-config\") on node \"crc\" DevicePath \"\""
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.658976 4918 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/480c4634-3020-4278-89d8-2ae564002765-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.658988 4918 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/480c4634-3020-4278-89d8-2ae564002765-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.659000 4918 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/480c4634-3020-4278-89d8-2ae564002765-client-ca\") on node \"crc\" DevicePath \"\""
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.894094 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-64d975cccd-4sqkq"]
Mar 19 16:42:28 crc kubenswrapper[4918]: E0319 16:42:28.895667 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480c4634-3020-4278-89d8-2ae564002765" containerName="controller-manager"
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.895696 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="480c4634-3020-4278-89d8-2ae564002765" containerName="controller-manager"
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.895805 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="480c4634-3020-4278-89d8-2ae564002765" containerName="controller-manager"
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.896313 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64d975cccd-4sqkq"
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.903754 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64d975cccd-4sqkq"]
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.963469 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56567de2-0b79-4304-95bf-57ba79683a03-proxy-ca-bundles\") pod \"controller-manager-64d975cccd-4sqkq\" (UID: \"56567de2-0b79-4304-95bf-57ba79683a03\") " pod="openshift-controller-manager/controller-manager-64d975cccd-4sqkq"
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.963570 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56567de2-0b79-4304-95bf-57ba79683a03-config\") pod \"controller-manager-64d975cccd-4sqkq\" (UID: \"56567de2-0b79-4304-95bf-57ba79683a03\") " pod="openshift-controller-manager/controller-manager-64d975cccd-4sqkq"
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.963623 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/56567de2-0b79-4304-95bf-57ba79683a03-client-ca\") pod \"controller-manager-64d975cccd-4sqkq\" (UID: \"56567de2-0b79-4304-95bf-57ba79683a03\") " pod="openshift-controller-manager/controller-manager-64d975cccd-4sqkq"
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.963653 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56567de2-0b79-4304-95bf-57ba79683a03-serving-cert\") pod \"controller-manager-64d975cccd-4sqkq\" (UID: \"56567de2-0b79-4304-95bf-57ba79683a03\") " pod="openshift-controller-manager/controller-manager-64d975cccd-4sqkq"
Mar 19 16:42:28 crc kubenswrapper[4918]: I0319 16:42:28.963684 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72vzr\" (UniqueName: \"kubernetes.io/projected/56567de2-0b79-4304-95bf-57ba79683a03-kube-api-access-72vzr\") pod \"controller-manager-64d975cccd-4sqkq\" (UID: \"56567de2-0b79-4304-95bf-57ba79683a03\") " pod="openshift-controller-manager/controller-manager-64d975cccd-4sqkq"
Mar 19 16:42:29 crc kubenswrapper[4918]: I0319 16:42:29.065366 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56567de2-0b79-4304-95bf-57ba79683a03-proxy-ca-bundles\") pod \"controller-manager-64d975cccd-4sqkq\" (UID: \"56567de2-0b79-4304-95bf-57ba79683a03\") " pod="openshift-controller-manager/controller-manager-64d975cccd-4sqkq"
Mar 19 16:42:29 crc kubenswrapper[4918]: I0319 16:42:29.065450 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56567de2-0b79-4304-95bf-57ba79683a03-config\") pod \"controller-manager-64d975cccd-4sqkq\" (UID: \"56567de2-0b79-4304-95bf-57ba79683a03\") " pod="openshift-controller-manager/controller-manager-64d975cccd-4sqkq"
Mar 19 16:42:29 crc kubenswrapper[4918]: I0319 16:42:29.065487 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/56567de2-0b79-4304-95bf-57ba79683a03-client-ca\") pod \"controller-manager-64d975cccd-4sqkq\" (UID: \"56567de2-0b79-4304-95bf-57ba79683a03\") " pod="openshift-controller-manager/controller-manager-64d975cccd-4sqkq"
Mar 19 16:42:29 crc kubenswrapper[4918]: I0319 16:42:29.065668 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56567de2-0b79-4304-95bf-57ba79683a03-serving-cert\") pod \"controller-manager-64d975cccd-4sqkq\" (UID: \"56567de2-0b79-4304-95bf-57ba79683a03\") " pod="openshift-controller-manager/controller-manager-64d975cccd-4sqkq"
Mar 19 16:42:29 crc kubenswrapper[4918]: I0319 16:42:29.067014 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56567de2-0b79-4304-95bf-57ba79683a03-config\") pod \"controller-manager-64d975cccd-4sqkq\" (UID: \"56567de2-0b79-4304-95bf-57ba79683a03\") " pod="openshift-controller-manager/controller-manager-64d975cccd-4sqkq"
Mar 19 16:42:29 crc kubenswrapper[4918]: I0319 16:42:29.067014 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56567de2-0b79-4304-95bf-57ba79683a03-proxy-ca-bundles\") pod \"controller-manager-64d975cccd-4sqkq\" (UID: \"56567de2-0b79-4304-95bf-57ba79683a03\") " pod="openshift-controller-manager/controller-manager-64d975cccd-4sqkq"
Mar 19 16:42:29 crc kubenswrapper[4918]: I0319 16:42:29.067069 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/56567de2-0b79-4304-95bf-57ba79683a03-client-ca\") pod \"controller-manager-64d975cccd-4sqkq\" (UID: \"56567de2-0b79-4304-95bf-57ba79683a03\") " pod="openshift-controller-manager/controller-manager-64d975cccd-4sqkq"
Mar 19 16:42:29 crc kubenswrapper[4918]: I0319 16:42:29.067248 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72vzr\" (UniqueName: \"kubernetes.io/projected/56567de2-0b79-4304-95bf-57ba79683a03-kube-api-access-72vzr\") pod \"controller-manager-64d975cccd-4sqkq\" (UID: \"56567de2-0b79-4304-95bf-57ba79683a03\") " pod="openshift-controller-manager/controller-manager-64d975cccd-4sqkq"
Mar 19 16:42:29 crc kubenswrapper[4918]: I0319 16:42:29.070919 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56567de2-0b79-4304-95bf-57ba79683a03-serving-cert\") pod \"controller-manager-64d975cccd-4sqkq\" (UID: \"56567de2-0b79-4304-95bf-57ba79683a03\") " pod="openshift-controller-manager/controller-manager-64d975cccd-4sqkq"
Mar 19 16:42:29 crc kubenswrapper[4918]: I0319 16:42:29.082685 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72vzr\" (UniqueName: \"kubernetes.io/projected/56567de2-0b79-4304-95bf-57ba79683a03-kube-api-access-72vzr\") pod \"controller-manager-64d975cccd-4sqkq\" (UID: \"56567de2-0b79-4304-95bf-57ba79683a03\") " pod="openshift-controller-manager/controller-manager-64d975cccd-4sqkq"
Mar 19 16:42:29 crc kubenswrapper[4918]: I0319 16:42:29.082841 4918 generic.go:334] "Generic (PLEG): container finished" podID="3f8251e3-8622-4cc4-985f-dcb9354e714c" containerID="44c5299e3e544f89bce83d2eda27aca67ab3c3afba31284cae544c2a219d4c7c" exitCode=0
Mar 19 16:42:29 crc kubenswrapper[4918]: I0319 16:42:29.082991 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3f8251e3-8622-4cc4-985f-dcb9354e714c","Type":"ContainerDied","Data":"44c5299e3e544f89bce83d2eda27aca67ab3c3afba31284cae544c2a219d4c7c"}
Mar 19 16:42:29 crc kubenswrapper[4918]: I0319 16:42:29.086661 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmjc7" event={"ID":"42c986e2-96e9-4f2c-9b13-8cf09b8d0480","Type":"ContainerStarted","Data":"e61b9081fa54a75e9108f5e42877c55a5519f27f621b6b58e8e67414f0e04491"}
Mar 19 16:42:29 crc kubenswrapper[4918]: I0319 16:42:29.088893 4918 generic.go:334] "Generic (PLEG): container finished" podID="480c4634-3020-4278-89d8-2ae564002765" containerID="16e9737fb9a8697a054d09c1caed6b278ee5a9e4e04062c103c43a70d7b12d33" exitCode=0
Mar 19 16:42:29 crc kubenswrapper[4918]: I0319 16:42:29.088950 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f7546f76c-c92tn"
Mar 19 16:42:29 crc kubenswrapper[4918]: I0319 16:42:29.088992 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f7546f76c-c92tn" event={"ID":"480c4634-3020-4278-89d8-2ae564002765","Type":"ContainerDied","Data":"16e9737fb9a8697a054d09c1caed6b278ee5a9e4e04062c103c43a70d7b12d33"}
Mar 19 16:42:29 crc kubenswrapper[4918]: I0319 16:42:29.089051 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f7546f76c-c92tn" event={"ID":"480c4634-3020-4278-89d8-2ae564002765","Type":"ContainerDied","Data":"c305e671bd0ca1854e251bac0c1f47e1272e617fd681881b4a530377cb72e569"}
Mar 19 16:42:29 crc kubenswrapper[4918]: I0319 16:42:29.089071 4918 scope.go:117] "RemoveContainer" containerID="16e9737fb9a8697a054d09c1caed6b278ee5a9e4e04062c103c43a70d7b12d33"
Mar 19 16:42:29 crc kubenswrapper[4918]: I0319 16:42:29.112403 4918 scope.go:117] "RemoveContainer" containerID="16e9737fb9a8697a054d09c1caed6b278ee5a9e4e04062c103c43a70d7b12d33"
Mar 19 16:42:29 crc kubenswrapper[4918]: E0319 16:42:29.113220 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16e9737fb9a8697a054d09c1caed6b278ee5a9e4e04062c103c43a70d7b12d33\": container with ID starting with 16e9737fb9a8697a054d09c1caed6b278ee5a9e4e04062c103c43a70d7b12d33 not found: ID does not exist" containerID="16e9737fb9a8697a054d09c1caed6b278ee5a9e4e04062c103c43a70d7b12d33"
Mar 19 16:42:29 crc kubenswrapper[4918]: I0319 16:42:29.113338 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16e9737fb9a8697a054d09c1caed6b278ee5a9e4e04062c103c43a70d7b12d33"} err="failed to get container status \"16e9737fb9a8697a054d09c1caed6b278ee5a9e4e04062c103c43a70d7b12d33\": rpc error: code = NotFound desc = could not find container \"16e9737fb9a8697a054d09c1caed6b278ee5a9e4e04062c103c43a70d7b12d33\": container with ID starting with 16e9737fb9a8697a054d09c1caed6b278ee5a9e4e04062c103c43a70d7b12d33 not found: ID does not exist"
Mar 19 16:42:29 crc kubenswrapper[4918]: I0319 16:42:29.119052 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gmjc7" podStartSLOduration=2.757840276 podStartE2EDuration="41.118990214s" podCreationTimestamp="2026-03-19 16:41:48 +0000 UTC" firstStartedPulling="2026-03-19 16:41:50.207975951 +0000 UTC m=+122.330175199" lastFinishedPulling="2026-03-19 16:42:28.569125889 +0000 UTC m=+160.691325137" observedRunningTime="2026-03-19 16:42:29.117163642 +0000 UTC m=+161.239362890" watchObservedRunningTime="2026-03-19 16:42:29.118990214 +0000 UTC m=+161.241189462"
Mar 19 16:42:29 crc kubenswrapper[4918]: I0319 16:42:29.128969 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f7546f76c-c92tn"]
Mar 19 16:42:29 crc kubenswrapper[4918]: I0319 16:42:29.131964 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7f7546f76c-c92tn"]
Mar 19 16:42:29 crc kubenswrapper[4918]: I0319 16:42:29.250786 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64d975cccd-4sqkq"
Mar 19 16:42:29 crc kubenswrapper[4918]: I0319 16:42:29.407858 4918 patch_prober.go:28] interesting pod/controller-manager-7f7546f76c-c92tn container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 16:42:29 crc kubenswrapper[4918]: I0319 16:42:29.407984 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7f7546f76c-c92tn" podUID="480c4634-3020-4278-89d8-2ae564002765" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 16:42:29 crc kubenswrapper[4918]: I0319 16:42:29.470616 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64d975cccd-4sqkq"]
Mar 19 16:42:30 crc kubenswrapper[4918]: I0319 16:42:30.099825 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7glsh" event={"ID":"3a9fdbd8-ec5d-4aa7-8097-a081455a27fa","Type":"ContainerStarted","Data":"9af6448e1c79b7f030c4cfd030c56cfd05da2e4eb805013daafe958d087eac2c"}
Mar 19 16:42:30 crc kubenswrapper[4918]: I0319 16:42:30.107468 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64d975cccd-4sqkq" event={"ID":"56567de2-0b79-4304-95bf-57ba79683a03","Type":"ContainerStarted","Data":"10d68c38a10623685f07a39b6ae755a1ca93febe6be01a4d76727db6f1282ff3"}
Mar 19 16:42:30 crc kubenswrapper[4918]: I0319 16:42:30.107508 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64d975cccd-4sqkq" event={"ID":"56567de2-0b79-4304-95bf-57ba79683a03","Type":"ContainerStarted","Data":"d2ede8b75284e13935d18147062aff1b887e500a04184a9ab4be5edb290a0602"}
Mar 19 16:42:30 crc kubenswrapper[4918]: I0319 16:42:30.107560 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-64d975cccd-4sqkq"
Mar 19 16:42:30 crc kubenswrapper[4918]: I0319 16:42:30.122458 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-64d975cccd-4sqkq"
Mar 19 16:42:30 crc kubenswrapper[4918]: I0319 16:42:30.126329 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7glsh" podStartSLOduration=2.746946435 podStartE2EDuration="39.126300124s" podCreationTimestamp="2026-03-19 16:41:51 +0000 UTC" firstStartedPulling="2026-03-19 16:41:53.455320845 +0000 UTC m=+125.577520093" lastFinishedPulling="2026-03-19 16:42:29.834674534 +0000 UTC m=+161.956873782" observedRunningTime="2026-03-19 16:42:30.120270207 +0000 UTC m=+162.242469465" watchObservedRunningTime="2026-03-19 16:42:30.126300124 +0000 UTC m=+162.248499372"
Mar 19 16:42:30 crc kubenswrapper[4918]: I0319 16:42:30.142657 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-64d975cccd-4sqkq" podStartSLOduration=5.142633928 podStartE2EDuration="5.142633928s" podCreationTimestamp="2026-03-19 16:42:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:42:30.140323604 +0000 UTC m=+162.262522862" watchObservedRunningTime="2026-03-19 16:42:30.142633928 +0000 UTC m=+162.264833176"
Mar 19 16:42:30 crc kubenswrapper[4918]: I0319 16:42:30.395839 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 19 16:42:30 crc kubenswrapper[4918]: I0319 16:42:30.487145 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3f8251e3-8622-4cc4-985f-dcb9354e714c-kubelet-dir\") pod \"3f8251e3-8622-4cc4-985f-dcb9354e714c\" (UID: \"3f8251e3-8622-4cc4-985f-dcb9354e714c\") "
Mar 19 16:42:30 crc kubenswrapper[4918]: I0319 16:42:30.487232 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f8251e3-8622-4cc4-985f-dcb9354e714c-kube-api-access\") pod \"3f8251e3-8622-4cc4-985f-dcb9354e714c\" (UID: \"3f8251e3-8622-4cc4-985f-dcb9354e714c\") "
Mar 19 16:42:30 crc kubenswrapper[4918]: I0319 16:42:30.487349 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f8251e3-8622-4cc4-985f-dcb9354e714c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3f8251e3-8622-4cc4-985f-dcb9354e714c" (UID: "3f8251e3-8622-4cc4-985f-dcb9354e714c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 16:42:30 crc kubenswrapper[4918]: I0319 16:42:30.487733 4918 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3f8251e3-8622-4cc4-985f-dcb9354e714c-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 19 16:42:30 crc kubenswrapper[4918]: I0319 16:42:30.509023 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f8251e3-8622-4cc4-985f-dcb9354e714c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3f8251e3-8622-4cc4-985f-dcb9354e714c" (UID: "3f8251e3-8622-4cc4-985f-dcb9354e714c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:42:30 crc kubenswrapper[4918]: I0319 16:42:30.588809 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f8251e3-8622-4cc4-985f-dcb9354e714c-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 19 16:42:30 crc kubenswrapper[4918]: I0319 16:42:30.607652 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="480c4634-3020-4278-89d8-2ae564002765" path="/var/lib/kubelet/pods/480c4634-3020-4278-89d8-2ae564002765/volumes"
Mar 19 16:42:31 crc kubenswrapper[4918]: I0319 16:42:31.116542 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3f8251e3-8622-4cc4-985f-dcb9354e714c","Type":"ContainerDied","Data":"dff73f5c9daf0d32ae541244dd063b3ed85877750fd1d3d3aec425113592b57c"}
Mar 19 16:42:31 crc kubenswrapper[4918]: I0319 16:42:31.116941 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dff73f5c9daf0d32ae541244dd063b3ed85877750fd1d3d3aec425113592b57c"
Mar 19 16:42:31 crc kubenswrapper[4918]: I0319 16:42:31.116740 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 19 16:42:31 crc kubenswrapper[4918]: I0319 16:42:31.435854 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7glsh"
Mar 19 16:42:31 crc kubenswrapper[4918]: I0319 16:42:31.435920 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7glsh"
Mar 19 16:42:32 crc kubenswrapper[4918]: I0319 16:42:32.792641 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7glsh" podUID="3a9fdbd8-ec5d-4aa7-8097-a081455a27fa" containerName="registry-server" probeResult="failure" output=<
Mar 19 16:42:32 crc kubenswrapper[4918]: timeout: failed to connect service ":50051" within 1s
Mar 19 16:42:32 crc kubenswrapper[4918]: >
Mar 19 16:42:33 crc kubenswrapper[4918]: I0319 16:42:33.792287 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 19 16:42:33 crc kubenswrapper[4918]: E0319 16:42:33.792721 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f8251e3-8622-4cc4-985f-dcb9354e714c" containerName="pruner"
Mar 19 16:42:33 crc kubenswrapper[4918]: I0319 16:42:33.792841 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f8251e3-8622-4cc4-985f-dcb9354e714c" containerName="pruner"
Mar 19 16:42:33 crc kubenswrapper[4918]: I0319 16:42:33.794009 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f8251e3-8622-4cc4-985f-dcb9354e714c" containerName="pruner"
Mar 19 16:42:33 crc kubenswrapper[4918]: I0319 16:42:33.795514 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 19 16:42:33 crc kubenswrapper[4918]: I0319 16:42:33.812157 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 19 16:42:33 crc kubenswrapper[4918]: I0319 16:42:33.812457 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 19 16:42:33 crc kubenswrapper[4918]: I0319 16:42:33.821589 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 19 16:42:33 crc kubenswrapper[4918]: I0319 16:42:33.831551 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a36234d-cdf5-47a0-a4dd-405a166c6ff7-kube-api-access\") pod \"installer-9-crc\" (UID: \"1a36234d-cdf5-47a0-a4dd-405a166c6ff7\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 19 16:42:33 crc kubenswrapper[4918]: I0319 16:42:33.831616 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1a36234d-cdf5-47a0-a4dd-405a166c6ff7-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1a36234d-cdf5-47a0-a4dd-405a166c6ff7\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 19 16:42:33 crc kubenswrapper[4918]: I0319 16:42:33.831777 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1a36234d-cdf5-47a0-a4dd-405a166c6ff7-var-lock\") pod \"installer-9-crc\" (UID: \"1a36234d-cdf5-47a0-a4dd-405a166c6ff7\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 19 16:42:33 crc kubenswrapper[4918]: I0319 16:42:33.932729 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1a36234d-cdf5-47a0-a4dd-405a166c6ff7-var-lock\") pod \"installer-9-crc\" (UID: \"1a36234d-cdf5-47a0-a4dd-405a166c6ff7\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 19 16:42:33 crc kubenswrapper[4918]: I0319 16:42:33.932838 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a36234d-cdf5-47a0-a4dd-405a166c6ff7-kube-api-access\") pod \"installer-9-crc\" (UID: \"1a36234d-cdf5-47a0-a4dd-405a166c6ff7\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 19 16:42:33 crc kubenswrapper[4918]: I0319 16:42:33.932872 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1a36234d-cdf5-47a0-a4dd-405a166c6ff7-var-lock\") pod \"installer-9-crc\" (UID: \"1a36234d-cdf5-47a0-a4dd-405a166c6ff7\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 19 16:42:33 crc kubenswrapper[4918]: I0319 16:42:33.932945 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1a36234d-cdf5-47a0-a4dd-405a166c6ff7-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1a36234d-cdf5-47a0-a4dd-405a166c6ff7\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 19 16:42:33 crc kubenswrapper[4918]: I0319 16:42:33.932894 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1a36234d-cdf5-47a0-a4dd-405a166c6ff7-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1a36234d-cdf5-47a0-a4dd-405a166c6ff7\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 19 16:42:33 crc kubenswrapper[4918]: I0319 16:42:33.958582 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a36234d-cdf5-47a0-a4dd-405a166c6ff7-kube-api-access\") pod \"installer-9-crc\" (UID: \"1a36234d-cdf5-47a0-a4dd-405a166c6ff7\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 19 16:42:34 crc kubenswrapper[4918]: I0319 16:42:34.127347 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 19 16:42:36 crc kubenswrapper[4918]: I0319 16:42:36.539607 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 16:42:38 crc kubenswrapper[4918]: I0319 16:42:38.682875 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gmjc7"
Mar 19 16:42:38 crc kubenswrapper[4918]: I0319 16:42:38.683476 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gmjc7"
Mar 19 16:42:38 crc kubenswrapper[4918]: I0319 16:42:38.755724 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gmjc7"
Mar 19 16:42:39 crc kubenswrapper[4918]: I0319 16:42:39.204986 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gmjc7"
Mar 19 16:42:40 crc kubenswrapper[4918]: I0319 16:42:40.030987 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gmjc7"]
Mar 19 16:42:41 crc kubenswrapper[4918]: I0319 16:42:41.168145 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gmjc7" podUID="42c986e2-96e9-4f2c-9b13-8cf09b8d0480" containerName="registry-server" containerID="cri-o://e61b9081fa54a75e9108f5e42877c55a5519f27f621b6b58e8e67414f0e04491" gracePeriod=2
Mar 19 16:42:41 crc kubenswrapper[4918]: I0319 16:42:41.483861 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7glsh"
Mar 19 16:42:41 crc kubenswrapper[4918]: I0319 16:42:41.526106 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7glsh"
Mar 19 16:42:41 crc kubenswrapper[4918]: E0319 16:42:41.534835 4918 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest"
Mar 19 16:42:41 crc kubenswrapper[4918]: E0319 16:42:41.534974 4918 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 19 16:42:41 crc kubenswrapper[4918]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve
Mar 19 16:42:41 crc kubenswrapper[4918]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9q6t2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29565642-wjgcp_openshift-infra(490c710f-78b8-41a4-b4bc-4eeffdde7a5d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled
Mar 19 16:42:41 crc kubenswrapper[4918]: > logger="UnhandledError"
Mar 19 16:42:41 crc kubenswrapper[4918]: E0319 16:42:41.536137 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29565642-wjgcp" podUID="490c710f-78b8-41a4-b4bc-4eeffdde7a5d"
Mar 19 16:42:41 crc kubenswrapper[4918]: I0319 16:42:41.623867 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 19 16:42:41 crc kubenswrapper[4918]: W0319 16:42:41.641018 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1a36234d_cdf5_47a0_a4dd_405a166c6ff7.slice/crio-092c0ef0c5b8f14701b59133032a8f357e03efc91aa1af1979ebcb59b799b326 WatchSource:0}: Error finding container 092c0ef0c5b8f14701b59133032a8f357e03efc91aa1af1979ebcb59b799b326: Status 404 returned error can't find the container with id 092c0ef0c5b8f14701b59133032a8f357e03efc91aa1af1979ebcb59b799b326
Mar 19 16:42:42 crc kubenswrapper[4918]: I0319 16:42:42.185811 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcmw9" event={"ID":"3044e214-7f52-423c-98a6-03a05ed008a1","Type":"ContainerStarted","Data":"e9fe3eb56d309db503acd4817def0be0efeb8d8d1a1463333f4a86bc82b3d90c"}
Mar 19 16:42:42 crc kubenswrapper[4918]: I0319 16:42:42.190567 4918 generic.go:334] "Generic (PLEG): container finished" podID="42c986e2-96e9-4f2c-9b13-8cf09b8d0480" containerID="e61b9081fa54a75e9108f5e42877c55a5519f27f621b6b58e8e67414f0e04491" exitCode=0
Mar 19 16:42:42 crc kubenswrapper[4918]: I0319 16:42:42.190755 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmjc7" event={"ID":"42c986e2-96e9-4f2c-9b13-8cf09b8d0480","Type":"ContainerDied","Data":"e61b9081fa54a75e9108f5e42877c55a5519f27f621b6b58e8e67414f0e04491"}
Mar 19 16:42:42 crc kubenswrapper[4918]: I0319 16:42:42.194612 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1a36234d-cdf5-47a0-a4dd-405a166c6ff7","Type":"ContainerStarted","Data":"092c0ef0c5b8f14701b59133032a8f357e03efc91aa1af1979ebcb59b799b326"} Mar 19 16:42:42 crc kubenswrapper[4918]: I0319 16:42:42.218661 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rcmw9" podStartSLOduration=2.642913061 podStartE2EDuration="51.218637125s" podCreationTimestamp="2026-03-19 16:41:51 +0000 UTC" firstStartedPulling="2026-03-19 16:41:53.446329779 +0000 UTC m=+125.568529027" lastFinishedPulling="2026-03-19 16:42:42.022053843 +0000 UTC m=+174.144253091" observedRunningTime="2026-03-19 16:42:42.2127209 +0000 UTC m=+174.334920148" watchObservedRunningTime="2026-03-19 16:42:42.218637125 +0000 UTC m=+174.340836373" Mar 19 16:42:42 crc kubenswrapper[4918]: E0319 16:42:42.265847 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29565642-wjgcp" podUID="490c710f-78b8-41a4-b4bc-4eeffdde7a5d" Mar 19 16:42:42 crc kubenswrapper[4918]: I0319 16:42:42.382132 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gmjc7" Mar 19 16:42:42 crc kubenswrapper[4918]: I0319 16:42:42.559066 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljjpr\" (UniqueName: \"kubernetes.io/projected/42c986e2-96e9-4f2c-9b13-8cf09b8d0480-kube-api-access-ljjpr\") pod \"42c986e2-96e9-4f2c-9b13-8cf09b8d0480\" (UID: \"42c986e2-96e9-4f2c-9b13-8cf09b8d0480\") " Mar 19 16:42:42 crc kubenswrapper[4918]: I0319 16:42:42.559152 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42c986e2-96e9-4f2c-9b13-8cf09b8d0480-catalog-content\") pod \"42c986e2-96e9-4f2c-9b13-8cf09b8d0480\" (UID: \"42c986e2-96e9-4f2c-9b13-8cf09b8d0480\") " Mar 19 16:42:42 crc kubenswrapper[4918]: I0319 16:42:42.559238 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42c986e2-96e9-4f2c-9b13-8cf09b8d0480-utilities\") pod \"42c986e2-96e9-4f2c-9b13-8cf09b8d0480\" (UID: \"42c986e2-96e9-4f2c-9b13-8cf09b8d0480\") " Mar 19 16:42:42 crc kubenswrapper[4918]: I0319 16:42:42.560225 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42c986e2-96e9-4f2c-9b13-8cf09b8d0480-utilities" (OuterVolumeSpecName: "utilities") pod "42c986e2-96e9-4f2c-9b13-8cf09b8d0480" (UID: "42c986e2-96e9-4f2c-9b13-8cf09b8d0480"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:42:42 crc kubenswrapper[4918]: I0319 16:42:42.567773 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42c986e2-96e9-4f2c-9b13-8cf09b8d0480-kube-api-access-ljjpr" (OuterVolumeSpecName: "kube-api-access-ljjpr") pod "42c986e2-96e9-4f2c-9b13-8cf09b8d0480" (UID: "42c986e2-96e9-4f2c-9b13-8cf09b8d0480"). InnerVolumeSpecName "kube-api-access-ljjpr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:42:42 crc kubenswrapper[4918]: I0319 16:42:42.632808 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42c986e2-96e9-4f2c-9b13-8cf09b8d0480-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42c986e2-96e9-4f2c-9b13-8cf09b8d0480" (UID: "42c986e2-96e9-4f2c-9b13-8cf09b8d0480"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:42:42 crc kubenswrapper[4918]: I0319 16:42:42.661196 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljjpr\" (UniqueName: \"kubernetes.io/projected/42c986e2-96e9-4f2c-9b13-8cf09b8d0480-kube-api-access-ljjpr\") on node \"crc\" DevicePath \"\"" Mar 19 16:42:42 crc kubenswrapper[4918]: I0319 16:42:42.661242 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42c986e2-96e9-4f2c-9b13-8cf09b8d0480-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 16:42:42 crc kubenswrapper[4918]: I0319 16:42:42.661257 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42c986e2-96e9-4f2c-9b13-8cf09b8d0480-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 16:42:43 crc kubenswrapper[4918]: I0319 16:42:43.202409 4918 generic.go:334] "Generic (PLEG): container finished" podID="6c06d493-b3ec-42b0-9050-48e45aa277fe" containerID="dfce7df5aecdd20f405cd6b86bf95b97a94f9ed3b611f2304e04f2212030b13f" exitCode=0 Mar 19 16:42:43 crc kubenswrapper[4918]: I0319 16:42:43.202452 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t4mc2" event={"ID":"6c06d493-b3ec-42b0-9050-48e45aa277fe","Type":"ContainerDied","Data":"dfce7df5aecdd20f405cd6b86bf95b97a94f9ed3b611f2304e04f2212030b13f"} Mar 19 16:42:43 crc kubenswrapper[4918]: I0319 16:42:43.204178 4918 generic.go:334] "Generic (PLEG): container 
finished" podID="8824182c-653f-4719-87ac-38d3c9c44f12" containerID="edf7ebf7859a002a68c93d6a3bfd5bab07826c0586181fde7d9359e9baa37df7" exitCode=0 Mar 19 16:42:43 crc kubenswrapper[4918]: I0319 16:42:43.204219 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5nsgc" event={"ID":"8824182c-653f-4719-87ac-38d3c9c44f12","Type":"ContainerDied","Data":"edf7ebf7859a002a68c93d6a3bfd5bab07826c0586181fde7d9359e9baa37df7"} Mar 19 16:42:43 crc kubenswrapper[4918]: I0319 16:42:43.206400 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1a36234d-cdf5-47a0-a4dd-405a166c6ff7","Type":"ContainerStarted","Data":"1154601c65417a9009d202880635f2af6174b1ac93dba6f8403412c41c0a7800"} Mar 19 16:42:43 crc kubenswrapper[4918]: I0319 16:42:43.209645 4918 generic.go:334] "Generic (PLEG): container finished" podID="83708522-86b5-47d3-9f69-3bb7a645bb39" containerID="10ab177fc8d1880aa08bc520c78b4797f73ad5340b1682f2da3aeeabd153fbab" exitCode=0 Mar 19 16:42:43 crc kubenswrapper[4918]: I0319 16:42:43.209699 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dh9m5" event={"ID":"83708522-86b5-47d3-9f69-3bb7a645bb39","Type":"ContainerDied","Data":"10ab177fc8d1880aa08bc520c78b4797f73ad5340b1682f2da3aeeabd153fbab"} Mar 19 16:42:43 crc kubenswrapper[4918]: I0319 16:42:43.212989 4918 generic.go:334] "Generic (PLEG): container finished" podID="ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4" containerID="a1ae6d021391564a1ef8fbe0f610d4d4f2fcbefeb1a37d5931cedac59097e0af" exitCode=0 Mar 19 16:42:43 crc kubenswrapper[4918]: I0319 16:42:43.213078 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6hdb4" event={"ID":"ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4","Type":"ContainerDied","Data":"a1ae6d021391564a1ef8fbe0f610d4d4f2fcbefeb1a37d5931cedac59097e0af"} Mar 19 16:42:43 crc kubenswrapper[4918]: I0319 16:42:43.218095 
4918 generic.go:334] "Generic (PLEG): container finished" podID="66b9142f-4eaf-41a0-9b13-dae083686eec" containerID="d97c9d4c710a4bc3bcdbcc26cccb937b22c6465057dfe074a222714e46e11dad" exitCode=0 Mar 19 16:42:43 crc kubenswrapper[4918]: I0319 16:42:43.218176 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2zbj" event={"ID":"66b9142f-4eaf-41a0-9b13-dae083686eec","Type":"ContainerDied","Data":"d97c9d4c710a4bc3bcdbcc26cccb937b22c6465057dfe074a222714e46e11dad"} Mar 19 16:42:43 crc kubenswrapper[4918]: I0319 16:42:43.224726 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmjc7" event={"ID":"42c986e2-96e9-4f2c-9b13-8cf09b8d0480","Type":"ContainerDied","Data":"47b5d37bad3be84f8ef9d7c9d7badb34880879d7dd5e2dae1834d15d00ad3fe8"} Mar 19 16:42:43 crc kubenswrapper[4918]: I0319 16:42:43.224776 4918 scope.go:117] "RemoveContainer" containerID="e61b9081fa54a75e9108f5e42877c55a5519f27f621b6b58e8e67414f0e04491" Mar 19 16:42:43 crc kubenswrapper[4918]: I0319 16:42:43.224915 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gmjc7" Mar 19 16:42:43 crc kubenswrapper[4918]: I0319 16:42:43.267701 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=10.267675965 podStartE2EDuration="10.267675965s" podCreationTimestamp="2026-03-19 16:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:42:43.260161775 +0000 UTC m=+175.382361043" watchObservedRunningTime="2026-03-19 16:42:43.267675965 +0000 UTC m=+175.389875233" Mar 19 16:42:43 crc kubenswrapper[4918]: I0319 16:42:43.269973 4918 scope.go:117] "RemoveContainer" containerID="519b68f0b189220a45ad9731d08cfa2f15b7a468d6e2eeab197c19d6e10442ec" Mar 19 16:42:43 crc kubenswrapper[4918]: I0319 16:42:43.304087 4918 scope.go:117] "RemoveContainer" containerID="cb7cbd1eebe92aa1eed60d4a64625db22f9a495da4e5b2f4d7c41bbd6a9549e1" Mar 19 16:42:43 crc kubenswrapper[4918]: I0319 16:42:43.348610 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gmjc7"] Mar 19 16:42:43 crc kubenswrapper[4918]: I0319 16:42:43.352726 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gmjc7"] Mar 19 16:42:44 crc kubenswrapper[4918]: I0319 16:42:44.230445 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t4mc2" event={"ID":"6c06d493-b3ec-42b0-9050-48e45aa277fe","Type":"ContainerStarted","Data":"d21b022f218183649803dc8c29d37b520f123f1ee8b7293ad48d83c7b2a2904a"} Mar 19 16:42:44 crc kubenswrapper[4918]: I0319 16:42:44.233234 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5nsgc" 
event={"ID":"8824182c-653f-4719-87ac-38d3c9c44f12","Type":"ContainerStarted","Data":"0a0030c412194e20579b75921e05df9f5dde86ed036595c8129758e3db03584e"} Mar 19 16:42:44 crc kubenswrapper[4918]: I0319 16:42:44.236828 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dh9m5" event={"ID":"83708522-86b5-47d3-9f69-3bb7a645bb39","Type":"ContainerStarted","Data":"8af266683633c2471665b1381fa5d072f32440c705971843c45492245bf0b72f"} Mar 19 16:42:44 crc kubenswrapper[4918]: I0319 16:42:44.238734 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6hdb4" event={"ID":"ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4","Type":"ContainerStarted","Data":"9c27a12e790321ff49c5ea34c3a756d01ed13ff046850d1f89a596fbdfc236f7"} Mar 19 16:42:44 crc kubenswrapper[4918]: I0319 16:42:44.240717 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2zbj" event={"ID":"66b9142f-4eaf-41a0-9b13-dae083686eec","Type":"ContainerStarted","Data":"ad5ea550adeccefa0e734a856a4077011856223de2e08f25efa1caa9cc492bb3"} Mar 19 16:42:44 crc kubenswrapper[4918]: I0319 16:42:44.251575 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t4mc2" podStartSLOduration=1.776844579 podStartE2EDuration="54.251556794s" podCreationTimestamp="2026-03-19 16:41:50 +0000 UTC" firstStartedPulling="2026-03-19 16:41:51.311499253 +0000 UTC m=+123.433698501" lastFinishedPulling="2026-03-19 16:42:43.786211448 +0000 UTC m=+175.908410716" observedRunningTime="2026-03-19 16:42:44.248053817 +0000 UTC m=+176.370253055" watchObservedRunningTime="2026-03-19 16:42:44.251556794 +0000 UTC m=+176.373756042" Mar 19 16:42:44 crc kubenswrapper[4918]: I0319 16:42:44.279273 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6hdb4" podStartSLOduration=3.6873918530000003 
podStartE2EDuration="57.279254274s" podCreationTimestamp="2026-03-19 16:41:47 +0000 UTC" firstStartedPulling="2026-03-19 16:41:50.248900802 +0000 UTC m=+122.371100050" lastFinishedPulling="2026-03-19 16:42:43.840763223 +0000 UTC m=+175.962962471" observedRunningTime="2026-03-19 16:42:44.279001527 +0000 UTC m=+176.401200775" watchObservedRunningTime="2026-03-19 16:42:44.279254274 +0000 UTC m=+176.401453522" Mar 19 16:42:44 crc kubenswrapper[4918]: I0319 16:42:44.297549 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dh9m5" podStartSLOduration=2.814294266 podStartE2EDuration="56.297532391s" podCreationTimestamp="2026-03-19 16:41:48 +0000 UTC" firstStartedPulling="2026-03-19 16:41:50.231623341 +0000 UTC m=+122.353822599" lastFinishedPulling="2026-03-19 16:42:43.714861436 +0000 UTC m=+175.837060724" observedRunningTime="2026-03-19 16:42:44.296441442 +0000 UTC m=+176.418640690" watchObservedRunningTime="2026-03-19 16:42:44.297532391 +0000 UTC m=+176.419731639" Mar 19 16:42:44 crc kubenswrapper[4918]: I0319 16:42:44.318447 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5nsgc" podStartSLOduration=2.741234923 podStartE2EDuration="55.318425982s" podCreationTimestamp="2026-03-19 16:41:49 +0000 UTC" firstStartedPulling="2026-03-19 16:41:51.324646651 +0000 UTC m=+123.446845899" lastFinishedPulling="2026-03-19 16:42:43.90183771 +0000 UTC m=+176.024036958" observedRunningTime="2026-03-19 16:42:44.31439827 +0000 UTC m=+176.436597518" watchObservedRunningTime="2026-03-19 16:42:44.318425982 +0000 UTC m=+176.440625230" Mar 19 16:42:44 crc kubenswrapper[4918]: I0319 16:42:44.593843 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42c986e2-96e9-4f2c-9b13-8cf09b8d0480" path="/var/lib/kubelet/pods/42c986e2-96e9-4f2c-9b13-8cf09b8d0480/volumes" Mar 19 16:42:45 crc kubenswrapper[4918]: I0319 16:42:45.050393 4918 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s2zbj" podStartSLOduration=3.550811737 podStartE2EDuration="57.050372504s" podCreationTimestamp="2026-03-19 16:41:48 +0000 UTC" firstStartedPulling="2026-03-19 16:41:50.249328014 +0000 UTC m=+122.371527262" lastFinishedPulling="2026-03-19 16:42:43.748888781 +0000 UTC m=+175.871088029" observedRunningTime="2026-03-19 16:42:44.34137982 +0000 UTC m=+176.463579068" watchObservedRunningTime="2026-03-19 16:42:45.050372504 +0000 UTC m=+177.172571752" Mar 19 16:42:45 crc kubenswrapper[4918]: I0319 16:42:45.052935 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64d975cccd-4sqkq"] Mar 19 16:42:45 crc kubenswrapper[4918]: I0319 16:42:45.053222 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-64d975cccd-4sqkq" podUID="56567de2-0b79-4304-95bf-57ba79683a03" containerName="controller-manager" containerID="cri-o://10d68c38a10623685f07a39b6ae755a1ca93febe6be01a4d76727db6f1282ff3" gracePeriod=30 Mar 19 16:42:45 crc kubenswrapper[4918]: I0319 16:42:45.055622 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h"] Mar 19 16:42:45 crc kubenswrapper[4918]: I0319 16:42:45.055856 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h" podUID="9df7a3f3-81e4-4990-8e48-bee61f27c35e" containerName="route-controller-manager" containerID="cri-o://c3374f7c42008891e6de5a7503769be637630d7c3ceed6015229d62bc90fb5c6" gracePeriod=30 Mar 19 16:42:45 crc kubenswrapper[4918]: I0319 16:42:45.250973 4918 generic.go:334] "Generic (PLEG): container finished" podID="9df7a3f3-81e4-4990-8e48-bee61f27c35e" 
containerID="c3374f7c42008891e6de5a7503769be637630d7c3ceed6015229d62bc90fb5c6" exitCode=0 Mar 19 16:42:45 crc kubenswrapper[4918]: I0319 16:42:45.251058 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h" event={"ID":"9df7a3f3-81e4-4990-8e48-bee61f27c35e","Type":"ContainerDied","Data":"c3374f7c42008891e6de5a7503769be637630d7c3ceed6015229d62bc90fb5c6"} Mar 19 16:42:45 crc kubenswrapper[4918]: I0319 16:42:45.253718 4918 generic.go:334] "Generic (PLEG): container finished" podID="56567de2-0b79-4304-95bf-57ba79683a03" containerID="10d68c38a10623685f07a39b6ae755a1ca93febe6be01a4d76727db6f1282ff3" exitCode=0 Mar 19 16:42:45 crc kubenswrapper[4918]: I0319 16:42:45.253769 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64d975cccd-4sqkq" event={"ID":"56567de2-0b79-4304-95bf-57ba79683a03","Type":"ContainerDied","Data":"10d68c38a10623685f07a39b6ae755a1ca93febe6be01a4d76727db6f1282ff3"} Mar 19 16:42:45 crc kubenswrapper[4918]: I0319 16:42:45.710744 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h" Mar 19 16:42:45 crc kubenswrapper[4918]: I0319 16:42:45.805720 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9df7a3f3-81e4-4990-8e48-bee61f27c35e-config\") pod \"9df7a3f3-81e4-4990-8e48-bee61f27c35e\" (UID: \"9df7a3f3-81e4-4990-8e48-bee61f27c35e\") " Mar 19 16:42:45 crc kubenswrapper[4918]: I0319 16:42:45.805784 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9df7a3f3-81e4-4990-8e48-bee61f27c35e-serving-cert\") pod \"9df7a3f3-81e4-4990-8e48-bee61f27c35e\" (UID: \"9df7a3f3-81e4-4990-8e48-bee61f27c35e\") " Mar 19 16:42:45 crc kubenswrapper[4918]: I0319 16:42:45.805812 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9df7a3f3-81e4-4990-8e48-bee61f27c35e-client-ca\") pod \"9df7a3f3-81e4-4990-8e48-bee61f27c35e\" (UID: \"9df7a3f3-81e4-4990-8e48-bee61f27c35e\") " Mar 19 16:42:45 crc kubenswrapper[4918]: I0319 16:42:45.805832 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlxx5\" (UniqueName: \"kubernetes.io/projected/9df7a3f3-81e4-4990-8e48-bee61f27c35e-kube-api-access-tlxx5\") pod \"9df7a3f3-81e4-4990-8e48-bee61f27c35e\" (UID: \"9df7a3f3-81e4-4990-8e48-bee61f27c35e\") " Mar 19 16:42:45 crc kubenswrapper[4918]: I0319 16:42:45.807104 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9df7a3f3-81e4-4990-8e48-bee61f27c35e-config" (OuterVolumeSpecName: "config") pod "9df7a3f3-81e4-4990-8e48-bee61f27c35e" (UID: "9df7a3f3-81e4-4990-8e48-bee61f27c35e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:42:45 crc kubenswrapper[4918]: I0319 16:42:45.807771 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9df7a3f3-81e4-4990-8e48-bee61f27c35e-client-ca" (OuterVolumeSpecName: "client-ca") pod "9df7a3f3-81e4-4990-8e48-bee61f27c35e" (UID: "9df7a3f3-81e4-4990-8e48-bee61f27c35e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:42:45 crc kubenswrapper[4918]: I0319 16:42:45.816326 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9df7a3f3-81e4-4990-8e48-bee61f27c35e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9df7a3f3-81e4-4990-8e48-bee61f27c35e" (UID: "9df7a3f3-81e4-4990-8e48-bee61f27c35e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:42:45 crc kubenswrapper[4918]: I0319 16:42:45.816463 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9df7a3f3-81e4-4990-8e48-bee61f27c35e-kube-api-access-tlxx5" (OuterVolumeSpecName: "kube-api-access-tlxx5") pod "9df7a3f3-81e4-4990-8e48-bee61f27c35e" (UID: "9df7a3f3-81e4-4990-8e48-bee61f27c35e"). InnerVolumeSpecName "kube-api-access-tlxx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:42:45 crc kubenswrapper[4918]: I0319 16:42:45.840250 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64d975cccd-4sqkq" Mar 19 16:42:45 crc kubenswrapper[4918]: I0319 16:42:45.907465 4918 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9df7a3f3-81e4-4990-8e48-bee61f27c35e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:42:45 crc kubenswrapper[4918]: I0319 16:42:45.907509 4918 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9df7a3f3-81e4-4990-8e48-bee61f27c35e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 16:42:45 crc kubenswrapper[4918]: I0319 16:42:45.907561 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlxx5\" (UniqueName: \"kubernetes.io/projected/9df7a3f3-81e4-4990-8e48-bee61f27c35e-kube-api-access-tlxx5\") on node \"crc\" DevicePath \"\"" Mar 19 16:42:45 crc kubenswrapper[4918]: I0319 16:42:45.907572 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9df7a3f3-81e4-4990-8e48-bee61f27c35e-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.008441 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/56567de2-0b79-4304-95bf-57ba79683a03-client-ca\") pod \"56567de2-0b79-4304-95bf-57ba79683a03\" (UID: \"56567de2-0b79-4304-95bf-57ba79683a03\") " Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.008490 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72vzr\" (UniqueName: \"kubernetes.io/projected/56567de2-0b79-4304-95bf-57ba79683a03-kube-api-access-72vzr\") pod \"56567de2-0b79-4304-95bf-57ba79683a03\" (UID: \"56567de2-0b79-4304-95bf-57ba79683a03\") " Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.008531 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56567de2-0b79-4304-95bf-57ba79683a03-serving-cert\") pod \"56567de2-0b79-4304-95bf-57ba79683a03\" (UID: \"56567de2-0b79-4304-95bf-57ba79683a03\") " Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.008586 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56567de2-0b79-4304-95bf-57ba79683a03-proxy-ca-bundles\") pod \"56567de2-0b79-4304-95bf-57ba79683a03\" (UID: \"56567de2-0b79-4304-95bf-57ba79683a03\") " Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.008606 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56567de2-0b79-4304-95bf-57ba79683a03-config\") pod \"56567de2-0b79-4304-95bf-57ba79683a03\" (UID: \"56567de2-0b79-4304-95bf-57ba79683a03\") " Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.009423 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56567de2-0b79-4304-95bf-57ba79683a03-client-ca" (OuterVolumeSpecName: "client-ca") pod "56567de2-0b79-4304-95bf-57ba79683a03" (UID: "56567de2-0b79-4304-95bf-57ba79683a03"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.009449 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56567de2-0b79-4304-95bf-57ba79683a03-config" (OuterVolumeSpecName: "config") pod "56567de2-0b79-4304-95bf-57ba79683a03" (UID: "56567de2-0b79-4304-95bf-57ba79683a03"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.009475 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56567de2-0b79-4304-95bf-57ba79683a03-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "56567de2-0b79-4304-95bf-57ba79683a03" (UID: "56567de2-0b79-4304-95bf-57ba79683a03"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.013027 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56567de2-0b79-4304-95bf-57ba79683a03-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "56567de2-0b79-4304-95bf-57ba79683a03" (UID: "56567de2-0b79-4304-95bf-57ba79683a03"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.013088 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56567de2-0b79-4304-95bf-57ba79683a03-kube-api-access-72vzr" (OuterVolumeSpecName: "kube-api-access-72vzr") pod "56567de2-0b79-4304-95bf-57ba79683a03" (UID: "56567de2-0b79-4304-95bf-57ba79683a03"). InnerVolumeSpecName "kube-api-access-72vzr". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.110621 4918 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56567de2-0b79-4304-95bf-57ba79683a03-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.110709 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56567de2-0b79-4304-95bf-57ba79683a03-config\") on node \"crc\" DevicePath \"\""
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.110723 4918 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/56567de2-0b79-4304-95bf-57ba79683a03-client-ca\") on node \"crc\" DevicePath \"\""
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.110736 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72vzr\" (UniqueName: \"kubernetes.io/projected/56567de2-0b79-4304-95bf-57ba79683a03-kube-api-access-72vzr\") on node \"crc\" DevicePath \"\""
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.110751 4918 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56567de2-0b79-4304-95bf-57ba79683a03-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.260363 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64d975cccd-4sqkq" event={"ID":"56567de2-0b79-4304-95bf-57ba79683a03","Type":"ContainerDied","Data":"d2ede8b75284e13935d18147062aff1b887e500a04184a9ab4be5edb290a0602"}
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.261260 4918 scope.go:117] "RemoveContainer" containerID="10d68c38a10623685f07a39b6ae755a1ca93febe6be01a4d76727db6f1282ff3"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.260413 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64d975cccd-4sqkq"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.261761 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h" event={"ID":"9df7a3f3-81e4-4990-8e48-bee61f27c35e","Type":"ContainerDied","Data":"fb141cea53c40fb06c54dbbb68676c4e9d200667df79ff98947ab1a16087a09c"}
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.261987 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.292892 4918 scope.go:117] "RemoveContainer" containerID="c3374f7c42008891e6de5a7503769be637630d7c3ceed6015229d62bc90fb5c6"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.306488 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h"]
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.312562 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68d4f88c86-2n62h"]
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.324913 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64d975cccd-4sqkq"]
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.330087 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-64d975cccd-4sqkq"]
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.593284 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56567de2-0b79-4304-95bf-57ba79683a03" path="/var/lib/kubelet/pods/56567de2-0b79-4304-95bf-57ba79683a03/volumes"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.594191 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9df7a3f3-81e4-4990-8e48-bee61f27c35e" path="/var/lib/kubelet/pods/9df7a3f3-81e4-4990-8e48-bee61f27c35e/volumes"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.917422 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-845fd54978-sb4xw"]
Mar 19 16:42:46 crc kubenswrapper[4918]: E0319 16:42:46.917678 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56567de2-0b79-4304-95bf-57ba79683a03" containerName="controller-manager"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.917693 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="56567de2-0b79-4304-95bf-57ba79683a03" containerName="controller-manager"
Mar 19 16:42:46 crc kubenswrapper[4918]: E0319 16:42:46.917707 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c986e2-96e9-4f2c-9b13-8cf09b8d0480" containerName="extract-content"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.917715 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c986e2-96e9-4f2c-9b13-8cf09b8d0480" containerName="extract-content"
Mar 19 16:42:46 crc kubenswrapper[4918]: E0319 16:42:46.917731 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c986e2-96e9-4f2c-9b13-8cf09b8d0480" containerName="registry-server"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.917739 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c986e2-96e9-4f2c-9b13-8cf09b8d0480" containerName="registry-server"
Mar 19 16:42:46 crc kubenswrapper[4918]: E0319 16:42:46.917750 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9df7a3f3-81e4-4990-8e48-bee61f27c35e" containerName="route-controller-manager"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.917758 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="9df7a3f3-81e4-4990-8e48-bee61f27c35e" containerName="route-controller-manager"
Mar 19 16:42:46 crc kubenswrapper[4918]: E0319 16:42:46.917775 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c986e2-96e9-4f2c-9b13-8cf09b8d0480" containerName="extract-utilities"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.917784 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c986e2-96e9-4f2c-9b13-8cf09b8d0480" containerName="extract-utilities"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.917908 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="9df7a3f3-81e4-4990-8e48-bee61f27c35e" containerName="route-controller-manager"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.917926 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="42c986e2-96e9-4f2c-9b13-8cf09b8d0480" containerName="registry-server"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.917935 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="56567de2-0b79-4304-95bf-57ba79683a03" containerName="controller-manager"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.918338 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-845fd54978-sb4xw"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.921214 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.921611 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.921815 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.925365 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.925405 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.925623 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.935426 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.935427 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft"]
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.936218 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.938920 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-845fd54978-sb4xw"]
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.940193 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.940752 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.940904 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.941051 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.941220 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.941376 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 19 16:42:46 crc kubenswrapper[4918]: I0319 16:42:46.947425 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft"]
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.020627 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fzjd\" (UniqueName: \"kubernetes.io/projected/d71260c1-584f-4c13-a911-e6110f79affa-kube-api-access-4fzjd\") pod \"controller-manager-845fd54978-sb4xw\" (UID: \"d71260c1-584f-4c13-a911-e6110f79affa\") " pod="openshift-controller-manager/controller-manager-845fd54978-sb4xw"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.020705 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c83e7cab-2b89-44a9-ba34-38a3ca882e6b-client-ca\") pod \"route-controller-manager-6f45b94868-pxnft\" (UID: \"c83e7cab-2b89-44a9-ba34-38a3ca882e6b\") " pod="openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.020786 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c83e7cab-2b89-44a9-ba34-38a3ca882e6b-config\") pod \"route-controller-manager-6f45b94868-pxnft\" (UID: \"c83e7cab-2b89-44a9-ba34-38a3ca882e6b\") " pod="openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.020814 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d71260c1-584f-4c13-a911-e6110f79affa-config\") pod \"controller-manager-845fd54978-sb4xw\" (UID: \"d71260c1-584f-4c13-a911-e6110f79affa\") " pod="openshift-controller-manager/controller-manager-845fd54978-sb4xw"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.020847 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d71260c1-584f-4c13-a911-e6110f79affa-proxy-ca-bundles\") pod \"controller-manager-845fd54978-sb4xw\" (UID: \"d71260c1-584f-4c13-a911-e6110f79affa\") " pod="openshift-controller-manager/controller-manager-845fd54978-sb4xw"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.020882 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d71260c1-584f-4c13-a911-e6110f79affa-serving-cert\") pod \"controller-manager-845fd54978-sb4xw\" (UID: \"d71260c1-584f-4c13-a911-e6110f79affa\") " pod="openshift-controller-manager/controller-manager-845fd54978-sb4xw"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.020931 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfbgc\" (UniqueName: \"kubernetes.io/projected/c83e7cab-2b89-44a9-ba34-38a3ca882e6b-kube-api-access-cfbgc\") pod \"route-controller-manager-6f45b94868-pxnft\" (UID: \"c83e7cab-2b89-44a9-ba34-38a3ca882e6b\") " pod="openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.020974 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c83e7cab-2b89-44a9-ba34-38a3ca882e6b-serving-cert\") pod \"route-controller-manager-6f45b94868-pxnft\" (UID: \"c83e7cab-2b89-44a9-ba34-38a3ca882e6b\") " pod="openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.021006 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d71260c1-584f-4c13-a911-e6110f79affa-client-ca\") pod \"controller-manager-845fd54978-sb4xw\" (UID: \"d71260c1-584f-4c13-a911-e6110f79affa\") " pod="openshift-controller-manager/controller-manager-845fd54978-sb4xw"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.122833 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d71260c1-584f-4c13-a911-e6110f79affa-serving-cert\") pod \"controller-manager-845fd54978-sb4xw\" (UID: \"d71260c1-584f-4c13-a911-e6110f79affa\") " pod="openshift-controller-manager/controller-manager-845fd54978-sb4xw"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.122930 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfbgc\" (UniqueName: \"kubernetes.io/projected/c83e7cab-2b89-44a9-ba34-38a3ca882e6b-kube-api-access-cfbgc\") pod \"route-controller-manager-6f45b94868-pxnft\" (UID: \"c83e7cab-2b89-44a9-ba34-38a3ca882e6b\") " pod="openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.122982 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c83e7cab-2b89-44a9-ba34-38a3ca882e6b-serving-cert\") pod \"route-controller-manager-6f45b94868-pxnft\" (UID: \"c83e7cab-2b89-44a9-ba34-38a3ca882e6b\") " pod="openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.123019 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d71260c1-584f-4c13-a911-e6110f79affa-client-ca\") pod \"controller-manager-845fd54978-sb4xw\" (UID: \"d71260c1-584f-4c13-a911-e6110f79affa\") " pod="openshift-controller-manager/controller-manager-845fd54978-sb4xw"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.123091 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fzjd\" (UniqueName: \"kubernetes.io/projected/d71260c1-584f-4c13-a911-e6110f79affa-kube-api-access-4fzjd\") pod \"controller-manager-845fd54978-sb4xw\" (UID: \"d71260c1-584f-4c13-a911-e6110f79affa\") " pod="openshift-controller-manager/controller-manager-845fd54978-sb4xw"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.123127 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c83e7cab-2b89-44a9-ba34-38a3ca882e6b-client-ca\") pod \"route-controller-manager-6f45b94868-pxnft\" (UID: \"c83e7cab-2b89-44a9-ba34-38a3ca882e6b\") " pod="openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.123200 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c83e7cab-2b89-44a9-ba34-38a3ca882e6b-config\") pod \"route-controller-manager-6f45b94868-pxnft\" (UID: \"c83e7cab-2b89-44a9-ba34-38a3ca882e6b\") " pod="openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.123236 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d71260c1-584f-4c13-a911-e6110f79affa-config\") pod \"controller-manager-845fd54978-sb4xw\" (UID: \"d71260c1-584f-4c13-a911-e6110f79affa\") " pod="openshift-controller-manager/controller-manager-845fd54978-sb4xw"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.123269 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d71260c1-584f-4c13-a911-e6110f79affa-proxy-ca-bundles\") pod \"controller-manager-845fd54978-sb4xw\" (UID: \"d71260c1-584f-4c13-a911-e6110f79affa\") " pod="openshift-controller-manager/controller-manager-845fd54978-sb4xw"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.125259 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d71260c1-584f-4c13-a911-e6110f79affa-proxy-ca-bundles\") pod \"controller-manager-845fd54978-sb4xw\" (UID: \"d71260c1-584f-4c13-a911-e6110f79affa\") " pod="openshift-controller-manager/controller-manager-845fd54978-sb4xw"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.126314 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c83e7cab-2b89-44a9-ba34-38a3ca882e6b-client-ca\") pod \"route-controller-manager-6f45b94868-pxnft\" (UID: \"c83e7cab-2b89-44a9-ba34-38a3ca882e6b\") " pod="openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.127253 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d71260c1-584f-4c13-a911-e6110f79affa-client-ca\") pod \"controller-manager-845fd54978-sb4xw\" (UID: \"d71260c1-584f-4c13-a911-e6110f79affa\") " pod="openshift-controller-manager/controller-manager-845fd54978-sb4xw"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.128623 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c83e7cab-2b89-44a9-ba34-38a3ca882e6b-config\") pod \"route-controller-manager-6f45b94868-pxnft\" (UID: \"c83e7cab-2b89-44a9-ba34-38a3ca882e6b\") " pod="openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.130837 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d71260c1-584f-4c13-a911-e6110f79affa-config\") pod \"controller-manager-845fd54978-sb4xw\" (UID: \"d71260c1-584f-4c13-a911-e6110f79affa\") " pod="openshift-controller-manager/controller-manager-845fd54978-sb4xw"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.146411 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c83e7cab-2b89-44a9-ba34-38a3ca882e6b-serving-cert\") pod \"route-controller-manager-6f45b94868-pxnft\" (UID: \"c83e7cab-2b89-44a9-ba34-38a3ca882e6b\") " pod="openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.146622 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d71260c1-584f-4c13-a911-e6110f79affa-serving-cert\") pod \"controller-manager-845fd54978-sb4xw\" (UID: \"d71260c1-584f-4c13-a911-e6110f79affa\") " pod="openshift-controller-manager/controller-manager-845fd54978-sb4xw"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.152169 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfbgc\" (UniqueName: \"kubernetes.io/projected/c83e7cab-2b89-44a9-ba34-38a3ca882e6b-kube-api-access-cfbgc\") pod \"route-controller-manager-6f45b94868-pxnft\" (UID: \"c83e7cab-2b89-44a9-ba34-38a3ca882e6b\") " pod="openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.156505 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fzjd\" (UniqueName: \"kubernetes.io/projected/d71260c1-584f-4c13-a911-e6110f79affa-kube-api-access-4fzjd\") pod \"controller-manager-845fd54978-sb4xw\" (UID: \"d71260c1-584f-4c13-a911-e6110f79affa\") " pod="openshift-controller-manager/controller-manager-845fd54978-sb4xw"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.240080 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-845fd54978-sb4xw"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.264334 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft"
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.508721 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-845fd54978-sb4xw"]
Mar 19 16:42:47 crc kubenswrapper[4918]: I0319 16:42:47.593880 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft"]
Mar 19 16:42:47 crc kubenswrapper[4918]: W0319 16:42:47.603998 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc83e7cab_2b89_44a9_ba34_38a3ca882e6b.slice/crio-30aa376b7c07c3107c93cb00f3cc647779939e912d4d2a4153649d66cbb60c5f WatchSource:0}: Error finding container 30aa376b7c07c3107c93cb00f3cc647779939e912d4d2a4153649d66cbb60c5f: Status 404 returned error can't find the container with id 30aa376b7c07c3107c93cb00f3cc647779939e912d4d2a4153649d66cbb60c5f
Mar 19 16:42:48 crc kubenswrapper[4918]: I0319 16:42:48.304613 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft" event={"ID":"c83e7cab-2b89-44a9-ba34-38a3ca882e6b","Type":"ContainerStarted","Data":"c0c3025a340d20c2d8e16293260b629442ec2a76d47dc521ddb481904508bd34"}
Mar 19 16:42:48 crc kubenswrapper[4918]: I0319 16:42:48.305040 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft"
Mar 19 16:42:48 crc kubenswrapper[4918]: I0319 16:42:48.305054 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft" event={"ID":"c83e7cab-2b89-44a9-ba34-38a3ca882e6b","Type":"ContainerStarted","Data":"30aa376b7c07c3107c93cb00f3cc647779939e912d4d2a4153649d66cbb60c5f"}
Mar 19 16:42:48 crc kubenswrapper[4918]: I0319 16:42:48.306938 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-845fd54978-sb4xw" event={"ID":"d71260c1-584f-4c13-a911-e6110f79affa","Type":"ContainerStarted","Data":"9a03f905fbb8f505b5cb7055e4a6f7ffadfe82b2044f1c6fce20ae486abc9eac"}
Mar 19 16:42:48 crc kubenswrapper[4918]: I0319 16:42:48.306966 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-845fd54978-sb4xw" event={"ID":"d71260c1-584f-4c13-a911-e6110f79affa","Type":"ContainerStarted","Data":"41b56f72ef5ace52897ed71db7b37147789907942fc159bc494270a5a24a952f"}
Mar 19 16:42:48 crc kubenswrapper[4918]: I0319 16:42:48.307165 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-845fd54978-sb4xw"
Mar 19 16:42:48 crc kubenswrapper[4918]: I0319 16:42:48.319117 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-845fd54978-sb4xw"
Mar 19 16:42:48 crc kubenswrapper[4918]: I0319 16:42:48.342015 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft" podStartSLOduration=3.3419939579999998 podStartE2EDuration="3.341993958s" podCreationTimestamp="2026-03-19 16:42:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:42:48.340825276 +0000 UTC m=+180.463024524" watchObservedRunningTime="2026-03-19 16:42:48.341993958 +0000 UTC m=+180.464193206"
Mar 19 16:42:48 crc kubenswrapper[4918]: I0319 16:42:48.345749 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6hdb4"
Mar 19 16:42:48 crc kubenswrapper[4918]: I0319 16:42:48.346714 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6hdb4"
Mar 19 16:42:48 crc kubenswrapper[4918]: I0319 16:42:48.364144 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-845fd54978-sb4xw" podStartSLOduration=3.364114263 podStartE2EDuration="3.364114263s" podCreationTimestamp="2026-03-19 16:42:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:42:48.362731055 +0000 UTC m=+180.484930313" watchObservedRunningTime="2026-03-19 16:42:48.364114263 +0000 UTC m=+180.486313511"
Mar 19 16:42:48 crc kubenswrapper[4918]: I0319 16:42:48.405837 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6hdb4"
Mar 19 16:42:48 crc kubenswrapper[4918]: I0319 16:42:48.488420 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s2zbj"
Mar 19 16:42:48 crc kubenswrapper[4918]: I0319 16:42:48.488484 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s2zbj"
Mar 19 16:42:48 crc kubenswrapper[4918]: I0319 16:42:48.499390 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft"
Mar 19 16:42:48 crc kubenswrapper[4918]: I0319 16:42:48.533790 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s2zbj"
Mar 19 16:42:48 crc kubenswrapper[4918]: I0319 16:42:48.971922 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dh9m5"
Mar 19 16:42:48 crc kubenswrapper[4918]: I0319 16:42:48.972382 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dh9m5"
Mar 19 16:42:49 crc kubenswrapper[4918]: I0319 16:42:49.056704 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dh9m5"
Mar 19 16:42:49 crc kubenswrapper[4918]: I0319 16:42:49.362961 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dh9m5"
Mar 19 16:42:49 crc kubenswrapper[4918]: I0319 16:42:49.363504 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s2zbj"
Mar 19 16:42:49 crc kubenswrapper[4918]: I0319 16:42:49.372318 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6hdb4"
Mar 19 16:42:50 crc kubenswrapper[4918]: I0319 16:42:50.338298 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5nsgc"
Mar 19 16:42:50 crc kubenswrapper[4918]: I0319 16:42:50.338420 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5nsgc"
Mar 19 16:42:50 crc kubenswrapper[4918]: I0319 16:42:50.403208 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5nsgc"
Mar 19 16:42:50 crc kubenswrapper[4918]: I0319 16:42:50.438785 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dh9m5"]
Mar 19 16:42:50 crc kubenswrapper[4918]: I0319 16:42:50.654972 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t4mc2"
Mar 19 16:42:50 crc kubenswrapper[4918]: I0319 16:42:50.655036 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t4mc2"
Mar 19 16:42:50 crc kubenswrapper[4918]: I0319 16:42:50.714199 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t4mc2"
Mar 19 16:42:51 crc kubenswrapper[4918]: I0319 16:42:51.333684 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dh9m5" podUID="83708522-86b5-47d3-9f69-3bb7a645bb39" containerName="registry-server" containerID="cri-o://8af266683633c2471665b1381fa5d072f32440c705971843c45492245bf0b72f" gracePeriod=2
Mar 19 16:42:51 crc kubenswrapper[4918]: I0319 16:42:51.398698 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5nsgc"
Mar 19 16:42:51 crc kubenswrapper[4918]: I0319 16:42:51.413042 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t4mc2"
Mar 19 16:42:51 crc kubenswrapper[4918]: I0319 16:42:51.827144 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dh9m5"
Mar 19 16:42:51 crc kubenswrapper[4918]: I0319 16:42:51.880332 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rcmw9"
Mar 19 16:42:51 crc kubenswrapper[4918]: I0319 16:42:51.880421 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rcmw9"
Mar 19 16:42:51 crc kubenswrapper[4918]: I0319 16:42:51.891374 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83708522-86b5-47d3-9f69-3bb7a645bb39-catalog-content\") pod \"83708522-86b5-47d3-9f69-3bb7a645bb39\" (UID: \"83708522-86b5-47d3-9f69-3bb7a645bb39\") "
Mar 19 16:42:51 crc kubenswrapper[4918]: I0319 16:42:51.891552 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83708522-86b5-47d3-9f69-3bb7a645bb39-utilities\") pod \"83708522-86b5-47d3-9f69-3bb7a645bb39\" (UID: \"83708522-86b5-47d3-9f69-3bb7a645bb39\") "
Mar 19 16:42:51 crc kubenswrapper[4918]: I0319 16:42:51.891595 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnjw7\" (UniqueName: \"kubernetes.io/projected/83708522-86b5-47d3-9f69-3bb7a645bb39-kube-api-access-dnjw7\") pod \"83708522-86b5-47d3-9f69-3bb7a645bb39\" (UID: \"83708522-86b5-47d3-9f69-3bb7a645bb39\") "
Mar 19 16:42:51 crc kubenswrapper[4918]: I0319 16:42:51.893074 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83708522-86b5-47d3-9f69-3bb7a645bb39-utilities" (OuterVolumeSpecName: "utilities") pod "83708522-86b5-47d3-9f69-3bb7a645bb39" (UID: "83708522-86b5-47d3-9f69-3bb7a645bb39"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 16:42:51 crc kubenswrapper[4918]: I0319 16:42:51.900791 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83708522-86b5-47d3-9f69-3bb7a645bb39-kube-api-access-dnjw7" (OuterVolumeSpecName: "kube-api-access-dnjw7") pod "83708522-86b5-47d3-9f69-3bb7a645bb39" (UID: "83708522-86b5-47d3-9f69-3bb7a645bb39"). InnerVolumeSpecName "kube-api-access-dnjw7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:42:51 crc kubenswrapper[4918]: I0319 16:42:51.944702 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rcmw9"
Mar 19 16:42:51 crc kubenswrapper[4918]: I0319 16:42:51.992428 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83708522-86b5-47d3-9f69-3bb7a645bb39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83708522-86b5-47d3-9f69-3bb7a645bb39" (UID: "83708522-86b5-47d3-9f69-3bb7a645bb39"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 16:42:51 crc kubenswrapper[4918]: I0319 16:42:51.994351 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83708522-86b5-47d3-9f69-3bb7a645bb39-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 16:42:51 crc kubenswrapper[4918]: I0319 16:42:51.994392 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnjw7\" (UniqueName: \"kubernetes.io/projected/83708522-86b5-47d3-9f69-3bb7a645bb39-kube-api-access-dnjw7\") on node \"crc\" DevicePath \"\""
Mar 19 16:42:51 crc kubenswrapper[4918]: I0319 16:42:51.994406 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83708522-86b5-47d3-9f69-3bb7a645bb39-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 16:42:52 crc kubenswrapper[4918]: I0319 16:42:52.343027 4918 generic.go:334] "Generic (PLEG): container finished" podID="83708522-86b5-47d3-9f69-3bb7a645bb39" containerID="8af266683633c2471665b1381fa5d072f32440c705971843c45492245bf0b72f" exitCode=0
Mar 19 16:42:52 crc kubenswrapper[4918]: I0319 16:42:52.343251 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dh9m5" event={"ID":"83708522-86b5-47d3-9f69-3bb7a645bb39","Type":"ContainerDied","Data":"8af266683633c2471665b1381fa5d072f32440c705971843c45492245bf0b72f"}
Mar 19 16:42:52 crc kubenswrapper[4918]: I0319 16:42:52.344048 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dh9m5" event={"ID":"83708522-86b5-47d3-9f69-3bb7a645bb39","Type":"ContainerDied","Data":"aec72e2612d36f04345b85c23b98810b0e27a47906429b9cf82861b6f1a3fb79"}
Mar 19 16:42:52 crc kubenswrapper[4918]: I0319 16:42:52.343405 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dh9m5"
Mar 19 16:42:52 crc kubenswrapper[4918]: I0319 16:42:52.344095 4918 scope.go:117] "RemoveContainer" containerID="8af266683633c2471665b1381fa5d072f32440c705971843c45492245bf0b72f"
Mar 19 16:42:52 crc kubenswrapper[4918]: I0319 16:42:52.377975 4918 scope.go:117] "RemoveContainer" containerID="10ab177fc8d1880aa08bc520c78b4797f73ad5340b1682f2da3aeeabd153fbab"
Mar 19 16:42:52 crc kubenswrapper[4918]: I0319 16:42:52.408163 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dh9m5"]
Mar 19 16:42:52 crc kubenswrapper[4918]: I0319 16:42:52.413838 4918 scope.go:117] "RemoveContainer" containerID="d932a36415ff851986d239d5a987296f51839e1196171437055ae52e33d17f66"
Mar 19 16:42:52 crc kubenswrapper[4918]: I0319 16:42:52.414796 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rcmw9"
Mar 19 16:42:52 crc kubenswrapper[4918]: I0319 16:42:52.415516 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dh9m5"]
Mar 19 16:42:52 crc kubenswrapper[4918]: I0319 16:42:52.437444 4918 scope.go:117] "RemoveContainer" containerID="8af266683633c2471665b1381fa5d072f32440c705971843c45492245bf0b72f"
Mar 19 16:42:52 crc kubenswrapper[4918]: E0319 16:42:52.438458 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8af266683633c2471665b1381fa5d072f32440c705971843c45492245bf0b72f\": container with ID starting with 8af266683633c2471665b1381fa5d072f32440c705971843c45492245bf0b72f not found: ID does not exist" containerID="8af266683633c2471665b1381fa5d072f32440c705971843c45492245bf0b72f"
Mar 19 16:42:52 crc kubenswrapper[4918]: I0319 16:42:52.438698 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8af266683633c2471665b1381fa5d072f32440c705971843c45492245bf0b72f"} err="failed to get container status \"8af266683633c2471665b1381fa5d072f32440c705971843c45492245bf0b72f\": rpc error: code = NotFound desc = could not find container \"8af266683633c2471665b1381fa5d072f32440c705971843c45492245bf0b72f\": container with ID starting with 8af266683633c2471665b1381fa5d072f32440c705971843c45492245bf0b72f not found: ID does not exist"
Mar 19 16:42:52 crc kubenswrapper[4918]: I0319 16:42:52.438894 4918 scope.go:117] "RemoveContainer" containerID="10ab177fc8d1880aa08bc520c78b4797f73ad5340b1682f2da3aeeabd153fbab"
Mar 19 16:42:52 crc kubenswrapper[4918]: E0319 16:42:52.439539 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10ab177fc8d1880aa08bc520c78b4797f73ad5340b1682f2da3aeeabd153fbab\": container with ID starting with 10ab177fc8d1880aa08bc520c78b4797f73ad5340b1682f2da3aeeabd153fbab not found: ID does not exist" containerID="10ab177fc8d1880aa08bc520c78b4797f73ad5340b1682f2da3aeeabd153fbab"
Mar 19 16:42:52 crc kubenswrapper[4918]: I0319 16:42:52.439894 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10ab177fc8d1880aa08bc520c78b4797f73ad5340b1682f2da3aeeabd153fbab"} err="failed to get container status \"10ab177fc8d1880aa08bc520c78b4797f73ad5340b1682f2da3aeeabd153fbab\": rpc error: code = NotFound desc = could not find container \"10ab177fc8d1880aa08bc520c78b4797f73ad5340b1682f2da3aeeabd153fbab\": container with ID starting with 10ab177fc8d1880aa08bc520c78b4797f73ad5340b1682f2da3aeeabd153fbab not found: ID does not exist"
Mar 19 16:42:52 crc kubenswrapper[4918]: I0319 16:42:52.442274 4918 scope.go:117] "RemoveContainer" containerID="d932a36415ff851986d239d5a987296f51839e1196171437055ae52e33d17f66"
Mar 19 16:42:52 crc kubenswrapper[4918]: E0319 16:42:52.443291 4918 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d932a36415ff851986d239d5a987296f51839e1196171437055ae52e33d17f66\": container with ID starting with d932a36415ff851986d239d5a987296f51839e1196171437055ae52e33d17f66 not found: ID does not exist" containerID="d932a36415ff851986d239d5a987296f51839e1196171437055ae52e33d17f66" Mar 19 16:42:52 crc kubenswrapper[4918]: I0319 16:42:52.443365 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d932a36415ff851986d239d5a987296f51839e1196171437055ae52e33d17f66"} err="failed to get container status \"d932a36415ff851986d239d5a987296f51839e1196171437055ae52e33d17f66\": rpc error: code = NotFound desc = could not find container \"d932a36415ff851986d239d5a987296f51839e1196171437055ae52e33d17f66\": container with ID starting with d932a36415ff851986d239d5a987296f51839e1196171437055ae52e33d17f66 not found: ID does not exist" Mar 19 16:42:52 crc kubenswrapper[4918]: I0319 16:42:52.597015 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83708522-86b5-47d3-9f69-3bb7a645bb39" path="/var/lib/kubelet/pods/83708522-86b5-47d3-9f69-3bb7a645bb39/volumes" Mar 19 16:42:52 crc kubenswrapper[4918]: I0319 16:42:52.835048 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t4mc2"] Mar 19 16:42:53 crc kubenswrapper[4918]: I0319 16:42:53.354417 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t4mc2" podUID="6c06d493-b3ec-42b0-9050-48e45aa277fe" containerName="registry-server" containerID="cri-o://d21b022f218183649803dc8c29d37b520f123f1ee8b7293ad48d83c7b2a2904a" gracePeriod=2 Mar 19 16:42:54 crc kubenswrapper[4918]: I0319 16:42:54.027958 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t4mc2" Mar 19 16:42:54 crc kubenswrapper[4918]: I0319 16:42:54.130247 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5hqs\" (UniqueName: \"kubernetes.io/projected/6c06d493-b3ec-42b0-9050-48e45aa277fe-kube-api-access-p5hqs\") pod \"6c06d493-b3ec-42b0-9050-48e45aa277fe\" (UID: \"6c06d493-b3ec-42b0-9050-48e45aa277fe\") " Mar 19 16:42:54 crc kubenswrapper[4918]: I0319 16:42:54.130314 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c06d493-b3ec-42b0-9050-48e45aa277fe-utilities\") pod \"6c06d493-b3ec-42b0-9050-48e45aa277fe\" (UID: \"6c06d493-b3ec-42b0-9050-48e45aa277fe\") " Mar 19 16:42:54 crc kubenswrapper[4918]: I0319 16:42:54.130382 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c06d493-b3ec-42b0-9050-48e45aa277fe-catalog-content\") pod \"6c06d493-b3ec-42b0-9050-48e45aa277fe\" (UID: \"6c06d493-b3ec-42b0-9050-48e45aa277fe\") " Mar 19 16:42:54 crc kubenswrapper[4918]: I0319 16:42:54.132625 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c06d493-b3ec-42b0-9050-48e45aa277fe-utilities" (OuterVolumeSpecName: "utilities") pod "6c06d493-b3ec-42b0-9050-48e45aa277fe" (UID: "6c06d493-b3ec-42b0-9050-48e45aa277fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:42:54 crc kubenswrapper[4918]: I0319 16:42:54.141542 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c06d493-b3ec-42b0-9050-48e45aa277fe-kube-api-access-p5hqs" (OuterVolumeSpecName: "kube-api-access-p5hqs") pod "6c06d493-b3ec-42b0-9050-48e45aa277fe" (UID: "6c06d493-b3ec-42b0-9050-48e45aa277fe"). InnerVolumeSpecName "kube-api-access-p5hqs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:42:54 crc kubenswrapper[4918]: I0319 16:42:54.156797 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c06d493-b3ec-42b0-9050-48e45aa277fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6c06d493-b3ec-42b0-9050-48e45aa277fe" (UID: "6c06d493-b3ec-42b0-9050-48e45aa277fe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:42:54 crc kubenswrapper[4918]: I0319 16:42:54.231423 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c06d493-b3ec-42b0-9050-48e45aa277fe-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 16:42:54 crc kubenswrapper[4918]: I0319 16:42:54.231464 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5hqs\" (UniqueName: \"kubernetes.io/projected/6c06d493-b3ec-42b0-9050-48e45aa277fe-kube-api-access-p5hqs\") on node \"crc\" DevicePath \"\"" Mar 19 16:42:54 crc kubenswrapper[4918]: I0319 16:42:54.231478 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c06d493-b3ec-42b0-9050-48e45aa277fe-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 16:42:54 crc kubenswrapper[4918]: I0319 16:42:54.367961 4918 generic.go:334] "Generic (PLEG): container finished" podID="6c06d493-b3ec-42b0-9050-48e45aa277fe" containerID="d21b022f218183649803dc8c29d37b520f123f1ee8b7293ad48d83c7b2a2904a" exitCode=0 Mar 19 16:42:54 crc kubenswrapper[4918]: I0319 16:42:54.368066 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t4mc2" Mar 19 16:42:54 crc kubenswrapper[4918]: I0319 16:42:54.368058 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t4mc2" event={"ID":"6c06d493-b3ec-42b0-9050-48e45aa277fe","Type":"ContainerDied","Data":"d21b022f218183649803dc8c29d37b520f123f1ee8b7293ad48d83c7b2a2904a"} Mar 19 16:42:54 crc kubenswrapper[4918]: I0319 16:42:54.368696 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t4mc2" event={"ID":"6c06d493-b3ec-42b0-9050-48e45aa277fe","Type":"ContainerDied","Data":"f5b2e5b34d3cc59bb0db0d4522700864c4d7c239252b148123c41d7255384bc0"} Mar 19 16:42:54 crc kubenswrapper[4918]: I0319 16:42:54.368736 4918 scope.go:117] "RemoveContainer" containerID="d21b022f218183649803dc8c29d37b520f123f1ee8b7293ad48d83c7b2a2904a" Mar 19 16:42:54 crc kubenswrapper[4918]: I0319 16:42:54.398732 4918 scope.go:117] "RemoveContainer" containerID="dfce7df5aecdd20f405cd6b86bf95b97a94f9ed3b611f2304e04f2212030b13f" Mar 19 16:42:54 crc kubenswrapper[4918]: I0319 16:42:54.425630 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t4mc2"] Mar 19 16:42:54 crc kubenswrapper[4918]: I0319 16:42:54.433780 4918 scope.go:117] "RemoveContainer" containerID="6ed1d94dc41467d73a557dea5235d5c282818d91996f78f7f0929aaa40d80e4b" Mar 19 16:42:54 crc kubenswrapper[4918]: I0319 16:42:54.437109 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t4mc2"] Mar 19 16:42:54 crc kubenswrapper[4918]: I0319 16:42:54.454809 4918 scope.go:117] "RemoveContainer" containerID="d21b022f218183649803dc8c29d37b520f123f1ee8b7293ad48d83c7b2a2904a" Mar 19 16:42:54 crc kubenswrapper[4918]: E0319 16:42:54.455343 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d21b022f218183649803dc8c29d37b520f123f1ee8b7293ad48d83c7b2a2904a\": container with ID starting with d21b022f218183649803dc8c29d37b520f123f1ee8b7293ad48d83c7b2a2904a not found: ID does not exist" containerID="d21b022f218183649803dc8c29d37b520f123f1ee8b7293ad48d83c7b2a2904a" Mar 19 16:42:54 crc kubenswrapper[4918]: I0319 16:42:54.455382 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d21b022f218183649803dc8c29d37b520f123f1ee8b7293ad48d83c7b2a2904a"} err="failed to get container status \"d21b022f218183649803dc8c29d37b520f123f1ee8b7293ad48d83c7b2a2904a\": rpc error: code = NotFound desc = could not find container \"d21b022f218183649803dc8c29d37b520f123f1ee8b7293ad48d83c7b2a2904a\": container with ID starting with d21b022f218183649803dc8c29d37b520f123f1ee8b7293ad48d83c7b2a2904a not found: ID does not exist" Mar 19 16:42:54 crc kubenswrapper[4918]: I0319 16:42:54.455415 4918 scope.go:117] "RemoveContainer" containerID="dfce7df5aecdd20f405cd6b86bf95b97a94f9ed3b611f2304e04f2212030b13f" Mar 19 16:42:54 crc kubenswrapper[4918]: E0319 16:42:54.456142 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfce7df5aecdd20f405cd6b86bf95b97a94f9ed3b611f2304e04f2212030b13f\": container with ID starting with dfce7df5aecdd20f405cd6b86bf95b97a94f9ed3b611f2304e04f2212030b13f not found: ID does not exist" containerID="dfce7df5aecdd20f405cd6b86bf95b97a94f9ed3b611f2304e04f2212030b13f" Mar 19 16:42:54 crc kubenswrapper[4918]: I0319 16:42:54.456195 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfce7df5aecdd20f405cd6b86bf95b97a94f9ed3b611f2304e04f2212030b13f"} err="failed to get container status \"dfce7df5aecdd20f405cd6b86bf95b97a94f9ed3b611f2304e04f2212030b13f\": rpc error: code = NotFound desc = could not find container \"dfce7df5aecdd20f405cd6b86bf95b97a94f9ed3b611f2304e04f2212030b13f\": container with ID 
starting with dfce7df5aecdd20f405cd6b86bf95b97a94f9ed3b611f2304e04f2212030b13f not found: ID does not exist" Mar 19 16:42:54 crc kubenswrapper[4918]: I0319 16:42:54.456229 4918 scope.go:117] "RemoveContainer" containerID="6ed1d94dc41467d73a557dea5235d5c282818d91996f78f7f0929aaa40d80e4b" Mar 19 16:42:54 crc kubenswrapper[4918]: E0319 16:42:54.456731 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ed1d94dc41467d73a557dea5235d5c282818d91996f78f7f0929aaa40d80e4b\": container with ID starting with 6ed1d94dc41467d73a557dea5235d5c282818d91996f78f7f0929aaa40d80e4b not found: ID does not exist" containerID="6ed1d94dc41467d73a557dea5235d5c282818d91996f78f7f0929aaa40d80e4b" Mar 19 16:42:54 crc kubenswrapper[4918]: I0319 16:42:54.456784 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ed1d94dc41467d73a557dea5235d5c282818d91996f78f7f0929aaa40d80e4b"} err="failed to get container status \"6ed1d94dc41467d73a557dea5235d5c282818d91996f78f7f0929aaa40d80e4b\": rpc error: code = NotFound desc = could not find container \"6ed1d94dc41467d73a557dea5235d5c282818d91996f78f7f0929aaa40d80e4b\": container with ID starting with 6ed1d94dc41467d73a557dea5235d5c282818d91996f78f7f0929aaa40d80e4b not found: ID does not exist" Mar 19 16:42:54 crc kubenswrapper[4918]: I0319 16:42:54.602868 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c06d493-b3ec-42b0-9050-48e45aa277fe" path="/var/lib/kubelet/pods/6c06d493-b3ec-42b0-9050-48e45aa277fe/volumes" Mar 19 16:42:55 crc kubenswrapper[4918]: I0319 16:42:55.838588 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rcmw9"] Mar 19 16:42:55 crc kubenswrapper[4918]: I0319 16:42:55.839409 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rcmw9" 
podUID="3044e214-7f52-423c-98a6-03a05ed008a1" containerName="registry-server" containerID="cri-o://e9fe3eb56d309db503acd4817def0be0efeb8d8d1a1463333f4a86bc82b3d90c" gracePeriod=2 Mar 19 16:42:56 crc kubenswrapper[4918]: I0319 16:42:56.383578 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcmw9" event={"ID":"3044e214-7f52-423c-98a6-03a05ed008a1","Type":"ContainerDied","Data":"e9fe3eb56d309db503acd4817def0be0efeb8d8d1a1463333f4a86bc82b3d90c"} Mar 19 16:42:56 crc kubenswrapper[4918]: I0319 16:42:56.383579 4918 generic.go:334] "Generic (PLEG): container finished" podID="3044e214-7f52-423c-98a6-03a05ed008a1" containerID="e9fe3eb56d309db503acd4817def0be0efeb8d8d1a1463333f4a86bc82b3d90c" exitCode=0 Mar 19 16:42:56 crc kubenswrapper[4918]: I0319 16:42:56.383661 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcmw9" event={"ID":"3044e214-7f52-423c-98a6-03a05ed008a1","Type":"ContainerDied","Data":"69f82f247a41372cd6a93e76e85b293b66b67bf3188b2b35325c53082118f29a"} Mar 19 16:42:56 crc kubenswrapper[4918]: I0319 16:42:56.383683 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69f82f247a41372cd6a93e76e85b293b66b67bf3188b2b35325c53082118f29a" Mar 19 16:42:56 crc kubenswrapper[4918]: I0319 16:42:56.404494 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rcmw9" Mar 19 16:42:56 crc kubenswrapper[4918]: I0319 16:42:56.580546 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3044e214-7f52-423c-98a6-03a05ed008a1-utilities\") pod \"3044e214-7f52-423c-98a6-03a05ed008a1\" (UID: \"3044e214-7f52-423c-98a6-03a05ed008a1\") " Mar 19 16:42:56 crc kubenswrapper[4918]: I0319 16:42:56.580850 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3044e214-7f52-423c-98a6-03a05ed008a1-catalog-content\") pod \"3044e214-7f52-423c-98a6-03a05ed008a1\" (UID: \"3044e214-7f52-423c-98a6-03a05ed008a1\") " Mar 19 16:42:56 crc kubenswrapper[4918]: I0319 16:42:56.580886 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxdh8\" (UniqueName: \"kubernetes.io/projected/3044e214-7f52-423c-98a6-03a05ed008a1-kube-api-access-dxdh8\") pod \"3044e214-7f52-423c-98a6-03a05ed008a1\" (UID: \"3044e214-7f52-423c-98a6-03a05ed008a1\") " Mar 19 16:42:56 crc kubenswrapper[4918]: I0319 16:42:56.581758 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3044e214-7f52-423c-98a6-03a05ed008a1-utilities" (OuterVolumeSpecName: "utilities") pod "3044e214-7f52-423c-98a6-03a05ed008a1" (UID: "3044e214-7f52-423c-98a6-03a05ed008a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:42:56 crc kubenswrapper[4918]: I0319 16:42:56.595436 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3044e214-7f52-423c-98a6-03a05ed008a1-kube-api-access-dxdh8" (OuterVolumeSpecName: "kube-api-access-dxdh8") pod "3044e214-7f52-423c-98a6-03a05ed008a1" (UID: "3044e214-7f52-423c-98a6-03a05ed008a1"). InnerVolumeSpecName "kube-api-access-dxdh8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:42:56 crc kubenswrapper[4918]: I0319 16:42:56.684546 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3044e214-7f52-423c-98a6-03a05ed008a1-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 16:42:56 crc kubenswrapper[4918]: I0319 16:42:56.684592 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxdh8\" (UniqueName: \"kubernetes.io/projected/3044e214-7f52-423c-98a6-03a05ed008a1-kube-api-access-dxdh8\") on node \"crc\" DevicePath \"\"" Mar 19 16:42:56 crc kubenswrapper[4918]: I0319 16:42:56.720083 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3044e214-7f52-423c-98a6-03a05ed008a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3044e214-7f52-423c-98a6-03a05ed008a1" (UID: "3044e214-7f52-423c-98a6-03a05ed008a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:42:56 crc kubenswrapper[4918]: I0319 16:42:56.786158 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3044e214-7f52-423c-98a6-03a05ed008a1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 16:42:57 crc kubenswrapper[4918]: I0319 16:42:57.216729 4918 csr.go:261] certificate signing request csr-knwq9 is approved, waiting to be issued Mar 19 16:42:57 crc kubenswrapper[4918]: I0319 16:42:57.224498 4918 csr.go:257] certificate signing request csr-knwq9 is issued Mar 19 16:42:57 crc kubenswrapper[4918]: I0319 16:42:57.394937 4918 generic.go:334] "Generic (PLEG): container finished" podID="490c710f-78b8-41a4-b4bc-4eeffdde7a5d" containerID="b9476c8c22fb672b08a1def9b9d4fd0d8e6e3a1361958274a31c95abdd29b83a" exitCode=0 Mar 19 16:42:57 crc kubenswrapper[4918]: I0319 16:42:57.395054 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rcmw9" Mar 19 16:42:57 crc kubenswrapper[4918]: I0319 16:42:57.400779 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565642-wjgcp" event={"ID":"490c710f-78b8-41a4-b4bc-4eeffdde7a5d","Type":"ContainerDied","Data":"b9476c8c22fb672b08a1def9b9d4fd0d8e6e3a1361958274a31c95abdd29b83a"} Mar 19 16:42:57 crc kubenswrapper[4918]: I0319 16:42:57.442924 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rcmw9"] Mar 19 16:42:57 crc kubenswrapper[4918]: I0319 16:42:57.448096 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rcmw9"] Mar 19 16:42:58 crc kubenswrapper[4918]: I0319 16:42:58.225568 4918 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-01 08:44:33.726119619 +0000 UTC Mar 19 16:42:58 crc kubenswrapper[4918]: I0319 16:42:58.225908 4918 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6160h1m35.500216427s for next certificate rotation Mar 19 16:42:58 crc kubenswrapper[4918]: I0319 16:42:58.601751 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3044e214-7f52-423c-98a6-03a05ed008a1" path="/var/lib/kubelet/pods/3044e214-7f52-423c-98a6-03a05ed008a1/volumes" Mar 19 16:42:58 crc kubenswrapper[4918]: I0319 16:42:58.753575 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565642-wjgcp" Mar 19 16:42:58 crc kubenswrapper[4918]: I0319 16:42:58.907757 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h9xcq"] Mar 19 16:42:58 crc kubenswrapper[4918]: I0319 16:42:58.914737 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q6t2\" (UniqueName: \"kubernetes.io/projected/490c710f-78b8-41a4-b4bc-4eeffdde7a5d-kube-api-access-9q6t2\") pod \"490c710f-78b8-41a4-b4bc-4eeffdde7a5d\" (UID: \"490c710f-78b8-41a4-b4bc-4eeffdde7a5d\") " Mar 19 16:42:58 crc kubenswrapper[4918]: I0319 16:42:58.923648 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/490c710f-78b8-41a4-b4bc-4eeffdde7a5d-kube-api-access-9q6t2" (OuterVolumeSpecName: "kube-api-access-9q6t2") pod "490c710f-78b8-41a4-b4bc-4eeffdde7a5d" (UID: "490c710f-78b8-41a4-b4bc-4eeffdde7a5d"). InnerVolumeSpecName "kube-api-access-9q6t2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:42:59 crc kubenswrapper[4918]: I0319 16:42:59.015985 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q6t2\" (UniqueName: \"kubernetes.io/projected/490c710f-78b8-41a4-b4bc-4eeffdde7a5d-kube-api-access-9q6t2\") on node \"crc\" DevicePath \"\"" Mar 19 16:42:59 crc kubenswrapper[4918]: I0319 16:42:59.226609 4918 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-02 00:50:42.315345373 +0000 UTC Mar 19 16:42:59 crc kubenswrapper[4918]: I0319 16:42:59.226650 4918 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6176h7m43.088698012s for next certificate rotation Mar 19 16:42:59 crc kubenswrapper[4918]: I0319 16:42:59.406023 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565642-wjgcp" event={"ID":"490c710f-78b8-41a4-b4bc-4eeffdde7a5d","Type":"ContainerDied","Data":"e884172454888b98c7d653deb0aa3a642ec39be0da96332842347a007bb9f79d"} Mar 19 16:42:59 crc kubenswrapper[4918]: I0319 16:42:59.406069 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e884172454888b98c7d653deb0aa3a642ec39be0da96332842347a007bb9f79d" Mar 19 16:42:59 crc kubenswrapper[4918]: I0319 16:42:59.406081 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565642-wjgcp" Mar 19 16:43:05 crc kubenswrapper[4918]: I0319 16:43:05.034553 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-845fd54978-sb4xw"] Mar 19 16:43:05 crc kubenswrapper[4918]: I0319 16:43:05.035389 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-845fd54978-sb4xw" podUID="d71260c1-584f-4c13-a911-e6110f79affa" containerName="controller-manager" containerID="cri-o://9a03f905fbb8f505b5cb7055e4a6f7ffadfe82b2044f1c6fce20ae486abc9eac" gracePeriod=30 Mar 19 16:43:05 crc kubenswrapper[4918]: I0319 16:43:05.109548 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft"] Mar 19 16:43:05 crc kubenswrapper[4918]: I0319 16:43:05.109756 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft" podUID="c83e7cab-2b89-44a9-ba34-38a3ca882e6b" containerName="route-controller-manager" containerID="cri-o://c0c3025a340d20c2d8e16293260b629442ec2a76d47dc521ddb481904508bd34" gracePeriod=30 Mar 19 16:43:05 crc kubenswrapper[4918]: I0319 16:43:05.441215 4918 generic.go:334] "Generic (PLEG): container finished" podID="c83e7cab-2b89-44a9-ba34-38a3ca882e6b" containerID="c0c3025a340d20c2d8e16293260b629442ec2a76d47dc521ddb481904508bd34" exitCode=0 Mar 19 16:43:05 crc kubenswrapper[4918]: I0319 16:43:05.441315 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft" event={"ID":"c83e7cab-2b89-44a9-ba34-38a3ca882e6b","Type":"ContainerDied","Data":"c0c3025a340d20c2d8e16293260b629442ec2a76d47dc521ddb481904508bd34"} Mar 19 16:43:05 crc kubenswrapper[4918]: I0319 16:43:05.443659 4918 generic.go:334] "Generic (PLEG): container 
finished" podID="d71260c1-584f-4c13-a911-e6110f79affa" containerID="9a03f905fbb8f505b5cb7055e4a6f7ffadfe82b2044f1c6fce20ae486abc9eac" exitCode=0 Mar 19 16:43:05 crc kubenswrapper[4918]: I0319 16:43:05.443717 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-845fd54978-sb4xw" event={"ID":"d71260c1-584f-4c13-a911-e6110f79affa","Type":"ContainerDied","Data":"9a03f905fbb8f505b5cb7055e4a6f7ffadfe82b2044f1c6fce20ae486abc9eac"} Mar 19 16:43:05 crc kubenswrapper[4918]: I0319 16:43:05.548498 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft" Mar 19 16:43:05 crc kubenswrapper[4918]: I0319 16:43:05.700555 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c83e7cab-2b89-44a9-ba34-38a3ca882e6b-serving-cert\") pod \"c83e7cab-2b89-44a9-ba34-38a3ca882e6b\" (UID: \"c83e7cab-2b89-44a9-ba34-38a3ca882e6b\") " Mar 19 16:43:05 crc kubenswrapper[4918]: I0319 16:43:05.700628 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbgc\" (UniqueName: \"kubernetes.io/projected/c83e7cab-2b89-44a9-ba34-38a3ca882e6b-kube-api-access-cfbgc\") pod \"c83e7cab-2b89-44a9-ba34-38a3ca882e6b\" (UID: \"c83e7cab-2b89-44a9-ba34-38a3ca882e6b\") " Mar 19 16:43:05 crc kubenswrapper[4918]: I0319 16:43:05.700663 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c83e7cab-2b89-44a9-ba34-38a3ca882e6b-client-ca\") pod \"c83e7cab-2b89-44a9-ba34-38a3ca882e6b\" (UID: \"c83e7cab-2b89-44a9-ba34-38a3ca882e6b\") " Mar 19 16:43:05 crc kubenswrapper[4918]: I0319 16:43:05.700690 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c83e7cab-2b89-44a9-ba34-38a3ca882e6b-config\") pod \"c83e7cab-2b89-44a9-ba34-38a3ca882e6b\" (UID: \"c83e7cab-2b89-44a9-ba34-38a3ca882e6b\") " Mar 19 16:43:05 crc kubenswrapper[4918]: I0319 16:43:05.701435 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c83e7cab-2b89-44a9-ba34-38a3ca882e6b-config" (OuterVolumeSpecName: "config") pod "c83e7cab-2b89-44a9-ba34-38a3ca882e6b" (UID: "c83e7cab-2b89-44a9-ba34-38a3ca882e6b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:43:05 crc kubenswrapper[4918]: I0319 16:43:05.701714 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c83e7cab-2b89-44a9-ba34-38a3ca882e6b-client-ca" (OuterVolumeSpecName: "client-ca") pod "c83e7cab-2b89-44a9-ba34-38a3ca882e6b" (UID: "c83e7cab-2b89-44a9-ba34-38a3ca882e6b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:43:05 crc kubenswrapper[4918]: I0319 16:43:05.709094 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c83e7cab-2b89-44a9-ba34-38a3ca882e6b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c83e7cab-2b89-44a9-ba34-38a3ca882e6b" (UID: "c83e7cab-2b89-44a9-ba34-38a3ca882e6b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:43:05 crc kubenswrapper[4918]: I0319 16:43:05.710174 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c83e7cab-2b89-44a9-ba34-38a3ca882e6b-kube-api-access-cfbgc" (OuterVolumeSpecName: "kube-api-access-cfbgc") pod "c83e7cab-2b89-44a9-ba34-38a3ca882e6b" (UID: "c83e7cab-2b89-44a9-ba34-38a3ca882e6b"). InnerVolumeSpecName "kube-api-access-cfbgc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:43:05 crc kubenswrapper[4918]: I0319 16:43:05.801762 4918 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c83e7cab-2b89-44a9-ba34-38a3ca882e6b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:05 crc kubenswrapper[4918]: I0319 16:43:05.802155 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c83e7cab-2b89-44a9-ba34-38a3ca882e6b-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:05 crc kubenswrapper[4918]: I0319 16:43:05.802174 4918 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c83e7cab-2b89-44a9-ba34-38a3ca882e6b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:05 crc kubenswrapper[4918]: I0319 16:43:05.802197 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbgc\" (UniqueName: \"kubernetes.io/projected/c83e7cab-2b89-44a9-ba34-38a3ca882e6b-kube-api-access-cfbgc\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.104467 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-845fd54978-sb4xw" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.206023 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d71260c1-584f-4c13-a911-e6110f79affa-proxy-ca-bundles\") pod \"d71260c1-584f-4c13-a911-e6110f79affa\" (UID: \"d71260c1-584f-4c13-a911-e6110f79affa\") " Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.206079 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fzjd\" (UniqueName: \"kubernetes.io/projected/d71260c1-584f-4c13-a911-e6110f79affa-kube-api-access-4fzjd\") pod \"d71260c1-584f-4c13-a911-e6110f79affa\" (UID: \"d71260c1-584f-4c13-a911-e6110f79affa\") " Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.206108 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d71260c1-584f-4c13-a911-e6110f79affa-config\") pod \"d71260c1-584f-4c13-a911-e6110f79affa\" (UID: \"d71260c1-584f-4c13-a911-e6110f79affa\") " Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.206147 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d71260c1-584f-4c13-a911-e6110f79affa-client-ca\") pod \"d71260c1-584f-4c13-a911-e6110f79affa\" (UID: \"d71260c1-584f-4c13-a911-e6110f79affa\") " Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.206167 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d71260c1-584f-4c13-a911-e6110f79affa-serving-cert\") pod \"d71260c1-584f-4c13-a911-e6110f79affa\" (UID: \"d71260c1-584f-4c13-a911-e6110f79affa\") " Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.206949 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/d71260c1-584f-4c13-a911-e6110f79affa-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d71260c1-584f-4c13-a911-e6110f79affa" (UID: "d71260c1-584f-4c13-a911-e6110f79affa"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.206975 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d71260c1-584f-4c13-a911-e6110f79affa-client-ca" (OuterVolumeSpecName: "client-ca") pod "d71260c1-584f-4c13-a911-e6110f79affa" (UID: "d71260c1-584f-4c13-a911-e6110f79affa"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.207757 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d71260c1-584f-4c13-a911-e6110f79affa-config" (OuterVolumeSpecName: "config") pod "d71260c1-584f-4c13-a911-e6110f79affa" (UID: "d71260c1-584f-4c13-a911-e6110f79affa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.213132 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d71260c1-584f-4c13-a911-e6110f79affa-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d71260c1-584f-4c13-a911-e6110f79affa" (UID: "d71260c1-584f-4c13-a911-e6110f79affa"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.213154 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d71260c1-584f-4c13-a911-e6110f79affa-kube-api-access-4fzjd" (OuterVolumeSpecName: "kube-api-access-4fzjd") pod "d71260c1-584f-4c13-a911-e6110f79affa" (UID: "d71260c1-584f-4c13-a911-e6110f79affa"). InnerVolumeSpecName "kube-api-access-4fzjd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.307708 4918 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d71260c1-584f-4c13-a911-e6110f79affa-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.307754 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fzjd\" (UniqueName: \"kubernetes.io/projected/d71260c1-584f-4c13-a911-e6110f79affa-kube-api-access-4fzjd\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.307773 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d71260c1-584f-4c13-a911-e6110f79affa-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.307783 4918 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d71260c1-584f-4c13-a911-e6110f79affa-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.307792 4918 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d71260c1-584f-4c13-a911-e6110f79affa-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.453174 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft" event={"ID":"c83e7cab-2b89-44a9-ba34-38a3ca882e6b","Type":"ContainerDied","Data":"30aa376b7c07c3107c93cb00f3cc647779939e912d4d2a4153649d66cbb60c5f"} Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.453299 4918 scope.go:117] "RemoveContainer" containerID="c0c3025a340d20c2d8e16293260b629442ec2a76d47dc521ddb481904508bd34" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.453242 4918 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.455174 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-845fd54978-sb4xw" event={"ID":"d71260c1-584f-4c13-a911-e6110f79affa","Type":"ContainerDied","Data":"41b56f72ef5ace52897ed71db7b37147789907942fc159bc494270a5a24a952f"} Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.455312 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-845fd54978-sb4xw" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.488619 4918 scope.go:117] "RemoveContainer" containerID="9a03f905fbb8f505b5cb7055e4a6f7ffadfe82b2044f1c6fce20ae486abc9eac" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.513547 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft"] Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.520604 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f45b94868-pxnft"] Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.540560 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-845fd54978-sb4xw"] Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.544885 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-845fd54978-sb4xw"] Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.595608 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c83e7cab-2b89-44a9-ba34-38a3ca882e6b" path="/var/lib/kubelet/pods/c83e7cab-2b89-44a9-ba34-38a3ca882e6b/volumes" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.596711 4918 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="d71260c1-584f-4c13-a911-e6110f79affa" path="/var/lib/kubelet/pods/d71260c1-584f-4c13-a911-e6110f79affa/volumes" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.928242 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-94dccb7bb-xqp6f"] Mar 19 16:43:06 crc kubenswrapper[4918]: E0319 16:43:06.928707 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83708522-86b5-47d3-9f69-3bb7a645bb39" containerName="extract-utilities" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.928744 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="83708522-86b5-47d3-9f69-3bb7a645bb39" containerName="extract-utilities" Mar 19 16:43:06 crc kubenswrapper[4918]: E0319 16:43:06.928765 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="490c710f-78b8-41a4-b4bc-4eeffdde7a5d" containerName="oc" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.928780 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="490c710f-78b8-41a4-b4bc-4eeffdde7a5d" containerName="oc" Mar 19 16:43:06 crc kubenswrapper[4918]: E0319 16:43:06.928796 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d71260c1-584f-4c13-a911-e6110f79affa" containerName="controller-manager" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.928807 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="d71260c1-584f-4c13-a911-e6110f79affa" containerName="controller-manager" Mar 19 16:43:06 crc kubenswrapper[4918]: E0319 16:43:06.928829 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3044e214-7f52-423c-98a6-03a05ed008a1" containerName="registry-server" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.928837 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="3044e214-7f52-423c-98a6-03a05ed008a1" containerName="registry-server" Mar 19 16:43:06 crc kubenswrapper[4918]: E0319 16:43:06.928846 4918 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83708522-86b5-47d3-9f69-3bb7a645bb39" containerName="extract-content" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.928854 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="83708522-86b5-47d3-9f69-3bb7a645bb39" containerName="extract-content" Mar 19 16:43:06 crc kubenswrapper[4918]: E0319 16:43:06.928866 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c06d493-b3ec-42b0-9050-48e45aa277fe" containerName="extract-utilities" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.928874 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c06d493-b3ec-42b0-9050-48e45aa277fe" containerName="extract-utilities" Mar 19 16:43:06 crc kubenswrapper[4918]: E0319 16:43:06.928887 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3044e214-7f52-423c-98a6-03a05ed008a1" containerName="extract-content" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.928896 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="3044e214-7f52-423c-98a6-03a05ed008a1" containerName="extract-content" Mar 19 16:43:06 crc kubenswrapper[4918]: E0319 16:43:06.928906 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c06d493-b3ec-42b0-9050-48e45aa277fe" containerName="extract-content" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.928914 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c06d493-b3ec-42b0-9050-48e45aa277fe" containerName="extract-content" Mar 19 16:43:06 crc kubenswrapper[4918]: E0319 16:43:06.928927 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3044e214-7f52-423c-98a6-03a05ed008a1" containerName="extract-utilities" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.928936 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="3044e214-7f52-423c-98a6-03a05ed008a1" containerName="extract-utilities" Mar 19 16:43:06 crc kubenswrapper[4918]: E0319 16:43:06.928945 4918 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83708522-86b5-47d3-9f69-3bb7a645bb39" containerName="registry-server" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.928953 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="83708522-86b5-47d3-9f69-3bb7a645bb39" containerName="registry-server" Mar 19 16:43:06 crc kubenswrapper[4918]: E0319 16:43:06.928967 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c06d493-b3ec-42b0-9050-48e45aa277fe" containerName="registry-server" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.928977 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c06d493-b3ec-42b0-9050-48e45aa277fe" containerName="registry-server" Mar 19 16:43:06 crc kubenswrapper[4918]: E0319 16:43:06.928990 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83e7cab-2b89-44a9-ba34-38a3ca882e6b" containerName="route-controller-manager" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.928999 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83e7cab-2b89-44a9-ba34-38a3ca882e6b" containerName="route-controller-manager" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.929137 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="d71260c1-584f-4c13-a911-e6110f79affa" containerName="controller-manager" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.929151 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="3044e214-7f52-423c-98a6-03a05ed008a1" containerName="registry-server" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.929164 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="c83e7cab-2b89-44a9-ba34-38a3ca882e6b" containerName="route-controller-manager" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.929176 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c06d493-b3ec-42b0-9050-48e45aa277fe" containerName="registry-server" Mar 19 16:43:06 crc 
kubenswrapper[4918]: I0319 16:43:06.929186 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="83708522-86b5-47d3-9f69-3bb7a645bb39" containerName="registry-server" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.929196 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="490c710f-78b8-41a4-b4bc-4eeffdde7a5d" containerName="oc" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.929781 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-94dccb7bb-xqp6f" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.932622 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.932657 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.932860 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.933132 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.933512 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.934695 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.936250 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-864df45cb-mps24"] Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.937452 4918 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-864df45cb-mps24" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.948334 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.948498 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.948756 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.948871 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.948878 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.949166 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.956860 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-94dccb7bb-xqp6f"] Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.964465 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-864df45cb-mps24"] Mar 19 16:43:06 crc kubenswrapper[4918]: I0319 16:43:06.965555 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.014771 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xzxg8\" (UniqueName: \"kubernetes.io/projected/a6ead9a1-5028-44ca-bd26-ecfa424c7ddb-kube-api-access-xzxg8\") pod \"controller-manager-94dccb7bb-xqp6f\" (UID: \"a6ead9a1-5028-44ca-bd26-ecfa424c7ddb\") " pod="openshift-controller-manager/controller-manager-94dccb7bb-xqp6f" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.014810 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6ead9a1-5028-44ca-bd26-ecfa424c7ddb-config\") pod \"controller-manager-94dccb7bb-xqp6f\" (UID: \"a6ead9a1-5028-44ca-bd26-ecfa424c7ddb\") " pod="openshift-controller-manager/controller-manager-94dccb7bb-xqp6f" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.014961 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6ead9a1-5028-44ca-bd26-ecfa424c7ddb-proxy-ca-bundles\") pod \"controller-manager-94dccb7bb-xqp6f\" (UID: \"a6ead9a1-5028-44ca-bd26-ecfa424c7ddb\") " pod="openshift-controller-manager/controller-manager-94dccb7bb-xqp6f" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.014992 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6ead9a1-5028-44ca-bd26-ecfa424c7ddb-client-ca\") pod \"controller-manager-94dccb7bb-xqp6f\" (UID: \"a6ead9a1-5028-44ca-bd26-ecfa424c7ddb\") " pod="openshift-controller-manager/controller-manager-94dccb7bb-xqp6f" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.015008 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6ead9a1-5028-44ca-bd26-ecfa424c7ddb-serving-cert\") pod \"controller-manager-94dccb7bb-xqp6f\" (UID: \"a6ead9a1-5028-44ca-bd26-ecfa424c7ddb\") " 
pod="openshift-controller-manager/controller-manager-94dccb7bb-xqp6f" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.116155 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6ead9a1-5028-44ca-bd26-ecfa424c7ddb-config\") pod \"controller-manager-94dccb7bb-xqp6f\" (UID: \"a6ead9a1-5028-44ca-bd26-ecfa424c7ddb\") " pod="openshift-controller-manager/controller-manager-94dccb7bb-xqp6f" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.116268 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzxg8\" (UniqueName: \"kubernetes.io/projected/a6ead9a1-5028-44ca-bd26-ecfa424c7ddb-kube-api-access-xzxg8\") pod \"controller-manager-94dccb7bb-xqp6f\" (UID: \"a6ead9a1-5028-44ca-bd26-ecfa424c7ddb\") " pod="openshift-controller-manager/controller-manager-94dccb7bb-xqp6f" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.116315 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69bdd5b0-6ace-4e91-bd4a-d6935b942f64-serving-cert\") pod \"route-controller-manager-864df45cb-mps24\" (UID: \"69bdd5b0-6ace-4e91-bd4a-d6935b942f64\") " pod="openshift-route-controller-manager/route-controller-manager-864df45cb-mps24" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.116404 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69bdd5b0-6ace-4e91-bd4a-d6935b942f64-client-ca\") pod \"route-controller-manager-864df45cb-mps24\" (UID: \"69bdd5b0-6ace-4e91-bd4a-d6935b942f64\") " pod="openshift-route-controller-manager/route-controller-manager-864df45cb-mps24" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.116729 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5kqn\" 
(UniqueName: \"kubernetes.io/projected/69bdd5b0-6ace-4e91-bd4a-d6935b942f64-kube-api-access-g5kqn\") pod \"route-controller-manager-864df45cb-mps24\" (UID: \"69bdd5b0-6ace-4e91-bd4a-d6935b942f64\") " pod="openshift-route-controller-manager/route-controller-manager-864df45cb-mps24" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.116836 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6ead9a1-5028-44ca-bd26-ecfa424c7ddb-proxy-ca-bundles\") pod \"controller-manager-94dccb7bb-xqp6f\" (UID: \"a6ead9a1-5028-44ca-bd26-ecfa424c7ddb\") " pod="openshift-controller-manager/controller-manager-94dccb7bb-xqp6f" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.116973 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6ead9a1-5028-44ca-bd26-ecfa424c7ddb-client-ca\") pod \"controller-manager-94dccb7bb-xqp6f\" (UID: \"a6ead9a1-5028-44ca-bd26-ecfa424c7ddb\") " pod="openshift-controller-manager/controller-manager-94dccb7bb-xqp6f" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.117028 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6ead9a1-5028-44ca-bd26-ecfa424c7ddb-serving-cert\") pod \"controller-manager-94dccb7bb-xqp6f\" (UID: \"a6ead9a1-5028-44ca-bd26-ecfa424c7ddb\") " pod="openshift-controller-manager/controller-manager-94dccb7bb-xqp6f" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.117144 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69bdd5b0-6ace-4e91-bd4a-d6935b942f64-config\") pod \"route-controller-manager-864df45cb-mps24\" (UID: \"69bdd5b0-6ace-4e91-bd4a-d6935b942f64\") " pod="openshift-route-controller-manager/route-controller-manager-864df45cb-mps24" Mar 19 16:43:07 crc 
kubenswrapper[4918]: I0319 16:43:07.117446 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6ead9a1-5028-44ca-bd26-ecfa424c7ddb-config\") pod \"controller-manager-94dccb7bb-xqp6f\" (UID: \"a6ead9a1-5028-44ca-bd26-ecfa424c7ddb\") " pod="openshift-controller-manager/controller-manager-94dccb7bb-xqp6f" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.118563 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6ead9a1-5028-44ca-bd26-ecfa424c7ddb-client-ca\") pod \"controller-manager-94dccb7bb-xqp6f\" (UID: \"a6ead9a1-5028-44ca-bd26-ecfa424c7ddb\") " pod="openshift-controller-manager/controller-manager-94dccb7bb-xqp6f" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.119005 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6ead9a1-5028-44ca-bd26-ecfa424c7ddb-proxy-ca-bundles\") pod \"controller-manager-94dccb7bb-xqp6f\" (UID: \"a6ead9a1-5028-44ca-bd26-ecfa424c7ddb\") " pod="openshift-controller-manager/controller-manager-94dccb7bb-xqp6f" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.124608 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6ead9a1-5028-44ca-bd26-ecfa424c7ddb-serving-cert\") pod \"controller-manager-94dccb7bb-xqp6f\" (UID: \"a6ead9a1-5028-44ca-bd26-ecfa424c7ddb\") " pod="openshift-controller-manager/controller-manager-94dccb7bb-xqp6f" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.142511 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzxg8\" (UniqueName: \"kubernetes.io/projected/a6ead9a1-5028-44ca-bd26-ecfa424c7ddb-kube-api-access-xzxg8\") pod \"controller-manager-94dccb7bb-xqp6f\" (UID: \"a6ead9a1-5028-44ca-bd26-ecfa424c7ddb\") " 
pod="openshift-controller-manager/controller-manager-94dccb7bb-xqp6f" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.218645 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5kqn\" (UniqueName: \"kubernetes.io/projected/69bdd5b0-6ace-4e91-bd4a-d6935b942f64-kube-api-access-g5kqn\") pod \"route-controller-manager-864df45cb-mps24\" (UID: \"69bdd5b0-6ace-4e91-bd4a-d6935b942f64\") " pod="openshift-route-controller-manager/route-controller-manager-864df45cb-mps24" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.218725 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69bdd5b0-6ace-4e91-bd4a-d6935b942f64-config\") pod \"route-controller-manager-864df45cb-mps24\" (UID: \"69bdd5b0-6ace-4e91-bd4a-d6935b942f64\") " pod="openshift-route-controller-manager/route-controller-manager-864df45cb-mps24" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.218762 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69bdd5b0-6ace-4e91-bd4a-d6935b942f64-serving-cert\") pod \"route-controller-manager-864df45cb-mps24\" (UID: \"69bdd5b0-6ace-4e91-bd4a-d6935b942f64\") " pod="openshift-route-controller-manager/route-controller-manager-864df45cb-mps24" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.218813 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/69bdd5b0-6ace-4e91-bd4a-d6935b942f64-client-ca\") pod \"route-controller-manager-864df45cb-mps24\" (UID: \"69bdd5b0-6ace-4e91-bd4a-d6935b942f64\") " pod="openshift-route-controller-manager/route-controller-manager-864df45cb-mps24" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.219859 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/69bdd5b0-6ace-4e91-bd4a-d6935b942f64-client-ca\") pod \"route-controller-manager-864df45cb-mps24\" (UID: \"69bdd5b0-6ace-4e91-bd4a-d6935b942f64\") " pod="openshift-route-controller-manager/route-controller-manager-864df45cb-mps24" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.220305 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69bdd5b0-6ace-4e91-bd4a-d6935b942f64-config\") pod \"route-controller-manager-864df45cb-mps24\" (UID: \"69bdd5b0-6ace-4e91-bd4a-d6935b942f64\") " pod="openshift-route-controller-manager/route-controller-manager-864df45cb-mps24" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.224749 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69bdd5b0-6ace-4e91-bd4a-d6935b942f64-serving-cert\") pod \"route-controller-manager-864df45cb-mps24\" (UID: \"69bdd5b0-6ace-4e91-bd4a-d6935b942f64\") " pod="openshift-route-controller-manager/route-controller-manager-864df45cb-mps24" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.242048 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5kqn\" (UniqueName: \"kubernetes.io/projected/69bdd5b0-6ace-4e91-bd4a-d6935b942f64-kube-api-access-g5kqn\") pod \"route-controller-manager-864df45cb-mps24\" (UID: \"69bdd5b0-6ace-4e91-bd4a-d6935b942f64\") " pod="openshift-route-controller-manager/route-controller-manager-864df45cb-mps24" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.252018 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-94dccb7bb-xqp6f" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.262479 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-864df45cb-mps24" Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.577310 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-864df45cb-mps24"] Mar 19 16:43:07 crc kubenswrapper[4918]: I0319 16:43:07.709184 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-94dccb7bb-xqp6f"] Mar 19 16:43:08 crc kubenswrapper[4918]: I0319 16:43:08.471261 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-864df45cb-mps24" event={"ID":"69bdd5b0-6ace-4e91-bd4a-d6935b942f64","Type":"ContainerStarted","Data":"518da8f7733910d4d387570d3e2878bbb1d310c49b295cd8caf91cc289732179"} Mar 19 16:43:08 crc kubenswrapper[4918]: I0319 16:43:08.471665 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-864df45cb-mps24" Mar 19 16:43:08 crc kubenswrapper[4918]: I0319 16:43:08.471681 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-864df45cb-mps24" event={"ID":"69bdd5b0-6ace-4e91-bd4a-d6935b942f64","Type":"ContainerStarted","Data":"7e9b0eea776654613a423f07aa648f865e22e2421e616cb4d155d0268eb67808"} Mar 19 16:43:08 crc kubenswrapper[4918]: I0319 16:43:08.473070 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-94dccb7bb-xqp6f" event={"ID":"a6ead9a1-5028-44ca-bd26-ecfa424c7ddb","Type":"ContainerStarted","Data":"3638e09db74c62153c5a6ebbce7e4a304d56c628ddab437da6cd60e39a9148af"} Mar 19 16:43:08 crc kubenswrapper[4918]: I0319 16:43:08.473100 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-94dccb7bb-xqp6f" 
event={"ID":"a6ead9a1-5028-44ca-bd26-ecfa424c7ddb","Type":"ContainerStarted","Data":"da36d927d7dc1d9c41058f1db814a190a8d3c07b37aae184a45f0d0e9eb99bd9"} Mar 19 16:43:08 crc kubenswrapper[4918]: I0319 16:43:08.473401 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-94dccb7bb-xqp6f" Mar 19 16:43:08 crc kubenswrapper[4918]: I0319 16:43:08.477834 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-94dccb7bb-xqp6f" Mar 19 16:43:08 crc kubenswrapper[4918]: I0319 16:43:08.479604 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-864df45cb-mps24" Mar 19 16:43:08 crc kubenswrapper[4918]: I0319 16:43:08.494952 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-864df45cb-mps24" podStartSLOduration=3.494923898 podStartE2EDuration="3.494923898s" podCreationTimestamp="2026-03-19 16:43:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:43:08.488110878 +0000 UTC m=+200.610310146" watchObservedRunningTime="2026-03-19 16:43:08.494923898 +0000 UTC m=+200.617123186" Mar 19 16:43:08 crc kubenswrapper[4918]: I0319 16:43:08.543853 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-94dccb7bb-xqp6f" podStartSLOduration=3.5438290329999997 podStartE2EDuration="3.543829033s" podCreationTimestamp="2026-03-19 16:43:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:43:08.513993785 +0000 UTC m=+200.636193043" watchObservedRunningTime="2026-03-19 16:43:08.543829033 +0000 UTC m=+200.666028301" Mar 19 
16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.811580 4918 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.813171 4918 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.813361 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.813625 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://ae177b68caa2c1b58058321f0c704e0fb9b1effa2a9a21e9e95eadbbe7f94c94" gracePeriod=15 Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.813745 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://f77c346ac954c08fd292e1815f840e0630ee7300fac9018fea8edaed5958ba82" gracePeriod=15 Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.813797 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://68930b0ae7ef86c98b102ad42bd61b117d8d7ca5126b07f860d59bccf76959a0" gracePeriod=15 Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.813740 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" 
containerID="cri-o://dc8bbba866da83ed86b3064f76103ffe9e2795e86a4bfbed3d9fbde9f644418d" gracePeriod=15 Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.814017 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://45d4d59ce45ccdd76b6c03a4ddc782404f7593fba1bc4b7af195267bd8333d04" gracePeriod=15 Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.814274 4918 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 16:43:19 crc kubenswrapper[4918]: E0319 16:43:19.814557 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.814577 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 19 16:43:19 crc kubenswrapper[4918]: E0319 16:43:19.814590 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.814600 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 19 16:43:19 crc kubenswrapper[4918]: E0319 16:43:19.814612 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.814621 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 19 16:43:19 crc kubenswrapper[4918]: E0319 16:43:19.814633 4918 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.814640 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 16:43:19 crc kubenswrapper[4918]: E0319 16:43:19.814650 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.814657 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 16:43:19 crc kubenswrapper[4918]: E0319 16:43:19.814672 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.814680 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 16:43:19 crc kubenswrapper[4918]: E0319 16:43:19.814689 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.814696 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 16:43:19 crc kubenswrapper[4918]: E0319 16:43:19.814707 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.814714 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 
19 16:43:19 crc kubenswrapper[4918]: E0319 16:43:19.814729 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.814737 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.814866 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.814881 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.814889 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.814900 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.814912 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.814920 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.814931 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 19 16:43:19 crc kubenswrapper[4918]: E0319 16:43:19.815046 4918 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.815055 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.815178 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.815187 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.913252 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.913344 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.913582 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 
16:43:19.913707 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.913752 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.913772 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.913831 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:43:19 crc kubenswrapper[4918]: I0319 16:43:19.913932 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 16:43:20 crc 
kubenswrapper[4918]: I0319 16:43:20.014730 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 16:43:20 crc kubenswrapper[4918]: I0319 16:43:20.014859 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 16:43:20 crc kubenswrapper[4918]: I0319 16:43:20.014877 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 16:43:20 crc kubenswrapper[4918]: I0319 16:43:20.014913 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 16:43:20 crc kubenswrapper[4918]: I0319 16:43:20.014955 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 16:43:20 
crc kubenswrapper[4918]: I0319 16:43:20.014958 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:43:20 crc kubenswrapper[4918]: I0319 16:43:20.015014 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 16:43:20 crc kubenswrapper[4918]: I0319 16:43:20.015045 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:43:20 crc kubenswrapper[4918]: I0319 16:43:20.015054 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 16:43:20 crc kubenswrapper[4918]: I0319 16:43:20.015075 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 16:43:20 crc kubenswrapper[4918]: I0319 16:43:20.015018 4918 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:43:20 crc kubenswrapper[4918]: I0319 16:43:20.015125 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 16:43:20 crc kubenswrapper[4918]: I0319 16:43:20.015166 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:43:20 crc kubenswrapper[4918]: I0319 16:43:20.015207 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:43:20 crc kubenswrapper[4918]: I0319 16:43:20.015239 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:43:20 crc kubenswrapper[4918]: I0319 16:43:20.015207 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") 
pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 16:43:20 crc kubenswrapper[4918]: I0319 16:43:20.559846 4918 generic.go:334] "Generic (PLEG): container finished" podID="1a36234d-cdf5-47a0-a4dd-405a166c6ff7" containerID="1154601c65417a9009d202880635f2af6174b1ac93dba6f8403412c41c0a7800" exitCode=0 Mar 19 16:43:20 crc kubenswrapper[4918]: I0319 16:43:20.560472 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1a36234d-cdf5-47a0-a4dd-405a166c6ff7","Type":"ContainerDied","Data":"1154601c65417a9009d202880635f2af6174b1ac93dba6f8403412c41c0a7800"} Mar 19 16:43:20 crc kubenswrapper[4918]: I0319 16:43:20.562363 4918 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:20 crc kubenswrapper[4918]: I0319 16:43:20.562932 4918 status_manager.go:851] "Failed to get status for pod" podUID="1a36234d-cdf5-47a0-a4dd-405a166c6ff7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:20 crc kubenswrapper[4918]: I0319 16:43:20.566668 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 16:43:20 crc kubenswrapper[4918]: I0319 16:43:20.568604 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 16:43:20 crc 
kubenswrapper[4918]: I0319 16:43:20.569307 4918 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="45d4d59ce45ccdd76b6c03a4ddc782404f7593fba1bc4b7af195267bd8333d04" exitCode=0 Mar 19 16:43:20 crc kubenswrapper[4918]: I0319 16:43:20.569333 4918 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f77c346ac954c08fd292e1815f840e0630ee7300fac9018fea8edaed5958ba82" exitCode=0 Mar 19 16:43:20 crc kubenswrapper[4918]: I0319 16:43:20.569348 4918 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dc8bbba866da83ed86b3064f76103ffe9e2795e86a4bfbed3d9fbde9f644418d" exitCode=0 Mar 19 16:43:20 crc kubenswrapper[4918]: I0319 16:43:20.569359 4918 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="68930b0ae7ef86c98b102ad42bd61b117d8d7ca5126b07f860d59bccf76959a0" exitCode=2 Mar 19 16:43:20 crc kubenswrapper[4918]: I0319 16:43:20.569414 4918 scope.go:117] "RemoveContainer" containerID="f52a359a1ac292a614b20c79a490412ec1b7e37ecf7dfc7576babdc09dfe0ea2" Mar 19 16:43:21 crc kubenswrapper[4918]: I0319 16:43:21.580106 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.092083 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.092851 4918 status_manager.go:851] "Failed to get status for pod" podUID="1a36234d-cdf5-47a0-a4dd-405a166c6ff7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.186781 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.188136 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.188843 4918 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.189562 4918 status_manager.go:851] "Failed to get status for pod" podUID="1a36234d-cdf5-47a0-a4dd-405a166c6ff7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.241851 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1a36234d-cdf5-47a0-a4dd-405a166c6ff7-kubelet-dir\") pod \"1a36234d-cdf5-47a0-a4dd-405a166c6ff7\" (UID: 
\"1a36234d-cdf5-47a0-a4dd-405a166c6ff7\") " Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.242028 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a36234d-cdf5-47a0-a4dd-405a166c6ff7-kube-api-access\") pod \"1a36234d-cdf5-47a0-a4dd-405a166c6ff7\" (UID: \"1a36234d-cdf5-47a0-a4dd-405a166c6ff7\") " Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.242226 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1a36234d-cdf5-47a0-a4dd-405a166c6ff7-var-lock\") pod \"1a36234d-cdf5-47a0-a4dd-405a166c6ff7\" (UID: \"1a36234d-cdf5-47a0-a4dd-405a166c6ff7\") " Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.242023 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a36234d-cdf5-47a0-a4dd-405a166c6ff7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1a36234d-cdf5-47a0-a4dd-405a166c6ff7" (UID: "1a36234d-cdf5-47a0-a4dd-405a166c6ff7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.242433 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a36234d-cdf5-47a0-a4dd-405a166c6ff7-var-lock" (OuterVolumeSpecName: "var-lock") pod "1a36234d-cdf5-47a0-a4dd-405a166c6ff7" (UID: "1a36234d-cdf5-47a0-a4dd-405a166c6ff7"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.242876 4918 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1a36234d-cdf5-47a0-a4dd-405a166c6ff7-var-lock\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.242963 4918 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1a36234d-cdf5-47a0-a4dd-405a166c6ff7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.251273 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a36234d-cdf5-47a0-a4dd-405a166c6ff7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1a36234d-cdf5-47a0-a4dd-405a166c6ff7" (UID: "1a36234d-cdf5-47a0-a4dd-405a166c6ff7"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.344133 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.344241 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.344242 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: 
"f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.344294 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.344324 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.344400 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.344716 4918 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.344742 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a36234d-cdf5-47a0-a4dd-405a166c6ff7-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.344763 4918 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.344782 4918 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.591623 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.596988 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.598194 4918 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ae177b68caa2c1b58058321f0c704e0fb9b1effa2a9a21e9e95eadbbe7f94c94" exitCode=0 Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.599829 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.601490 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.603938 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1a36234d-cdf5-47a0-a4dd-405a166c6ff7","Type":"ContainerDied","Data":"092c0ef0c5b8f14701b59133032a8f357e03efc91aa1af1979ebcb59b799b326"} Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.604026 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="092c0ef0c5b8f14701b59133032a8f357e03efc91aa1af1979ebcb59b799b326" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.604169 4918 scope.go:117] "RemoveContainer" containerID="45d4d59ce45ccdd76b6c03a4ddc782404f7593fba1bc4b7af195267bd8333d04" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.616206 4918 status_manager.go:851] "Failed to get status for pod" podUID="1a36234d-cdf5-47a0-a4dd-405a166c6ff7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.632911 4918 scope.go:117] "RemoveContainer" containerID="f77c346ac954c08fd292e1815f840e0630ee7300fac9018fea8edaed5958ba82" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.638076 4918 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:22 crc 
kubenswrapper[4918]: I0319 16:43:22.639028 4918 status_manager.go:851] "Failed to get status for pod" podUID="1a36234d-cdf5-47a0-a4dd-405a166c6ff7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.654557 4918 scope.go:117] "RemoveContainer" containerID="dc8bbba866da83ed86b3064f76103ffe9e2795e86a4bfbed3d9fbde9f644418d" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.676286 4918 scope.go:117] "RemoveContainer" containerID="68930b0ae7ef86c98b102ad42bd61b117d8d7ca5126b07f860d59bccf76959a0" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.694313 4918 scope.go:117] "RemoveContainer" containerID="ae177b68caa2c1b58058321f0c704e0fb9b1effa2a9a21e9e95eadbbe7f94c94" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.712933 4918 scope.go:117] "RemoveContainer" containerID="ae959ea338a301713af4344c7eef48f5d2562204a83bce2ffee8e51c6bcda4cd" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.740821 4918 scope.go:117] "RemoveContainer" containerID="45d4d59ce45ccdd76b6c03a4ddc782404f7593fba1bc4b7af195267bd8333d04" Mar 19 16:43:22 crc kubenswrapper[4918]: E0319 16:43:22.741443 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45d4d59ce45ccdd76b6c03a4ddc782404f7593fba1bc4b7af195267bd8333d04\": container with ID starting with 45d4d59ce45ccdd76b6c03a4ddc782404f7593fba1bc4b7af195267bd8333d04 not found: ID does not exist" containerID="45d4d59ce45ccdd76b6c03a4ddc782404f7593fba1bc4b7af195267bd8333d04" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.741583 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45d4d59ce45ccdd76b6c03a4ddc782404f7593fba1bc4b7af195267bd8333d04"} err="failed to get container status 
\"45d4d59ce45ccdd76b6c03a4ddc782404f7593fba1bc4b7af195267bd8333d04\": rpc error: code = NotFound desc = could not find container \"45d4d59ce45ccdd76b6c03a4ddc782404f7593fba1bc4b7af195267bd8333d04\": container with ID starting with 45d4d59ce45ccdd76b6c03a4ddc782404f7593fba1bc4b7af195267bd8333d04 not found: ID does not exist" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.741652 4918 scope.go:117] "RemoveContainer" containerID="f77c346ac954c08fd292e1815f840e0630ee7300fac9018fea8edaed5958ba82" Mar 19 16:43:22 crc kubenswrapper[4918]: E0319 16:43:22.742172 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f77c346ac954c08fd292e1815f840e0630ee7300fac9018fea8edaed5958ba82\": container with ID starting with f77c346ac954c08fd292e1815f840e0630ee7300fac9018fea8edaed5958ba82 not found: ID does not exist" containerID="f77c346ac954c08fd292e1815f840e0630ee7300fac9018fea8edaed5958ba82" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.742248 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f77c346ac954c08fd292e1815f840e0630ee7300fac9018fea8edaed5958ba82"} err="failed to get container status \"f77c346ac954c08fd292e1815f840e0630ee7300fac9018fea8edaed5958ba82\": rpc error: code = NotFound desc = could not find container \"f77c346ac954c08fd292e1815f840e0630ee7300fac9018fea8edaed5958ba82\": container with ID starting with f77c346ac954c08fd292e1815f840e0630ee7300fac9018fea8edaed5958ba82 not found: ID does not exist" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.742302 4918 scope.go:117] "RemoveContainer" containerID="dc8bbba866da83ed86b3064f76103ffe9e2795e86a4bfbed3d9fbde9f644418d" Mar 19 16:43:22 crc kubenswrapper[4918]: E0319 16:43:22.742909 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"dc8bbba866da83ed86b3064f76103ffe9e2795e86a4bfbed3d9fbde9f644418d\": container with ID starting with dc8bbba866da83ed86b3064f76103ffe9e2795e86a4bfbed3d9fbde9f644418d not found: ID does not exist" containerID="dc8bbba866da83ed86b3064f76103ffe9e2795e86a4bfbed3d9fbde9f644418d" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.742983 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc8bbba866da83ed86b3064f76103ffe9e2795e86a4bfbed3d9fbde9f644418d"} err="failed to get container status \"dc8bbba866da83ed86b3064f76103ffe9e2795e86a4bfbed3d9fbde9f644418d\": rpc error: code = NotFound desc = could not find container \"dc8bbba866da83ed86b3064f76103ffe9e2795e86a4bfbed3d9fbde9f644418d\": container with ID starting with dc8bbba866da83ed86b3064f76103ffe9e2795e86a4bfbed3d9fbde9f644418d not found: ID does not exist" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.743028 4918 scope.go:117] "RemoveContainer" containerID="68930b0ae7ef86c98b102ad42bd61b117d8d7ca5126b07f860d59bccf76959a0" Mar 19 16:43:22 crc kubenswrapper[4918]: E0319 16:43:22.743648 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68930b0ae7ef86c98b102ad42bd61b117d8d7ca5126b07f860d59bccf76959a0\": container with ID starting with 68930b0ae7ef86c98b102ad42bd61b117d8d7ca5126b07f860d59bccf76959a0 not found: ID does not exist" containerID="68930b0ae7ef86c98b102ad42bd61b117d8d7ca5126b07f860d59bccf76959a0" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.743689 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68930b0ae7ef86c98b102ad42bd61b117d8d7ca5126b07f860d59bccf76959a0"} err="failed to get container status \"68930b0ae7ef86c98b102ad42bd61b117d8d7ca5126b07f860d59bccf76959a0\": rpc error: code = NotFound desc = could not find container \"68930b0ae7ef86c98b102ad42bd61b117d8d7ca5126b07f860d59bccf76959a0\": container with ID 
starting with 68930b0ae7ef86c98b102ad42bd61b117d8d7ca5126b07f860d59bccf76959a0 not found: ID does not exist" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.743714 4918 scope.go:117] "RemoveContainer" containerID="ae177b68caa2c1b58058321f0c704e0fb9b1effa2a9a21e9e95eadbbe7f94c94" Mar 19 16:43:22 crc kubenswrapper[4918]: E0319 16:43:22.744086 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae177b68caa2c1b58058321f0c704e0fb9b1effa2a9a21e9e95eadbbe7f94c94\": container with ID starting with ae177b68caa2c1b58058321f0c704e0fb9b1effa2a9a21e9e95eadbbe7f94c94 not found: ID does not exist" containerID="ae177b68caa2c1b58058321f0c704e0fb9b1effa2a9a21e9e95eadbbe7f94c94" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.744154 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae177b68caa2c1b58058321f0c704e0fb9b1effa2a9a21e9e95eadbbe7f94c94"} err="failed to get container status \"ae177b68caa2c1b58058321f0c704e0fb9b1effa2a9a21e9e95eadbbe7f94c94\": rpc error: code = NotFound desc = could not find container \"ae177b68caa2c1b58058321f0c704e0fb9b1effa2a9a21e9e95eadbbe7f94c94\": container with ID starting with ae177b68caa2c1b58058321f0c704e0fb9b1effa2a9a21e9e95eadbbe7f94c94 not found: ID does not exist" Mar 19 16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.744199 4918 scope.go:117] "RemoveContainer" containerID="ae959ea338a301713af4344c7eef48f5d2562204a83bce2ffee8e51c6bcda4cd" Mar 19 16:43:22 crc kubenswrapper[4918]: E0319 16:43:22.744663 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae959ea338a301713af4344c7eef48f5d2562204a83bce2ffee8e51c6bcda4cd\": container with ID starting with ae959ea338a301713af4344c7eef48f5d2562204a83bce2ffee8e51c6bcda4cd not found: ID does not exist" containerID="ae959ea338a301713af4344c7eef48f5d2562204a83bce2ffee8e51c6bcda4cd" Mar 19 
16:43:22 crc kubenswrapper[4918]: I0319 16:43:22.744697 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae959ea338a301713af4344c7eef48f5d2562204a83bce2ffee8e51c6bcda4cd"} err="failed to get container status \"ae959ea338a301713af4344c7eef48f5d2562204a83bce2ffee8e51c6bcda4cd\": rpc error: code = NotFound desc = could not find container \"ae959ea338a301713af4344c7eef48f5d2562204a83bce2ffee8e51c6bcda4cd\": container with ID starting with ae959ea338a301713af4344c7eef48f5d2562204a83bce2ffee8e51c6bcda4cd not found: ID does not exist" Mar 19 16:43:23 crc kubenswrapper[4918]: I0319 16:43:23.929493 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" podUID="4f3fcd84-785e-4d3f-9911-1d49a4b33dc9" containerName="oauth-openshift" containerID="cri-o://bc165c90848f03f70dbd0cb1781fa0c040a267febffa2e4bf4ccc8b06c69470f" gracePeriod=15 Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.486658 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.487861 4918 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.488270 4918 status_manager.go:851] "Failed to get status for pod" podUID="1a36234d-cdf5-47a0-a4dd-405a166c6ff7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.488748 4918 status_manager.go:851] "Failed to get status for pod" podUID="4f3fcd84-785e-4d3f-9911-1d49a4b33dc9" pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-h9xcq\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.614482 4918 generic.go:334] "Generic (PLEG): container finished" podID="4f3fcd84-785e-4d3f-9911-1d49a4b33dc9" containerID="bc165c90848f03f70dbd0cb1781fa0c040a267febffa2e4bf4ccc8b06c69470f" exitCode=0 Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.614602 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.614588 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" event={"ID":"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9","Type":"ContainerDied","Data":"bc165c90848f03f70dbd0cb1781fa0c040a267febffa2e4bf4ccc8b06c69470f"} Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.614675 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" event={"ID":"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9","Type":"ContainerDied","Data":"4a6b0ddf1a2044e78eb49272d4c1072f0e76af7a65b29d51bbfce6532c965fba"} Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.614700 4918 scope.go:117] "RemoveContainer" containerID="bc165c90848f03f70dbd0cb1781fa0c040a267febffa2e4bf4ccc8b06c69470f" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.615189 4918 status_manager.go:851] "Failed to get status for pod" podUID="1a36234d-cdf5-47a0-a4dd-405a166c6ff7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.615580 4918 status_manager.go:851] "Failed to get status for pod" podUID="4f3fcd84-785e-4d3f-9911-1d49a4b33dc9" pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-h9xcq\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.643379 4918 scope.go:117] "RemoveContainer" containerID="bc165c90848f03f70dbd0cb1781fa0c040a267febffa2e4bf4ccc8b06c69470f" Mar 19 16:43:24 crc kubenswrapper[4918]: E0319 16:43:24.643885 4918 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc165c90848f03f70dbd0cb1781fa0c040a267febffa2e4bf4ccc8b06c69470f\": container with ID starting with bc165c90848f03f70dbd0cb1781fa0c040a267febffa2e4bf4ccc8b06c69470f not found: ID does not exist" containerID="bc165c90848f03f70dbd0cb1781fa0c040a267febffa2e4bf4ccc8b06c69470f" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.643921 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc165c90848f03f70dbd0cb1781fa0c040a267febffa2e4bf4ccc8b06c69470f"} err="failed to get container status \"bc165c90848f03f70dbd0cb1781fa0c040a267febffa2e4bf4ccc8b06c69470f\": rpc error: code = NotFound desc = could not find container \"bc165c90848f03f70dbd0cb1781fa0c040a267febffa2e4bf4ccc8b06c69470f\": container with ID starting with bc165c90848f03f70dbd0cb1781fa0c040a267febffa2e4bf4ccc8b06c69470f not found: ID does not exist" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.675478 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-serving-cert\") pod \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.675565 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-user-idp-0-file-data\") pod \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.675598 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-user-template-provider-selection\") pod \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.675671 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-user-template-login\") pod \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.675698 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-session\") pod \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.675725 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-audit-dir\") pod \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.675756 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-cliconfig\") pod \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.675839 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "4f3fcd84-785e-4d3f-9911-1d49a4b33dc9" (UID: 
"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.675884 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-audit-policies\") pod \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.676012 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-ocp-branding-template\") pod \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.676060 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-trusted-ca-bundle\") pod \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.676094 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-service-ca\") pod \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.676198 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-user-template-error\") pod 
\"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.676237 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bh8d\" (UniqueName: \"kubernetes.io/projected/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-kube-api-access-7bh8d\") pod \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.676288 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-router-certs\") pod \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\" (UID: \"4f3fcd84-785e-4d3f-9911-1d49a4b33dc9\") " Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.677096 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "4f3fcd84-785e-4d3f-9911-1d49a4b33dc9" (UID: "4f3fcd84-785e-4d3f-9911-1d49a4b33dc9"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.677783 4918 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.677814 4918 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.678234 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "4f3fcd84-785e-4d3f-9911-1d49a4b33dc9" (UID: "4f3fcd84-785e-4d3f-9911-1d49a4b33dc9"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.678413 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "4f3fcd84-785e-4d3f-9911-1d49a4b33dc9" (UID: "4f3fcd84-785e-4d3f-9911-1d49a4b33dc9"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.679302 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "4f3fcd84-785e-4d3f-9911-1d49a4b33dc9" (UID: "4f3fcd84-785e-4d3f-9911-1d49a4b33dc9"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.682848 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "4f3fcd84-785e-4d3f-9911-1d49a4b33dc9" (UID: "4f3fcd84-785e-4d3f-9911-1d49a4b33dc9"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.683786 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "4f3fcd84-785e-4d3f-9911-1d49a4b33dc9" (UID: "4f3fcd84-785e-4d3f-9911-1d49a4b33dc9"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.684195 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "4f3fcd84-785e-4d3f-9911-1d49a4b33dc9" (UID: "4f3fcd84-785e-4d3f-9911-1d49a4b33dc9"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.684230 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-kube-api-access-7bh8d" (OuterVolumeSpecName: "kube-api-access-7bh8d") pod "4f3fcd84-785e-4d3f-9911-1d49a4b33dc9" (UID: "4f3fcd84-785e-4d3f-9911-1d49a4b33dc9"). InnerVolumeSpecName "kube-api-access-7bh8d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.684344 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "4f3fcd84-785e-4d3f-9911-1d49a4b33dc9" (UID: "4f3fcd84-785e-4d3f-9911-1d49a4b33dc9"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.684617 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "4f3fcd84-785e-4d3f-9911-1d49a4b33dc9" (UID: "4f3fcd84-785e-4d3f-9911-1d49a4b33dc9"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.684910 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "4f3fcd84-785e-4d3f-9911-1d49a4b33dc9" (UID: "4f3fcd84-785e-4d3f-9911-1d49a4b33dc9"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.685026 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "4f3fcd84-785e-4d3f-9911-1d49a4b33dc9" (UID: "4f3fcd84-785e-4d3f-9911-1d49a4b33dc9"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.686693 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "4f3fcd84-785e-4d3f-9911-1d49a4b33dc9" (UID: "4f3fcd84-785e-4d3f-9911-1d49a4b33dc9"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.779116 4918 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.779175 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bh8d\" (UniqueName: \"kubernetes.io/projected/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-kube-api-access-7bh8d\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.779197 4918 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.779219 4918 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.779240 4918 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.779262 4918 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.779283 4918 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.779303 4918 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.779323 4918 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.779349 4918 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.779376 4918 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 
16:43:24.779403 4918 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 16:43:24 crc kubenswrapper[4918]: E0319 16:43:24.849188 4918 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.142:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.849796 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 16:43:24 crc kubenswrapper[4918]: W0319 16:43:24.892950 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-f1be49284177000406b073d653b8c71a331f5e5b31885d18ac689948c7c520af WatchSource:0}: Error finding container f1be49284177000406b073d653b8c71a331f5e5b31885d18ac689948c7c520af: Status 404 returned error can't find the container with id f1be49284177000406b073d653b8c71a331f5e5b31885d18ac689948c7c520af Mar 19 16:43:24 crc kubenswrapper[4918]: E0319 16:43:24.896802 4918 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.142:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e4bc32e31a42f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:43:24.896265263 +0000 UTC m=+217.018464522,LastTimestamp:2026-03-19 16:43:24.896265263 +0000 UTC m=+217.018464522,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.932409 4918 status_manager.go:851] "Failed to get status for pod" podUID="1a36234d-cdf5-47a0-a4dd-405a166c6ff7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:24 crc kubenswrapper[4918]: I0319 16:43:24.932823 4918 status_manager.go:851] "Failed to get status for pod" podUID="4f3fcd84-785e-4d3f-9911-1d49a4b33dc9" pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-h9xcq\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:25 crc kubenswrapper[4918]: E0319 16:43:25.189667 4918 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.142:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e4bc32e31a42f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 16:43:24.896265263 +0000 UTC m=+217.018464522,LastTimestamp:2026-03-19 16:43:24.896265263 +0000 UTC m=+217.018464522,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 16:43:25 crc kubenswrapper[4918]: I0319 16:43:25.623068 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"89be1a8f90e5bb284cac02772f9691efe50cd2a5c83e467350c3962c98f6abf7"} Mar 19 16:43:25 crc kubenswrapper[4918]: I0319 16:43:25.623116 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f1be49284177000406b073d653b8c71a331f5e5b31885d18ac689948c7c520af"} Mar 19 16:43:25 crc kubenswrapper[4918]: E0319 16:43:25.623758 4918 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.142:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 16:43:25 crc kubenswrapper[4918]: I0319 16:43:25.623859 4918 status_manager.go:851] "Failed to get status for pod" podUID="1a36234d-cdf5-47a0-a4dd-405a166c6ff7" pod="openshift-kube-apiserver/installer-9-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:25 crc kubenswrapper[4918]: I0319 16:43:25.624284 4918 status_manager.go:851] "Failed to get status for pod" podUID="4f3fcd84-785e-4d3f-9911-1d49a4b33dc9" pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-h9xcq\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:28 crc kubenswrapper[4918]: I0319 16:43:28.212043 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 16:43:28 crc kubenswrapper[4918]: I0319 16:43:28.212458 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 16:43:28 crc kubenswrapper[4918]: I0319 16:43:28.592881 4918 status_manager.go:851] "Failed to get status for pod" podUID="1a36234d-cdf5-47a0-a4dd-405a166c6ff7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:28 crc kubenswrapper[4918]: I0319 16:43:28.593411 4918 status_manager.go:851] "Failed to get status for pod" podUID="4f3fcd84-785e-4d3f-9911-1d49a4b33dc9" pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-h9xcq\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:28 crc kubenswrapper[4918]: E0319 16:43:28.811009 4918 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:28 crc kubenswrapper[4918]: E0319 16:43:28.811712 4918 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:28 crc kubenswrapper[4918]: E0319 16:43:28.812284 4918 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:28 crc kubenswrapper[4918]: E0319 16:43:28.812668 4918 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:28 crc kubenswrapper[4918]: E0319 16:43:28.813155 4918 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:28 crc kubenswrapper[4918]: I0319 16:43:28.813212 4918 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 19 16:43:28 crc kubenswrapper[4918]: E0319 16:43:28.813713 4918 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" interval="200ms" Mar 19 16:43:29 crc kubenswrapper[4918]: E0319 16:43:29.015192 4918 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" interval="400ms" Mar 19 16:43:29 crc kubenswrapper[4918]: E0319 16:43:29.416773 4918 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" interval="800ms" Mar 19 16:43:29 crc kubenswrapper[4918]: E0319 16:43:29.485467 4918 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:43:29Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:43:29Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:43:29Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T16:43:29Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:2fd3c01420dada0aac3ddcf5f3e15bb4f77216eb1b18b7543cb5c955674faac6\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:b2e45bfeec42763e914803d0552e0f83028f1caf63487926ea163bb344ae59a4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746814424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:513006150b0ffeb5c18c2d7670d536b166fc89cc16dfe913a737c16c2912740d\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7703ae5aafba3e668fe9da8330f3587d12399ee573eb0bb81ad12b828a3f94b4\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252054743},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"
sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:3a0729ac2d723d4dfbe8dab8121792a6f3caebacff42048e4ed85dd2ae1ca741\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d1d373f7f344e2d85cc27acf105a4ab3f429077302678e1323eef34664302ac7\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1223675094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},
{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d
8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":
[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:29 crc kubenswrapper[4918]: E0319 16:43:29.486384 4918 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:29 crc kubenswrapper[4918]: E0319 16:43:29.486824 4918 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:29 crc kubenswrapper[4918]: E0319 16:43:29.487146 4918 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:29 crc kubenswrapper[4918]: E0319 16:43:29.487679 4918 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:29 crc kubenswrapper[4918]: E0319 16:43:29.487730 4918 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 
16:43:30 crc kubenswrapper[4918]: E0319 16:43:30.218167 4918 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" interval="1.6s" Mar 19 16:43:31 crc kubenswrapper[4918]: I0319 16:43:31.585747 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:43:31 crc kubenswrapper[4918]: I0319 16:43:31.587655 4918 status_manager.go:851] "Failed to get status for pod" podUID="1a36234d-cdf5-47a0-a4dd-405a166c6ff7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:31 crc kubenswrapper[4918]: I0319 16:43:31.588160 4918 status_manager.go:851] "Failed to get status for pod" podUID="4f3fcd84-785e-4d3f-9911-1d49a4b33dc9" pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-h9xcq\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:31 crc kubenswrapper[4918]: I0319 16:43:31.613241 4918 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eaeabd68-1ae7-4595-8ec1-53ceca0d1cf7" Mar 19 16:43:31 crc kubenswrapper[4918]: I0319 16:43:31.613306 4918 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eaeabd68-1ae7-4595-8ec1-53ceca0d1cf7" Mar 19 16:43:31 crc kubenswrapper[4918]: E0319 16:43:31.614023 4918 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.142:6443: 
connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:43:31 crc kubenswrapper[4918]: I0319 16:43:31.614850 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:43:31 crc kubenswrapper[4918]: W0319 16:43:31.650578 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-39b9c80dc1291718683ddda298322001db6c73d3726e18333f45ec79b0680fdd WatchSource:0}: Error finding container 39b9c80dc1291718683ddda298322001db6c73d3726e18333f45ec79b0680fdd: Status 404 returned error can't find the container with id 39b9c80dc1291718683ddda298322001db6c73d3726e18333f45ec79b0680fdd Mar 19 16:43:31 crc kubenswrapper[4918]: I0319 16:43:31.671183 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"39b9c80dc1291718683ddda298322001db6c73d3726e18333f45ec79b0680fdd"} Mar 19 16:43:31 crc kubenswrapper[4918]: E0319 16:43:31.819596 4918 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" interval="3.2s" Mar 19 16:43:32 crc kubenswrapper[4918]: I0319 16:43:32.682673 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 19 16:43:32 crc kubenswrapper[4918]: I0319 16:43:32.684955 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 19 16:43:32 crc kubenswrapper[4918]: 
I0319 16:43:32.685025 4918 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="2abd1b66985995e1a44fc90e80d1a0acca10e7d483e7aca71531747026fb6a2e" exitCode=1 Mar 19 16:43:32 crc kubenswrapper[4918]: I0319 16:43:32.685119 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"2abd1b66985995e1a44fc90e80d1a0acca10e7d483e7aca71531747026fb6a2e"} Mar 19 16:43:32 crc kubenswrapper[4918]: I0319 16:43:32.686000 4918 scope.go:117] "RemoveContainer" containerID="2abd1b66985995e1a44fc90e80d1a0acca10e7d483e7aca71531747026fb6a2e" Mar 19 16:43:32 crc kubenswrapper[4918]: I0319 16:43:32.686659 4918 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:32 crc kubenswrapper[4918]: I0319 16:43:32.688282 4918 status_manager.go:851] "Failed to get status for pod" podUID="1a36234d-cdf5-47a0-a4dd-405a166c6ff7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:32 crc kubenswrapper[4918]: I0319 16:43:32.688895 4918 status_manager.go:851] "Failed to get status for pod" podUID="4f3fcd84-785e-4d3f-9911-1d49a4b33dc9" pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-h9xcq\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:32 crc kubenswrapper[4918]: I0319 
16:43:32.688962 4918 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="b5c1cc4706d1505d60db26d052fa02bd4995a71bc7994017cc7915ed83d8e744" exitCode=0 Mar 19 16:43:32 crc kubenswrapper[4918]: I0319 16:43:32.689038 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"b5c1cc4706d1505d60db26d052fa02bd4995a71bc7994017cc7915ed83d8e744"} Mar 19 16:43:32 crc kubenswrapper[4918]: I0319 16:43:32.689373 4918 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eaeabd68-1ae7-4595-8ec1-53ceca0d1cf7" Mar 19 16:43:32 crc kubenswrapper[4918]: I0319 16:43:32.689408 4918 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eaeabd68-1ae7-4595-8ec1-53ceca0d1cf7" Mar 19 16:43:32 crc kubenswrapper[4918]: I0319 16:43:32.689760 4918 status_manager.go:851] "Failed to get status for pod" podUID="1a36234d-cdf5-47a0-a4dd-405a166c6ff7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:32 crc kubenswrapper[4918]: E0319 16:43:32.689809 4918 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:43:32 crc kubenswrapper[4918]: I0319 16:43:32.690582 4918 status_manager.go:851] "Failed to get status for pod" podUID="4f3fcd84-785e-4d3f-9911-1d49a4b33dc9" pod="openshift-authentication/oauth-openshift-558db77b4-h9xcq" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-h9xcq\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:32 crc kubenswrapper[4918]: I0319 16:43:32.691189 4918 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Mar 19 16:43:33 crc kubenswrapper[4918]: I0319 16:43:33.698330 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 19 16:43:33 crc kubenswrapper[4918]: I0319 16:43:33.700033 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 19 16:43:33 crc kubenswrapper[4918]: I0319 16:43:33.700092 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6b0a26f75a92413f8e97e03117bd9d956de1dc08120e601c9a961b0a638305b6"} Mar 19 16:43:33 crc kubenswrapper[4918]: I0319 16:43:33.705201 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"95ed563c138644b6e9a1eb69cf110f9ff8e8ba675954074b6bebabf7f27fea82"} Mar 19 16:43:33 crc kubenswrapper[4918]: I0319 16:43:33.705227 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"25b84cd7723f0093d7fdc478210969f497fb23858e6bfc5b6175e841b9737e4f"} Mar 19 16:43:33 crc kubenswrapper[4918]: I0319 16:43:33.705235 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"93c3ab61593c159ad7d799ce20bbc4a71e9e22e6a9492566dfaba57bb3d18f63"} Mar 19 16:43:34 crc kubenswrapper[4918]: I0319 16:43:34.713284 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"16a095396e48a3d28e7dd82d8f27efa068c17680b3ff91d94425e76af5e38627"} Mar 19 16:43:34 crc kubenswrapper[4918]: I0319 16:43:34.713336 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b769462f4473c3f5d05a83fc015b9811cdfa58957e0eb12d4d92593535aa0616"} Mar 19 16:43:34 crc kubenswrapper[4918]: I0319 16:43:34.713463 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:43:34 crc kubenswrapper[4918]: I0319 16:43:34.713559 4918 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eaeabd68-1ae7-4595-8ec1-53ceca0d1cf7" Mar 19 16:43:34 crc kubenswrapper[4918]: I0319 16:43:34.713584 4918 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eaeabd68-1ae7-4595-8ec1-53ceca0d1cf7" Mar 19 16:43:36 crc kubenswrapper[4918]: I0319 16:43:36.615757 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:43:36 crc kubenswrapper[4918]: I0319 16:43:36.615824 4918 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:43:36 crc kubenswrapper[4918]: I0319 16:43:36.624066 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:43:38 crc kubenswrapper[4918]: I0319 16:43:38.081608 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 16:43:39 crc kubenswrapper[4918]: I0319 16:43:39.730648 4918 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:43:40 crc kubenswrapper[4918]: I0319 16:43:40.751153 4918 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eaeabd68-1ae7-4595-8ec1-53ceca0d1cf7" Mar 19 16:43:40 crc kubenswrapper[4918]: I0319 16:43:40.751195 4918 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eaeabd68-1ae7-4595-8ec1-53ceca0d1cf7" Mar 19 16:43:40 crc kubenswrapper[4918]: I0319 16:43:40.759903 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 16:43:40 crc kubenswrapper[4918]: I0319 16:43:40.766218 4918 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="2bb25e7d-466a-49d3-ae13-ee1e7d7aed1c" Mar 19 16:43:41 crc kubenswrapper[4918]: I0319 16:43:41.758096 4918 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eaeabd68-1ae7-4595-8ec1-53ceca0d1cf7" Mar 19 16:43:41 crc kubenswrapper[4918]: I0319 16:43:41.758494 4918 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eaeabd68-1ae7-4595-8ec1-53ceca0d1cf7" Mar 19 16:43:41 crc kubenswrapper[4918]: I0319 
16:43:41.860745 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 16:43:41 crc kubenswrapper[4918]: I0319 16:43:41.861193 4918 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 19 16:43:41 crc kubenswrapper[4918]: I0319 16:43:41.861261 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 19 16:43:48 crc kubenswrapper[4918]: I0319 16:43:48.607292 4918 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="2bb25e7d-466a-49d3-ae13-ee1e7d7aed1c" Mar 19 16:43:49 crc kubenswrapper[4918]: I0319 16:43:49.517815 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 19 16:43:51 crc kubenswrapper[4918]: I0319 16:43:51.404050 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 19 16:43:51 crc kubenswrapper[4918]: I0319 16:43:51.432323 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 19 16:43:51 crc kubenswrapper[4918]: I0319 16:43:51.432591 4918 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 19 16:43:51 crc kubenswrapper[4918]: I0319 16:43:51.473229 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 19 16:43:51 crc kubenswrapper[4918]: I0319 16:43:51.648567 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 19 16:43:51 crc kubenswrapper[4918]: I0319 16:43:51.860815 4918 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 19 16:43:51 crc kubenswrapper[4918]: I0319 16:43:51.860878 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 19 16:43:52 crc kubenswrapper[4918]: I0319 16:43:52.038850 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 19 16:43:52 crc kubenswrapper[4918]: I0319 16:43:52.077382 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 16:43:52 crc kubenswrapper[4918]: I0319 16:43:52.177856 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 19 16:43:52 crc kubenswrapper[4918]: I0319 16:43:52.313355 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 19 16:43:52 crc 
kubenswrapper[4918]: I0319 16:43:52.567920 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 19 16:43:52 crc kubenswrapper[4918]: I0319 16:43:52.626324 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 19 16:43:52 crc kubenswrapper[4918]: I0319 16:43:52.904086 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 19 16:43:52 crc kubenswrapper[4918]: I0319 16:43:52.922324 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 19 16:43:52 crc kubenswrapper[4918]: I0319 16:43:52.941395 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 19 16:43:53 crc kubenswrapper[4918]: I0319 16:43:53.002048 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 19 16:43:53 crc kubenswrapper[4918]: I0319 16:43:53.188148 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 19 16:43:53 crc kubenswrapper[4918]: I0319 16:43:53.338893 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 19 16:43:53 crc kubenswrapper[4918]: I0319 16:43:53.342284 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 19 16:43:53 crc kubenswrapper[4918]: I0319 16:43:53.346488 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 19 16:43:53 crc kubenswrapper[4918]: I0319 16:43:53.476013 4918 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 19 16:43:53 crc kubenswrapper[4918]: I0319 16:43:53.624799 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 19 16:43:53 crc kubenswrapper[4918]: I0319 16:43:53.638887 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 19 16:43:53 crc kubenswrapper[4918]: I0319 16:43:53.766697 4918 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 19 16:43:53 crc kubenswrapper[4918]: I0319 16:43:53.798063 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 19 16:43:53 crc kubenswrapper[4918]: I0319 16:43:53.805253 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 19 16:43:53 crc kubenswrapper[4918]: I0319 16:43:53.978875 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 19 16:43:54 crc kubenswrapper[4918]: I0319 16:43:54.018392 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 19 16:43:54 crc kubenswrapper[4918]: I0319 16:43:54.150306 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 19 16:43:54 crc kubenswrapper[4918]: I0319 16:43:54.228378 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 19 16:43:54 crc kubenswrapper[4918]: I0319 16:43:54.293938 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 19 16:43:54 crc kubenswrapper[4918]: I0319 
16:43:54.295873 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 19 16:43:54 crc kubenswrapper[4918]: I0319 16:43:54.331226 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 19 16:43:54 crc kubenswrapper[4918]: I0319 16:43:54.393585 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 19 16:43:54 crc kubenswrapper[4918]: I0319 16:43:54.417346 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 19 16:43:54 crc kubenswrapper[4918]: I0319 16:43:54.430408 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 16:43:54 crc kubenswrapper[4918]: I0319 16:43:54.520122 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 19 16:43:54 crc kubenswrapper[4918]: I0319 16:43:54.623180 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 19 16:43:54 crc kubenswrapper[4918]: I0319 16:43:54.649382 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 19 16:43:54 crc kubenswrapper[4918]: I0319 16:43:54.680318 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 19 16:43:54 crc kubenswrapper[4918]: I0319 16:43:54.717584 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 19 16:43:54 crc kubenswrapper[4918]: I0319 16:43:54.749261 4918 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 19 16:43:54 crc kubenswrapper[4918]: I0319 16:43:54.857928 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 19 16:43:54 crc kubenswrapper[4918]: I0319 16:43:54.871082 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 19 16:43:54 crc kubenswrapper[4918]: I0319 16:43:54.894666 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 19 16:43:55 crc kubenswrapper[4918]: I0319 16:43:55.047875 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 19 16:43:55 crc kubenswrapper[4918]: I0319 16:43:55.059085 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 19 16:43:55 crc kubenswrapper[4918]: I0319 16:43:55.085749 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 19 16:43:55 crc kubenswrapper[4918]: I0319 16:43:55.124981 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 19 16:43:55 crc kubenswrapper[4918]: I0319 16:43:55.356421 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 19 16:43:55 crc kubenswrapper[4918]: I0319 16:43:55.637828 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 19 16:43:55 crc kubenswrapper[4918]: I0319 16:43:55.690222 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 19 16:43:55 crc kubenswrapper[4918]: I0319 16:43:55.760675 4918 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 19 16:43:55 crc kubenswrapper[4918]: I0319 16:43:55.761594 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 19 16:43:55 crc kubenswrapper[4918]: I0319 16:43:55.856119 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 16:43:55 crc kubenswrapper[4918]: I0319 16:43:55.905055 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 19 16:43:56 crc kubenswrapper[4918]: I0319 16:43:56.027073 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 19 16:43:56 crc kubenswrapper[4918]: I0319 16:43:56.189799 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 19 16:43:56 crc kubenswrapper[4918]: I0319 16:43:56.303390 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 19 16:43:56 crc kubenswrapper[4918]: I0319 16:43:56.313406 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 19 16:43:56 crc kubenswrapper[4918]: I0319 16:43:56.392683 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 19 16:43:56 crc kubenswrapper[4918]: I0319 16:43:56.433174 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 19 16:43:56 crc kubenswrapper[4918]: I0319 16:43:56.438253 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 19 16:43:56 crc 
kubenswrapper[4918]: I0319 16:43:56.563595 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 19 16:43:56 crc kubenswrapper[4918]: I0319 16:43:56.684746 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 19 16:43:56 crc kubenswrapper[4918]: I0319 16:43:56.695616 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 16:43:56 crc kubenswrapper[4918]: I0319 16:43:56.706599 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 19 16:43:56 crc kubenswrapper[4918]: I0319 16:43:56.720921 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 16:43:56 crc kubenswrapper[4918]: I0319 16:43:56.736841 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 19 16:43:56 crc kubenswrapper[4918]: I0319 16:43:56.778963 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 19 16:43:56 crc kubenswrapper[4918]: I0319 16:43:56.921746 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 19 16:43:57 crc kubenswrapper[4918]: I0319 16:43:57.014784 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 19 16:43:57 crc kubenswrapper[4918]: I0319 16:43:57.020473 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 19 16:43:57 crc kubenswrapper[4918]: I0319 16:43:57.026112 4918 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-multus"/"multus-daemon-config" Mar 19 16:43:57 crc kubenswrapper[4918]: I0319 16:43:57.028713 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 16:43:57 crc kubenswrapper[4918]: I0319 16:43:57.215108 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 19 16:43:57 crc kubenswrapper[4918]: I0319 16:43:57.244348 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 19 16:43:57 crc kubenswrapper[4918]: I0319 16:43:57.329589 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 19 16:43:57 crc kubenswrapper[4918]: I0319 16:43:57.337691 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 19 16:43:57 crc kubenswrapper[4918]: I0319 16:43:57.408000 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 19 16:43:57 crc kubenswrapper[4918]: I0319 16:43:57.709290 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 19 16:43:57 crc kubenswrapper[4918]: I0319 16:43:57.717993 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 19 16:43:57 crc kubenswrapper[4918]: I0319 16:43:57.953509 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 19 16:43:57 crc kubenswrapper[4918]: I0319 16:43:57.972384 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 19 16:43:57 crc kubenswrapper[4918]: I0319 16:43:57.982653 4918 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 19 16:43:57 crc kubenswrapper[4918]: I0319 16:43:57.988184 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 19 16:43:58 crc kubenswrapper[4918]: I0319 16:43:58.031849 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 19 16:43:58 crc kubenswrapper[4918]: I0319 16:43:58.048945 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 19 16:43:58 crc kubenswrapper[4918]: I0319 16:43:58.113129 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 19 16:43:58 crc kubenswrapper[4918]: I0319 16:43:58.212151 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 16:43:58 crc kubenswrapper[4918]: I0319 16:43:58.212219 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 16:43:58 crc kubenswrapper[4918]: I0319 16:43:58.230714 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 19 16:43:58 crc kubenswrapper[4918]: I0319 16:43:58.259812 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 19 16:43:58 crc kubenswrapper[4918]: I0319 
16:43:58.287232 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 16:43:58 crc kubenswrapper[4918]: I0319 16:43:58.370825 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 19 16:43:58 crc kubenswrapper[4918]: I0319 16:43:58.376661 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 19 16:43:58 crc kubenswrapper[4918]: I0319 16:43:58.440461 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 19 16:43:58 crc kubenswrapper[4918]: I0319 16:43:58.441924 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 19 16:43:58 crc kubenswrapper[4918]: I0319 16:43:58.504142 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 19 16:43:58 crc kubenswrapper[4918]: I0319 16:43:58.544216 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 19 16:43:58 crc kubenswrapper[4918]: I0319 16:43:58.587838 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 19 16:43:58 crc kubenswrapper[4918]: I0319 16:43:58.609670 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 16:43:58 crc kubenswrapper[4918]: I0319 16:43:58.644209 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 19 16:43:58 crc kubenswrapper[4918]: I0319 16:43:58.784022 4918 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 19 16:43:58 crc kubenswrapper[4918]: I0319 16:43:58.800268 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 19 16:43:58 crc kubenswrapper[4918]: I0319 16:43:58.820962 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 19 16:43:58 crc kubenswrapper[4918]: I0319 16:43:58.822076 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 19 16:43:58 crc kubenswrapper[4918]: I0319 16:43:58.966986 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 19 16:43:59 crc kubenswrapper[4918]: I0319 16:43:59.013746 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 19 16:43:59 crc kubenswrapper[4918]: I0319 16:43:59.026877 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 19 16:43:59 crc kubenswrapper[4918]: I0319 16:43:59.174067 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 16:43:59 crc kubenswrapper[4918]: I0319 16:43:59.217423 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 19 16:43:59 crc kubenswrapper[4918]: I0319 16:43:59.259501 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 19 16:43:59 crc kubenswrapper[4918]: I0319 16:43:59.266248 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 19 16:43:59 crc kubenswrapper[4918]: 
I0319 16:43:59.284988 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 19 16:43:59 crc kubenswrapper[4918]: I0319 16:43:59.310990 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 19 16:43:59 crc kubenswrapper[4918]: I0319 16:43:59.347715 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 19 16:43:59 crc kubenswrapper[4918]: I0319 16:43:59.355900 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 16:43:59 crc kubenswrapper[4918]: I0319 16:43:59.365013 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 19 16:43:59 crc kubenswrapper[4918]: I0319 16:43:59.395711 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 19 16:43:59 crc kubenswrapper[4918]: I0319 16:43:59.426782 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 19 16:43:59 crc kubenswrapper[4918]: I0319 16:43:59.525249 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 19 16:43:59 crc kubenswrapper[4918]: I0319 16:43:59.553657 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 19 16:43:59 crc kubenswrapper[4918]: I0319 16:43:59.626055 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 19 16:43:59 crc kubenswrapper[4918]: I0319 16:43:59.718504 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 19 16:43:59 
crc kubenswrapper[4918]: I0319 16:43:59.807578 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 19 16:43:59 crc kubenswrapper[4918]: I0319 16:43:59.832303 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 19 16:43:59 crc kubenswrapper[4918]: I0319 16:43:59.845963 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 19 16:43:59 crc kubenswrapper[4918]: I0319 16:43:59.847350 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 19 16:43:59 crc kubenswrapper[4918]: I0319 16:43:59.872483 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 19 16:43:59 crc kubenswrapper[4918]: I0319 16:43:59.926450 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 19 16:43:59 crc kubenswrapper[4918]: I0319 16:43:59.927161 4918 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 19 16:43:59 crc kubenswrapper[4918]: I0319 16:43:59.932468 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 19 16:43:59 crc kubenswrapper[4918]: I0319 16:43:59.943787 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 19 16:43:59 crc kubenswrapper[4918]: I0319 16:43:59.962302 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 19 16:44:00 crc kubenswrapper[4918]: I0319 16:44:00.045803 4918 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 19 16:44:00 crc kubenswrapper[4918]: I0319 16:44:00.169959 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 19 16:44:00 crc kubenswrapper[4918]: I0319 16:44:00.266386 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 19 16:44:00 crc kubenswrapper[4918]: I0319 16:44:00.293613 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 19 16:44:00 crc kubenswrapper[4918]: I0319 16:44:00.328044 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 19 16:44:00 crc kubenswrapper[4918]: I0319 16:44:00.342884 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 19 16:44:00 crc kubenswrapper[4918]: I0319 16:44:00.361535 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 19 16:44:00 crc kubenswrapper[4918]: I0319 16:44:00.409216 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 19 16:44:00 crc kubenswrapper[4918]: I0319 16:44:00.428087 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 16:44:00 crc kubenswrapper[4918]: I0319 16:44:00.436747 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 19 16:44:00 crc kubenswrapper[4918]: I0319 16:44:00.478333 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 19 16:44:00 crc kubenswrapper[4918]: I0319 16:44:00.554872 4918 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 19 16:44:00 crc kubenswrapper[4918]: I0319 16:44:00.668120 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 19 16:44:00 crc kubenswrapper[4918]: I0319 16:44:00.749173 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 19 16:44:00 crc kubenswrapper[4918]: I0319 16:44:00.871427 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 19 16:44:00 crc kubenswrapper[4918]: I0319 16:44:00.951250 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 19 16:44:00 crc kubenswrapper[4918]: I0319 16:44:00.969503 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 19 16:44:01 crc kubenswrapper[4918]: I0319 16:44:01.052369 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 19 16:44:01 crc kubenswrapper[4918]: I0319 16:44:01.095113 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 19 16:44:01 crc kubenswrapper[4918]: I0319 16:44:01.161134 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 19 16:44:01 crc kubenswrapper[4918]: I0319 16:44:01.190968 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 16:44:01 crc kubenswrapper[4918]: I0319 16:44:01.250783 4918 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns"/"dns-default"
Mar 19 16:44:01 crc kubenswrapper[4918]: I0319 16:44:01.262898 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 19 16:44:01 crc kubenswrapper[4918]: I0319 16:44:01.274848 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 19 16:44:01 crc kubenswrapper[4918]: I0319 16:44:01.412268 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 16:44:01 crc kubenswrapper[4918]: I0319 16:44:01.416738 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 19 16:44:01 crc kubenswrapper[4918]: I0319 16:44:01.507837 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 19 16:44:01 crc kubenswrapper[4918]: I0319 16:44:01.537455 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 19 16:44:01 crc kubenswrapper[4918]: I0319 16:44:01.590198 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 19 16:44:01 crc kubenswrapper[4918]: I0319 16:44:01.593789 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 19 16:44:01 crc kubenswrapper[4918]: I0319 16:44:01.680955 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 19 16:44:01 crc kubenswrapper[4918]: I0319 16:44:01.696478 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 19 16:44:01 crc kubenswrapper[4918]: I0319 16:44:01.860902 4918 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 19 16:44:01 crc kubenswrapper[4918]: I0319 16:44:01.861270 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 19 16:44:01 crc kubenswrapper[4918]: I0319 16:44:01.861409 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 16:44:01 crc kubenswrapper[4918]: I0319 16:44:01.862125 4918 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"6b0a26f75a92413f8e97e03117bd9d956de1dc08120e601c9a961b0a638305b6"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted"
Mar 19 16:44:01 crc kubenswrapper[4918]: I0319 16:44:01.862320 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://6b0a26f75a92413f8e97e03117bd9d956de1dc08120e601c9a961b0a638305b6" gracePeriod=30
Mar 19 16:44:01 crc kubenswrapper[4918]: I0319 16:44:01.876101 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 19 16:44:01 crc kubenswrapper[4918]: I0319 16:44:01.893164 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 19 16:44:01 crc kubenswrapper[4918]: I0319 16:44:01.978382 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 19 16:44:02 crc kubenswrapper[4918]: I0319 16:44:02.002261 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 19 16:44:02 crc kubenswrapper[4918]: I0319 16:44:02.052312 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 19 16:44:02 crc kubenswrapper[4918]: I0319 16:44:02.157311 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 19 16:44:02 crc kubenswrapper[4918]: I0319 16:44:02.164609 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 19 16:44:02 crc kubenswrapper[4918]: I0319 16:44:02.221576 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 19 16:44:02 crc kubenswrapper[4918]: I0319 16:44:02.237753 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 19 16:44:02 crc kubenswrapper[4918]: I0319 16:44:02.454104 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 19 16:44:02 crc kubenswrapper[4918]: I0319 16:44:02.572937 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 16:44:02 crc kubenswrapper[4918]: I0319 16:44:02.573193 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 19 16:44:02 crc kubenswrapper[4918]: I0319 16:44:02.607721 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 19 16:44:02 crc kubenswrapper[4918]: I0319 16:44:02.893260 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 19 16:44:02 crc kubenswrapper[4918]: I0319 16:44:02.904142 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 19 16:44:02 crc kubenswrapper[4918]: I0319 16:44:02.947896 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 19 16:44:03 crc kubenswrapper[4918]: I0319 16:44:03.093908 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 19 16:44:03 crc kubenswrapper[4918]: I0319 16:44:03.116869 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 19 16:44:03 crc kubenswrapper[4918]: I0319 16:44:03.140892 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 19 16:44:03 crc kubenswrapper[4918]: I0319 16:44:03.167420 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 19 16:44:03 crc kubenswrapper[4918]: I0319 16:44:03.179205 4918 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 19 16:44:03 crc kubenswrapper[4918]: I0319 16:44:03.191558 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 19 16:44:03 crc kubenswrapper[4918]: I0319 16:44:03.243733 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 19 16:44:03 crc kubenswrapper[4918]: I0319 16:44:03.266837 4918 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 19 16:44:03 crc kubenswrapper[4918]: I0319 16:44:03.429796 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 19 16:44:03 crc kubenswrapper[4918]: I0319 16:44:03.520728 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 19 16:44:03 crc kubenswrapper[4918]: I0319 16:44:03.745311 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 19 16:44:03 crc kubenswrapper[4918]: I0319 16:44:03.843597 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.007205 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.012986 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.070420 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.084843 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.098953 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.101583 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.205476 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.417467 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.439084 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.449909 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.544851 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.664978 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.702934 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.748132 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.768823 4918 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.778503 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-h9xcq"]
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.778667 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-68cb54d767-wxbkf","openshift-kube-apiserver/kube-apiserver-crc"]
Mar 19 16:44:04 crc kubenswrapper[4918]: E0319 16:44:04.779039 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f3fcd84-785e-4d3f-9911-1d49a4b33dc9" containerName="oauth-openshift"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.779082 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f3fcd84-785e-4d3f-9911-1d49a4b33dc9" containerName="oauth-openshift"
Mar 19 16:44:04 crc kubenswrapper[4918]: E0319 16:44:04.779138 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a36234d-cdf5-47a0-a4dd-405a166c6ff7" containerName="installer"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.779157 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a36234d-cdf5-47a0-a4dd-405a166c6ff7" containerName="installer"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.779395 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a36234d-cdf5-47a0-a4dd-405a166c6ff7" containerName="installer"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.779443 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f3fcd84-785e-4d3f-9911-1d49a4b33dc9" containerName="oauth-openshift"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.779399 4918 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eaeabd68-1ae7-4595-8ec1-53ceca0d1cf7"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.779701 4918 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eaeabd68-1ae7-4595-8ec1-53ceca0d1cf7"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.780980 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.789761 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.789991 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.789991 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.790668 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.793135 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.795096 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.795136 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.796385 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.796580 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.796697 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.796830 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.797439 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.798423 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.809268 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.810573 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.815385 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.864471 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=25.864442795 podStartE2EDuration="25.864442795s" podCreationTimestamp="2026-03-19 16:43:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:44:04.857391158 +0000 UTC m=+256.979590446" watchObservedRunningTime="2026-03-19 16:44:04.864442795 +0000 UTC m=+256.986642083"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.864994 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-user-template-login\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.865086 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-system-router-certs\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.865172 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.865238 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.865291 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.865354 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.865439 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.865504 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.865588 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-system-session\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.865732 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-887kz\" (UniqueName: \"kubernetes.io/projected/46456c64-dd47-4979-ac35-4c651024bcdd-kube-api-access-887kz\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.865838 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-user-template-error\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.865910 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/46456c64-dd47-4979-ac35-4c651024bcdd-audit-policies\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.865959 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/46456c64-dd47-4979-ac35-4c651024bcdd-audit-dir\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.866071 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-system-service-ca\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.870900 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.899916 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.910808 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.967029 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-user-template-login\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.967097 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-system-router-certs\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.967136 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.967170 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.967206 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.967246 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.967300 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.967359 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.967418 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-system-session\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.967501 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-887kz\" (UniqueName: \"kubernetes.io/projected/46456c64-dd47-4979-ac35-4c651024bcdd-kube-api-access-887kz\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.967652 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-user-template-error\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.967736 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/46456c64-dd47-4979-ac35-4c651024bcdd-audit-policies\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.967794 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/46456c64-dd47-4979-ac35-4c651024bcdd-audit-dir\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.967881 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-system-service-ca\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.968254 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/46456c64-dd47-4979-ac35-4c651024bcdd-audit-dir\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.968338 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.968619 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.969431 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-system-service-ca\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.970155 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/46456c64-dd47-4979-ac35-4c651024bcdd-audit-policies\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.974415 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-system-router-certs\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.975087 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.975330 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-user-template-error\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.975839 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-user-template-login\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.976183 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-system-session\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.977134 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.979390 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.982468 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/46456c64-dd47-4979-ac35-4c651024bcdd-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.994924 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 19 16:44:04 crc kubenswrapper[4918]: I0319 16:44:04.999394 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-887kz\" (UniqueName: \"kubernetes.io/projected/46456c64-dd47-4979-ac35-4c651024bcdd-kube-api-access-887kz\") pod \"oauth-openshift-68cb54d767-wxbkf\" (UID: \"46456c64-dd47-4979-ac35-4c651024bcdd\") " pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:05 crc kubenswrapper[4918]: I0319 16:44:05.014589 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 19 16:44:05 crc kubenswrapper[4918]: I0319 16:44:05.111734 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 19 16:44:05 crc kubenswrapper[4918]: I0319 16:44:05.116040 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf"
Mar 19 16:44:05 crc kubenswrapper[4918]: I0319 16:44:05.201623 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 19 16:44:05 crc kubenswrapper[4918]: I0319 16:44:05.220397 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 19 16:44:05 crc kubenswrapper[4918]: I0319 16:44:05.316760 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 19 16:44:05 crc kubenswrapper[4918]: I0319 16:44:05.339198 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 19 16:44:05 crc kubenswrapper[4918]: I0319 16:44:05.374612 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 19 16:44:05 crc kubenswrapper[4918]: I0319 16:44:05.547770 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-68cb54d767-wxbkf"]
Mar 19 16:44:05 crc kubenswrapper[4918]: I0319 16:44:05.564257 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 19 16:44:05 crc kubenswrapper[4918]: I0319 16:44:05.593930 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 16:44:05 crc kubenswrapper[4918]: I0319 16:44:05.630597 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 19 16:44:05 crc kubenswrapper[4918]: I0319 16:44:05.634292 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 19 16:44:05 crc kubenswrapper[4918]: I0319 16:44:05.673638 4918 reflector.go:368]
Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 19 16:44:05 crc kubenswrapper[4918]: I0319 16:44:05.698354 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 19 16:44:05 crc kubenswrapper[4918]: I0319 16:44:05.807900 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 19 16:44:05 crc kubenswrapper[4918]: I0319 16:44:05.857605 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 19 16:44:05 crc kubenswrapper[4918]: I0319 16:44:05.926112 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf" event={"ID":"46456c64-dd47-4979-ac35-4c651024bcdd","Type":"ContainerStarted","Data":"a7e3184ac74f902e79fc7334955c6696ab864209f417dd93176cfd474e44a82c"} Mar 19 16:44:05 crc kubenswrapper[4918]: I0319 16:44:05.926490 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf" event={"ID":"46456c64-dd47-4979-ac35-4c651024bcdd","Type":"ContainerStarted","Data":"3c8e5c3cfb9de96d1aca629569cbe01c15b5568f96444c0ab0ca008fe8345bc2"} Mar 19 16:44:05 crc kubenswrapper[4918]: I0319 16:44:05.926561 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf" Mar 19 16:44:05 crc kubenswrapper[4918]: I0319 16:44:05.929424 4918 patch_prober.go:28] interesting pod/oauth-openshift-68cb54d767-wxbkf container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.67:6443/healthz\": dial tcp 10.217.0.67:6443: connect: connection refused" start-of-body= Mar 19 16:44:05 crc kubenswrapper[4918]: I0319 16:44:05.929480 4918 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf" podUID="46456c64-dd47-4979-ac35-4c651024bcdd" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.67:6443/healthz\": dial tcp 10.217.0.67:6443: connect: connection refused" Mar 19 16:44:05 crc kubenswrapper[4918]: I0319 16:44:05.955879 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf" podStartSLOduration=67.955843509 podStartE2EDuration="1m7.955843509s" podCreationTimestamp="2026-03-19 16:42:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:44:05.95165919 +0000 UTC m=+258.073858478" watchObservedRunningTime="2026-03-19 16:44:05.955843509 +0000 UTC m=+258.078042767" Mar 19 16:44:05 crc kubenswrapper[4918]: I0319 16:44:05.995314 4918 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 19 16:44:06 crc kubenswrapper[4918]: I0319 16:44:06.150837 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 19 16:44:06 crc kubenswrapper[4918]: I0319 16:44:06.300659 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 19 16:44:06 crc kubenswrapper[4918]: I0319 16:44:06.450244 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 19 16:44:06 crc kubenswrapper[4918]: I0319 16:44:06.459081 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 19 16:44:06 crc kubenswrapper[4918]: I0319 16:44:06.599693 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f3fcd84-785e-4d3f-9911-1d49a4b33dc9" 
path="/var/lib/kubelet/pods/4f3fcd84-785e-4d3f-9911-1d49a4b33dc9/volumes" Mar 19 16:44:06 crc kubenswrapper[4918]: I0319 16:44:06.686325 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 19 16:44:06 crc kubenswrapper[4918]: I0319 16:44:06.940738 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-68cb54d767-wxbkf" Mar 19 16:44:06 crc kubenswrapper[4918]: I0319 16:44:06.972240 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 19 16:44:08 crc kubenswrapper[4918]: I0319 16:44:08.071316 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 19 16:44:08 crc kubenswrapper[4918]: I0319 16:44:08.404172 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 19 16:44:09 crc kubenswrapper[4918]: I0319 16:44:09.202833 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 19 16:44:13 crc kubenswrapper[4918]: I0319 16:44:13.577806 4918 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 16:44:13 crc kubenswrapper[4918]: I0319 16:44:13.578612 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://89be1a8f90e5bb284cac02772f9691efe50cd2a5c83e467350c3962c98f6abf7" gracePeriod=5 Mar 19 16:44:19 crc kubenswrapper[4918]: I0319 16:44:19.019710 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" 
Mar 19 16:44:19 crc kubenswrapper[4918]: I0319 16:44:19.020412 4918 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="89be1a8f90e5bb284cac02772f9691efe50cd2a5c83e467350c3962c98f6abf7" exitCode=137 Mar 19 16:44:19 crc kubenswrapper[4918]: I0319 16:44:19.183852 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 19 16:44:19 crc kubenswrapper[4918]: I0319 16:44:19.183976 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 16:44:19 crc kubenswrapper[4918]: I0319 16:44:19.283074 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 16:44:19 crc kubenswrapper[4918]: I0319 16:44:19.283198 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 16:44:19 crc kubenswrapper[4918]: I0319 16:44:19.283236 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 16:44:19 crc kubenswrapper[4918]: I0319 16:44:19.283268 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod 
\"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 16:44:19 crc kubenswrapper[4918]: I0319 16:44:19.283345 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 16:44:19 crc kubenswrapper[4918]: I0319 16:44:19.283417 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:44:19 crc kubenswrapper[4918]: I0319 16:44:19.283441 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:44:19 crc kubenswrapper[4918]: I0319 16:44:19.283546 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:44:19 crc kubenswrapper[4918]: I0319 16:44:19.283933 4918 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 19 16:44:19 crc kubenswrapper[4918]: I0319 16:44:19.283959 4918 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 19 16:44:19 crc kubenswrapper[4918]: I0319 16:44:19.283978 4918 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 19 16:44:19 crc kubenswrapper[4918]: I0319 16:44:19.284283 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:44:19 crc kubenswrapper[4918]: I0319 16:44:19.324193 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:44:19 crc kubenswrapper[4918]: I0319 16:44:19.384430 4918 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 19 16:44:19 crc kubenswrapper[4918]: I0319 16:44:19.384488 4918 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 19 16:44:20 crc kubenswrapper[4918]: I0319 16:44:20.031376 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 19 16:44:20 crc kubenswrapper[4918]: I0319 16:44:20.031739 4918 scope.go:117] "RemoveContainer" containerID="89be1a8f90e5bb284cac02772f9691efe50cd2a5c83e467350c3962c98f6abf7" Mar 19 16:44:20 crc kubenswrapper[4918]: I0319 16:44:20.031801 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 16:44:20 crc kubenswrapper[4918]: I0319 16:44:20.034337 4918 generic.go:334] "Generic (PLEG): container finished" podID="f5b832a4-7fec-4d2f-a400-1e890bb551b4" containerID="3ad2812d254603ca69fe3bee0077005d9620b6a9600d3253e5c2fcb15a4bf12c" exitCode=0 Mar 19 16:44:20 crc kubenswrapper[4918]: I0319 16:44:20.034395 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k4qc7" event={"ID":"f5b832a4-7fec-4d2f-a400-1e890bb551b4","Type":"ContainerDied","Data":"3ad2812d254603ca69fe3bee0077005d9620b6a9600d3253e5c2fcb15a4bf12c"} Mar 19 16:44:20 crc kubenswrapper[4918]: I0319 16:44:20.035087 4918 scope.go:117] "RemoveContainer" containerID="3ad2812d254603ca69fe3bee0077005d9620b6a9600d3253e5c2fcb15a4bf12c" Mar 19 16:44:20 crc kubenswrapper[4918]: I0319 16:44:20.594823 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 19 16:44:21 crc kubenswrapper[4918]: I0319 16:44:21.043775 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k4qc7" event={"ID":"f5b832a4-7fec-4d2f-a400-1e890bb551b4","Type":"ContainerStarted","Data":"fbc30e39090d337b317735bca1ab01e72dd73771d3c1be6ffc272cb539900eb5"} Mar 19 16:44:21 crc kubenswrapper[4918]: I0319 16:44:21.044987 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-k4qc7" Mar 19 16:44:21 crc kubenswrapper[4918]: I0319 16:44:21.046847 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-k4qc7" Mar 19 16:44:28 crc kubenswrapper[4918]: I0319 16:44:28.212512 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 16:44:28 crc kubenswrapper[4918]: I0319 16:44:28.213741 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 16:44:28 crc kubenswrapper[4918]: I0319 16:44:28.213831 4918 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 16:44:28 crc kubenswrapper[4918]: I0319 16:44:28.214948 4918 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4d840761dfd614dd8a3c1473b1242185e860c0c959af7cb2cf7f9c58ed3dceb0"} pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 16:44:28 crc kubenswrapper[4918]: I0319 16:44:28.215079 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" containerID="cri-o://4d840761dfd614dd8a3c1473b1242185e860c0c959af7cb2cf7f9c58ed3dceb0" gracePeriod=600 Mar 19 16:44:29 crc kubenswrapper[4918]: I0319 16:44:29.094199 4918 generic.go:334] "Generic (PLEG): container finished" podID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerID="4d840761dfd614dd8a3c1473b1242185e860c0c959af7cb2cf7f9c58ed3dceb0" exitCode=0 Mar 19 16:44:29 crc kubenswrapper[4918]: I0319 16:44:29.094304 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerDied","Data":"4d840761dfd614dd8a3c1473b1242185e860c0c959af7cb2cf7f9c58ed3dceb0"} Mar 19 16:44:30 crc kubenswrapper[4918]: I0319 16:44:30.104471 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerStarted","Data":"860bd4886eb08f73670eccdb27b89ffff79615ee22a0d3ebbad5577184aa8a7e"} Mar 19 16:44:32 crc kubenswrapper[4918]: I0319 16:44:32.125070 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 19 16:44:32 crc kubenswrapper[4918]: I0319 16:44:32.128294 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 19 16:44:32 crc kubenswrapper[4918]: I0319 16:44:32.129602 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 19 16:44:32 crc kubenswrapper[4918]: I0319 16:44:32.129642 4918 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="6b0a26f75a92413f8e97e03117bd9d956de1dc08120e601c9a961b0a638305b6" exitCode=137 Mar 19 16:44:32 crc kubenswrapper[4918]: I0319 16:44:32.129673 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"6b0a26f75a92413f8e97e03117bd9d956de1dc08120e601c9a961b0a638305b6"} Mar 19 16:44:32 crc kubenswrapper[4918]: I0319 16:44:32.129707 4918 scope.go:117] 
"RemoveContainer" containerID="2abd1b66985995e1a44fc90e80d1a0acca10e7d483e7aca71531747026fb6a2e" Mar 19 16:44:33 crc kubenswrapper[4918]: I0319 16:44:33.136834 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 19 16:44:33 crc kubenswrapper[4918]: I0319 16:44:33.137597 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 19 16:44:33 crc kubenswrapper[4918]: I0319 16:44:33.138358 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ede0dc367d8e1f1b482edbea0fad34501c09e6e81cf77065254fdb793c6a936f"} Mar 19 16:44:38 crc kubenswrapper[4918]: I0319 16:44:38.080734 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 16:44:41 crc kubenswrapper[4918]: I0319 16:44:41.866393 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 16:44:41 crc kubenswrapper[4918]: I0319 16:44:41.877775 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 16:44:42 crc kubenswrapper[4918]: I0319 16:44:42.194738 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 16:44:52 crc kubenswrapper[4918]: I0319 16:44:52.959770 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565644-5c8pr"] Mar 19 16:44:52 crc kubenswrapper[4918]: E0319 16:44:52.960325 4918 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 19 16:44:52 crc kubenswrapper[4918]: I0319 16:44:52.960337 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 19 16:44:52 crc kubenswrapper[4918]: I0319 16:44:52.960425 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 19 16:44:52 crc kubenswrapper[4918]: I0319 16:44:52.960788 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565644-5c8pr" Mar 19 16:44:52 crc kubenswrapper[4918]: I0319 16:44:52.969794 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 16:44:52 crc kubenswrapper[4918]: I0319 16:44:52.970082 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 16:44:52 crc kubenswrapper[4918]: I0319 16:44:52.978000 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 16:44:52 crc kubenswrapper[4918]: I0319 16:44:52.986068 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565644-5c8pr"] Mar 19 16:44:53 crc kubenswrapper[4918]: I0319 16:44:53.022211 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvgct\" (UniqueName: \"kubernetes.io/projected/0f75fcd6-ecca-47f1-84dd-c3521d1f9583-kube-api-access-tvgct\") pod \"auto-csr-approver-29565644-5c8pr\" (UID: \"0f75fcd6-ecca-47f1-84dd-c3521d1f9583\") " pod="openshift-infra/auto-csr-approver-29565644-5c8pr" Mar 19 16:44:53 crc kubenswrapper[4918]: I0319 16:44:53.123466 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-tvgct\" (UniqueName: \"kubernetes.io/projected/0f75fcd6-ecca-47f1-84dd-c3521d1f9583-kube-api-access-tvgct\") pod \"auto-csr-approver-29565644-5c8pr\" (UID: \"0f75fcd6-ecca-47f1-84dd-c3521d1f9583\") " pod="openshift-infra/auto-csr-approver-29565644-5c8pr" Mar 19 16:44:53 crc kubenswrapper[4918]: I0319 16:44:53.156860 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvgct\" (UniqueName: \"kubernetes.io/projected/0f75fcd6-ecca-47f1-84dd-c3521d1f9583-kube-api-access-tvgct\") pod \"auto-csr-approver-29565644-5c8pr\" (UID: \"0f75fcd6-ecca-47f1-84dd-c3521d1f9583\") " pod="openshift-infra/auto-csr-approver-29565644-5c8pr" Mar 19 16:44:53 crc kubenswrapper[4918]: I0319 16:44:53.278091 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565644-5c8pr" Mar 19 16:44:53 crc kubenswrapper[4918]: I0319 16:44:53.655988 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565644-5c8pr"] Mar 19 16:44:54 crc kubenswrapper[4918]: I0319 16:44:54.262979 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565644-5c8pr" event={"ID":"0f75fcd6-ecca-47f1-84dd-c3521d1f9583","Type":"ContainerStarted","Data":"657218ee0a233d13883aff83b8b79bdb547be5127ca9c214bd0689f1d796acb2"} Mar 19 16:44:56 crc kubenswrapper[4918]: I0319 16:44:56.276893 4918 generic.go:334] "Generic (PLEG): container finished" podID="0f75fcd6-ecca-47f1-84dd-c3521d1f9583" containerID="1a8629774e93e4581b9954cfe6afcd278d1928f9fd7274d68dc6f279e8e0afe5" exitCode=0 Mar 19 16:44:56 crc kubenswrapper[4918]: I0319 16:44:56.276984 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565644-5c8pr" event={"ID":"0f75fcd6-ecca-47f1-84dd-c3521d1f9583","Type":"ContainerDied","Data":"1a8629774e93e4581b9954cfe6afcd278d1928f9fd7274d68dc6f279e8e0afe5"} Mar 19 16:44:57 crc kubenswrapper[4918]: I0319 
16:44:57.647466 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565644-5c8pr" Mar 19 16:44:57 crc kubenswrapper[4918]: I0319 16:44:57.682498 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvgct\" (UniqueName: \"kubernetes.io/projected/0f75fcd6-ecca-47f1-84dd-c3521d1f9583-kube-api-access-tvgct\") pod \"0f75fcd6-ecca-47f1-84dd-c3521d1f9583\" (UID: \"0f75fcd6-ecca-47f1-84dd-c3521d1f9583\") " Mar 19 16:44:57 crc kubenswrapper[4918]: I0319 16:44:57.689451 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f75fcd6-ecca-47f1-84dd-c3521d1f9583-kube-api-access-tvgct" (OuterVolumeSpecName: "kube-api-access-tvgct") pod "0f75fcd6-ecca-47f1-84dd-c3521d1f9583" (UID: "0f75fcd6-ecca-47f1-84dd-c3521d1f9583"). InnerVolumeSpecName "kube-api-access-tvgct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:44:57 crc kubenswrapper[4918]: I0319 16:44:57.784272 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvgct\" (UniqueName: \"kubernetes.io/projected/0f75fcd6-ecca-47f1-84dd-c3521d1f9583-kube-api-access-tvgct\") on node \"crc\" DevicePath \"\"" Mar 19 16:44:58 crc kubenswrapper[4918]: I0319 16:44:58.291463 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565644-5c8pr" event={"ID":"0f75fcd6-ecca-47f1-84dd-c3521d1f9583","Type":"ContainerDied","Data":"657218ee0a233d13883aff83b8b79bdb547be5127ca9c214bd0689f1d796acb2"} Mar 19 16:44:58 crc kubenswrapper[4918]: I0319 16:44:58.291509 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565644-5c8pr" Mar 19 16:44:58 crc kubenswrapper[4918]: I0319 16:44:58.291586 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="657218ee0a233d13883aff83b8b79bdb547be5127ca9c214bd0689f1d796acb2" Mar 19 16:45:00 crc kubenswrapper[4918]: I0319 16:45:00.130337 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565645-glllh"] Mar 19 16:45:00 crc kubenswrapper[4918]: E0319 16:45:00.130610 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f75fcd6-ecca-47f1-84dd-c3521d1f9583" containerName="oc" Mar 19 16:45:00 crc kubenswrapper[4918]: I0319 16:45:00.130625 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f75fcd6-ecca-47f1-84dd-c3521d1f9583" containerName="oc" Mar 19 16:45:00 crc kubenswrapper[4918]: I0319 16:45:00.130760 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f75fcd6-ecca-47f1-84dd-c3521d1f9583" containerName="oc" Mar 19 16:45:00 crc kubenswrapper[4918]: I0319 16:45:00.131215 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565645-glllh"
Mar 19 16:45:00 crc kubenswrapper[4918]: I0319 16:45:00.133570 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 19 16:45:00 crc kubenswrapper[4918]: I0319 16:45:00.133791 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 19 16:45:00 crc kubenswrapper[4918]: I0319 16:45:00.136991 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565645-glllh"]
Mar 19 16:45:00 crc kubenswrapper[4918]: I0319 16:45:00.217554 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f1df102a-9829-4f44-a21e-f9c72d0bd2e8-secret-volume\") pod \"collect-profiles-29565645-glllh\" (UID: \"f1df102a-9829-4f44-a21e-f9c72d0bd2e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565645-glllh"
Mar 19 16:45:00 crc kubenswrapper[4918]: I0319 16:45:00.217684 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxxbx\" (UniqueName: \"kubernetes.io/projected/f1df102a-9829-4f44-a21e-f9c72d0bd2e8-kube-api-access-gxxbx\") pod \"collect-profiles-29565645-glllh\" (UID: \"f1df102a-9829-4f44-a21e-f9c72d0bd2e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565645-glllh"
Mar 19 16:45:00 crc kubenswrapper[4918]: I0319 16:45:00.217723 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f1df102a-9829-4f44-a21e-f9c72d0bd2e8-config-volume\") pod \"collect-profiles-29565645-glllh\" (UID: \"f1df102a-9829-4f44-a21e-f9c72d0bd2e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565645-glllh"
Mar 19 16:45:00 crc kubenswrapper[4918]: I0319 16:45:00.319147 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f1df102a-9829-4f44-a21e-f9c72d0bd2e8-secret-volume\") pod \"collect-profiles-29565645-glllh\" (UID: \"f1df102a-9829-4f44-a21e-f9c72d0bd2e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565645-glllh"
Mar 19 16:45:00 crc kubenswrapper[4918]: I0319 16:45:00.319255 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxxbx\" (UniqueName: \"kubernetes.io/projected/f1df102a-9829-4f44-a21e-f9c72d0bd2e8-kube-api-access-gxxbx\") pod \"collect-profiles-29565645-glllh\" (UID: \"f1df102a-9829-4f44-a21e-f9c72d0bd2e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565645-glllh"
Mar 19 16:45:00 crc kubenswrapper[4918]: I0319 16:45:00.319295 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f1df102a-9829-4f44-a21e-f9c72d0bd2e8-config-volume\") pod \"collect-profiles-29565645-glllh\" (UID: \"f1df102a-9829-4f44-a21e-f9c72d0bd2e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565645-glllh"
Mar 19 16:45:00 crc kubenswrapper[4918]: I0319 16:45:00.320660 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f1df102a-9829-4f44-a21e-f9c72d0bd2e8-config-volume\") pod \"collect-profiles-29565645-glllh\" (UID: \"f1df102a-9829-4f44-a21e-f9c72d0bd2e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565645-glllh"
Mar 19 16:45:00 crc kubenswrapper[4918]: I0319 16:45:00.326663 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f1df102a-9829-4f44-a21e-f9c72d0bd2e8-secret-volume\") pod \"collect-profiles-29565645-glllh\" (UID: \"f1df102a-9829-4f44-a21e-f9c72d0bd2e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565645-glllh"
Mar 19 16:45:00 crc kubenswrapper[4918]: I0319 16:45:00.346432 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxxbx\" (UniqueName: \"kubernetes.io/projected/f1df102a-9829-4f44-a21e-f9c72d0bd2e8-kube-api-access-gxxbx\") pod \"collect-profiles-29565645-glllh\" (UID: \"f1df102a-9829-4f44-a21e-f9c72d0bd2e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565645-glllh"
Mar 19 16:45:00 crc kubenswrapper[4918]: I0319 16:45:00.451547 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565645-glllh"
Mar 19 16:45:00 crc kubenswrapper[4918]: I0319 16:45:00.838928 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565645-glllh"]
Mar 19 16:45:01 crc kubenswrapper[4918]: I0319 16:45:01.310128 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565645-glllh" event={"ID":"f1df102a-9829-4f44-a21e-f9c72d0bd2e8","Type":"ContainerStarted","Data":"088d0c4bda1be07221b1a654e10a4be02d14a7439b3088a6d58ab1635f7bcf26"}
Mar 19 16:45:01 crc kubenswrapper[4918]: I0319 16:45:01.310170 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565645-glllh" event={"ID":"f1df102a-9829-4f44-a21e-f9c72d0bd2e8","Type":"ContainerStarted","Data":"d59dceede49e268d2d5c6aa0663beb6f632da94b25f5861feb6e5fc01d5302c1"}
Mar 19 16:45:02 crc kubenswrapper[4918]: I0319 16:45:02.316635 4918 generic.go:334] "Generic (PLEG): container finished" podID="f1df102a-9829-4f44-a21e-f9c72d0bd2e8" containerID="088d0c4bda1be07221b1a654e10a4be02d14a7439b3088a6d58ab1635f7bcf26" exitCode=0
Mar 19 16:45:02 crc kubenswrapper[4918]: I0319 16:45:02.316737 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565645-glllh" event={"ID":"f1df102a-9829-4f44-a21e-f9c72d0bd2e8","Type":"ContainerDied","Data":"088d0c4bda1be07221b1a654e10a4be02d14a7439b3088a6d58ab1635f7bcf26"}
Mar 19 16:45:02 crc kubenswrapper[4918]: I0319 16:45:02.554475 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565645-glllh"
Mar 19 16:45:02 crc kubenswrapper[4918]: I0319 16:45:02.649126 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxxbx\" (UniqueName: \"kubernetes.io/projected/f1df102a-9829-4f44-a21e-f9c72d0bd2e8-kube-api-access-gxxbx\") pod \"f1df102a-9829-4f44-a21e-f9c72d0bd2e8\" (UID: \"f1df102a-9829-4f44-a21e-f9c72d0bd2e8\") "
Mar 19 16:45:02 crc kubenswrapper[4918]: I0319 16:45:02.649412 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f1df102a-9829-4f44-a21e-f9c72d0bd2e8-config-volume\") pod \"f1df102a-9829-4f44-a21e-f9c72d0bd2e8\" (UID: \"f1df102a-9829-4f44-a21e-f9c72d0bd2e8\") "
Mar 19 16:45:02 crc kubenswrapper[4918]: I0319 16:45:02.649447 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f1df102a-9829-4f44-a21e-f9c72d0bd2e8-secret-volume\") pod \"f1df102a-9829-4f44-a21e-f9c72d0bd2e8\" (UID: \"f1df102a-9829-4f44-a21e-f9c72d0bd2e8\") "
Mar 19 16:45:02 crc kubenswrapper[4918]: I0319 16:45:02.650105 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1df102a-9829-4f44-a21e-f9c72d0bd2e8-config-volume" (OuterVolumeSpecName: "config-volume") pod "f1df102a-9829-4f44-a21e-f9c72d0bd2e8" (UID: "f1df102a-9829-4f44-a21e-f9c72d0bd2e8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 16:45:02 crc kubenswrapper[4918]: I0319 16:45:02.654265 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1df102a-9829-4f44-a21e-f9c72d0bd2e8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f1df102a-9829-4f44-a21e-f9c72d0bd2e8" (UID: "f1df102a-9829-4f44-a21e-f9c72d0bd2e8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:45:02 crc kubenswrapper[4918]: I0319 16:45:02.654449 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1df102a-9829-4f44-a21e-f9c72d0bd2e8-kube-api-access-gxxbx" (OuterVolumeSpecName: "kube-api-access-gxxbx") pod "f1df102a-9829-4f44-a21e-f9c72d0bd2e8" (UID: "f1df102a-9829-4f44-a21e-f9c72d0bd2e8"). InnerVolumeSpecName "kube-api-access-gxxbx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:45:02 crc kubenswrapper[4918]: I0319 16:45:02.750486 4918 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f1df102a-9829-4f44-a21e-f9c72d0bd2e8-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 19 16:45:02 crc kubenswrapper[4918]: I0319 16:45:02.750745 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxxbx\" (UniqueName: \"kubernetes.io/projected/f1df102a-9829-4f44-a21e-f9c72d0bd2e8-kube-api-access-gxxbx\") on node \"crc\" DevicePath \"\""
Mar 19 16:45:02 crc kubenswrapper[4918]: I0319 16:45:02.750809 4918 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f1df102a-9829-4f44-a21e-f9c72d0bd2e8-config-volume\") on node \"crc\" DevicePath \"\""
Mar 19 16:45:03 crc kubenswrapper[4918]: I0319 16:45:03.325054 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565645-glllh" event={"ID":"f1df102a-9829-4f44-a21e-f9c72d0bd2e8","Type":"ContainerDied","Data":"d59dceede49e268d2d5c6aa0663beb6f632da94b25f5861feb6e5fc01d5302c1"}
Mar 19 16:45:03 crc kubenswrapper[4918]: I0319 16:45:03.325090 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d59dceede49e268d2d5c6aa0663beb6f632da94b25f5861feb6e5fc01d5302c1"
Mar 19 16:45:03 crc kubenswrapper[4918]: I0319 16:45:03.326417 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565645-glllh"
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.567730 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-x7x85"]
Mar 19 16:45:37 crc kubenswrapper[4918]: E0319 16:45:37.568580 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1df102a-9829-4f44-a21e-f9c72d0bd2e8" containerName="collect-profiles"
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.568597 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1df102a-9829-4f44-a21e-f9c72d0bd2e8" containerName="collect-profiles"
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.568740 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1df102a-9829-4f44-a21e-f9c72d0bd2e8" containerName="collect-profiles"
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.569213 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-x7x85"
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.595873 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-x7x85"]
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.703074 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f89ffacb-2170-402d-aaf8-c0c4f3cf7e67-trusted-ca\") pod \"image-registry-66df7c8f76-x7x85\" (UID: \"f89ffacb-2170-402d-aaf8-c0c4f3cf7e67\") " pod="openshift-image-registry/image-registry-66df7c8f76-x7x85"
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.703176 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f89ffacb-2170-402d-aaf8-c0c4f3cf7e67-ca-trust-extracted\") pod \"image-registry-66df7c8f76-x7x85\" (UID: \"f89ffacb-2170-402d-aaf8-c0c4f3cf7e67\") " pod="openshift-image-registry/image-registry-66df7c8f76-x7x85"
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.703211 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f89ffacb-2170-402d-aaf8-c0c4f3cf7e67-registry-certificates\") pod \"image-registry-66df7c8f76-x7x85\" (UID: \"f89ffacb-2170-402d-aaf8-c0c4f3cf7e67\") " pod="openshift-image-registry/image-registry-66df7c8f76-x7x85"
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.703252 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f89ffacb-2170-402d-aaf8-c0c4f3cf7e67-registry-tls\") pod \"image-registry-66df7c8f76-x7x85\" (UID: \"f89ffacb-2170-402d-aaf8-c0c4f3cf7e67\") " pod="openshift-image-registry/image-registry-66df7c8f76-x7x85"
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.703305 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnhwm\" (UniqueName: \"kubernetes.io/projected/f89ffacb-2170-402d-aaf8-c0c4f3cf7e67-kube-api-access-wnhwm\") pod \"image-registry-66df7c8f76-x7x85\" (UID: \"f89ffacb-2170-402d-aaf8-c0c4f3cf7e67\") " pod="openshift-image-registry/image-registry-66df7c8f76-x7x85"
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.703341 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f89ffacb-2170-402d-aaf8-c0c4f3cf7e67-bound-sa-token\") pod \"image-registry-66df7c8f76-x7x85\" (UID: \"f89ffacb-2170-402d-aaf8-c0c4f3cf7e67\") " pod="openshift-image-registry/image-registry-66df7c8f76-x7x85"
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.703447 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-x7x85\" (UID: \"f89ffacb-2170-402d-aaf8-c0c4f3cf7e67\") " pod="openshift-image-registry/image-registry-66df7c8f76-x7x85"
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.703512 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f89ffacb-2170-402d-aaf8-c0c4f3cf7e67-installation-pull-secrets\") pod \"image-registry-66df7c8f76-x7x85\" (UID: \"f89ffacb-2170-402d-aaf8-c0c4f3cf7e67\") " pod="openshift-image-registry/image-registry-66df7c8f76-x7x85"
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.728716 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-x7x85\" (UID: \"f89ffacb-2170-402d-aaf8-c0c4f3cf7e67\") " pod="openshift-image-registry/image-registry-66df7c8f76-x7x85"
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.804425 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnhwm\" (UniqueName: \"kubernetes.io/projected/f89ffacb-2170-402d-aaf8-c0c4f3cf7e67-kube-api-access-wnhwm\") pod \"image-registry-66df7c8f76-x7x85\" (UID: \"f89ffacb-2170-402d-aaf8-c0c4f3cf7e67\") " pod="openshift-image-registry/image-registry-66df7c8f76-x7x85"
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.804467 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f89ffacb-2170-402d-aaf8-c0c4f3cf7e67-bound-sa-token\") pod \"image-registry-66df7c8f76-x7x85\" (UID: \"f89ffacb-2170-402d-aaf8-c0c4f3cf7e67\") " pod="openshift-image-registry/image-registry-66df7c8f76-x7x85"
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.804498 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f89ffacb-2170-402d-aaf8-c0c4f3cf7e67-installation-pull-secrets\") pod \"image-registry-66df7c8f76-x7x85\" (UID: \"f89ffacb-2170-402d-aaf8-c0c4f3cf7e67\") " pod="openshift-image-registry/image-registry-66df7c8f76-x7x85"
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.804561 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f89ffacb-2170-402d-aaf8-c0c4f3cf7e67-trusted-ca\") pod \"image-registry-66df7c8f76-x7x85\" (UID: \"f89ffacb-2170-402d-aaf8-c0c4f3cf7e67\") " pod="openshift-image-registry/image-registry-66df7c8f76-x7x85"
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.804590 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f89ffacb-2170-402d-aaf8-c0c4f3cf7e67-ca-trust-extracted\") pod \"image-registry-66df7c8f76-x7x85\" (UID: \"f89ffacb-2170-402d-aaf8-c0c4f3cf7e67\") " pod="openshift-image-registry/image-registry-66df7c8f76-x7x85"
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.804606 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f89ffacb-2170-402d-aaf8-c0c4f3cf7e67-registry-certificates\") pod \"image-registry-66df7c8f76-x7x85\" (UID: \"f89ffacb-2170-402d-aaf8-c0c4f3cf7e67\") " pod="openshift-image-registry/image-registry-66df7c8f76-x7x85"
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.804625 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f89ffacb-2170-402d-aaf8-c0c4f3cf7e67-registry-tls\") pod \"image-registry-66df7c8f76-x7x85\" (UID: \"f89ffacb-2170-402d-aaf8-c0c4f3cf7e67\") " pod="openshift-image-registry/image-registry-66df7c8f76-x7x85"
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.805173 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f89ffacb-2170-402d-aaf8-c0c4f3cf7e67-ca-trust-extracted\") pod \"image-registry-66df7c8f76-x7x85\" (UID: \"f89ffacb-2170-402d-aaf8-c0c4f3cf7e67\") " pod="openshift-image-registry/image-registry-66df7c8f76-x7x85"
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.806075 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f89ffacb-2170-402d-aaf8-c0c4f3cf7e67-trusted-ca\") pod \"image-registry-66df7c8f76-x7x85\" (UID: \"f89ffacb-2170-402d-aaf8-c0c4f3cf7e67\") " pod="openshift-image-registry/image-registry-66df7c8f76-x7x85"
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.806132 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f89ffacb-2170-402d-aaf8-c0c4f3cf7e67-registry-certificates\") pod \"image-registry-66df7c8f76-x7x85\" (UID: \"f89ffacb-2170-402d-aaf8-c0c4f3cf7e67\") " pod="openshift-image-registry/image-registry-66df7c8f76-x7x85"
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.810261 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f89ffacb-2170-402d-aaf8-c0c4f3cf7e67-installation-pull-secrets\") pod \"image-registry-66df7c8f76-x7x85\" (UID: \"f89ffacb-2170-402d-aaf8-c0c4f3cf7e67\") " pod="openshift-image-registry/image-registry-66df7c8f76-x7x85"
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.810436 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f89ffacb-2170-402d-aaf8-c0c4f3cf7e67-registry-tls\") pod \"image-registry-66df7c8f76-x7x85\" (UID: \"f89ffacb-2170-402d-aaf8-c0c4f3cf7e67\") " pod="openshift-image-registry/image-registry-66df7c8f76-x7x85"
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.821780 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f89ffacb-2170-402d-aaf8-c0c4f3cf7e67-bound-sa-token\") pod \"image-registry-66df7c8f76-x7x85\" (UID: \"f89ffacb-2170-402d-aaf8-c0c4f3cf7e67\") " pod="openshift-image-registry/image-registry-66df7c8f76-x7x85"
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.827422 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnhwm\" (UniqueName: \"kubernetes.io/projected/f89ffacb-2170-402d-aaf8-c0c4f3cf7e67-kube-api-access-wnhwm\") pod \"image-registry-66df7c8f76-x7x85\" (UID: \"f89ffacb-2170-402d-aaf8-c0c4f3cf7e67\") " pod="openshift-image-registry/image-registry-66df7c8f76-x7x85"
Mar 19 16:45:37 crc kubenswrapper[4918]: I0319 16:45:37.894647 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-x7x85"
Mar 19 16:45:38 crc kubenswrapper[4918]: I0319 16:45:38.112095 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-x7x85"]
Mar 19 16:45:38 crc kubenswrapper[4918]: I0319 16:45:38.558605 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-x7x85" event={"ID":"f89ffacb-2170-402d-aaf8-c0c4f3cf7e67","Type":"ContainerStarted","Data":"c56020e81bbc629a639b8902f4b839eb55ac7dfc482707728fedd9eea27ddad8"}
Mar 19 16:45:38 crc kubenswrapper[4918]: I0319 16:45:38.559318 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-x7x85"
Mar 19 16:45:38 crc kubenswrapper[4918]: I0319 16:45:38.559335 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-x7x85" event={"ID":"f89ffacb-2170-402d-aaf8-c0c4f3cf7e67","Type":"ContainerStarted","Data":"5943189d84e7b399d46f7cd28e462dfc68146efd000fe89fb111fe04e19985ea"}
Mar 19 16:45:38 crc kubenswrapper[4918]: I0319 16:45:38.577842 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-x7x85" podStartSLOduration=1.57781807 podStartE2EDuration="1.57781807s" podCreationTimestamp="2026-03-19 16:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:45:38.576286788 +0000 UTC m=+350.698486076" watchObservedRunningTime="2026-03-19 16:45:38.57781807 +0000 UTC m=+350.700017328"
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.360982 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s2zbj"]
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.362351 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s2zbj" podUID="66b9142f-4eaf-41a0-9b13-dae083686eec" containerName="registry-server" containerID="cri-o://ad5ea550adeccefa0e734a856a4077011856223de2e08f25efa1caa9cc492bb3" gracePeriod=30
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.371258 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6hdb4"]
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.371721 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6hdb4" podUID="ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4" containerName="registry-server" containerID="cri-o://9c27a12e790321ff49c5ea34c3a756d01ed13ff046850d1f89a596fbdfc236f7" gracePeriod=30
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.391990 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k4qc7"]
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.392447 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-k4qc7" podUID="f5b832a4-7fec-4d2f-a400-1e890bb551b4" containerName="marketplace-operator" containerID="cri-o://fbc30e39090d337b317735bca1ab01e72dd73771d3c1be6ffc272cb539900eb5" gracePeriod=30
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.399979 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5nsgc"]
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.400377 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5nsgc" podUID="8824182c-653f-4719-87ac-38d3c9c44f12" containerName="registry-server" containerID="cri-o://0a0030c412194e20579b75921e05df9f5dde86ed036595c8129758e3db03584e" gracePeriod=30
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.408774 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7glsh"]
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.409042 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7glsh" podUID="3a9fdbd8-ec5d-4aa7-8097-a081455a27fa" containerName="registry-server" containerID="cri-o://9af6448e1c79b7f030c4cfd030c56cfd05da2e4eb805013daafe958d087eac2c" gracePeriod=30
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.422107 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fv5bd"]
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.423798 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fv5bd"
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.454234 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fv5bd"]
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.536345 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/028abe48-67cc-4c05-b8ae-cd5979b55787-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fv5bd\" (UID: \"028abe48-67cc-4c05-b8ae-cd5979b55787\") " pod="openshift-marketplace/marketplace-operator-79b997595-fv5bd"
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.536762 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/028abe48-67cc-4c05-b8ae-cd5979b55787-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fv5bd\" (UID: \"028abe48-67cc-4c05-b8ae-cd5979b55787\") " pod="openshift-marketplace/marketplace-operator-79b997595-fv5bd"
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.536784 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsbcj\" (UniqueName: \"kubernetes.io/projected/028abe48-67cc-4c05-b8ae-cd5979b55787-kube-api-access-fsbcj\") pod \"marketplace-operator-79b997595-fv5bd\" (UID: \"028abe48-67cc-4c05-b8ae-cd5979b55787\") " pod="openshift-marketplace/marketplace-operator-79b997595-fv5bd"
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.570898 4918 generic.go:334] "Generic (PLEG): container finished" podID="ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4" containerID="9c27a12e790321ff49c5ea34c3a756d01ed13ff046850d1f89a596fbdfc236f7" exitCode=0
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.571208 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6hdb4" event={"ID":"ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4","Type":"ContainerDied","Data":"9c27a12e790321ff49c5ea34c3a756d01ed13ff046850d1f89a596fbdfc236f7"}
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.573006 4918 generic.go:334] "Generic (PLEG): container finished" podID="66b9142f-4eaf-41a0-9b13-dae083686eec" containerID="ad5ea550adeccefa0e734a856a4077011856223de2e08f25efa1caa9cc492bb3" exitCode=0
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.573037 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2zbj" event={"ID":"66b9142f-4eaf-41a0-9b13-dae083686eec","Type":"ContainerDied","Data":"ad5ea550adeccefa0e734a856a4077011856223de2e08f25efa1caa9cc492bb3"}
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.574370 4918 generic.go:334] "Generic (PLEG): container finished" podID="f5b832a4-7fec-4d2f-a400-1e890bb551b4" containerID="fbc30e39090d337b317735bca1ab01e72dd73771d3c1be6ffc272cb539900eb5" exitCode=0
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.574434 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k4qc7" event={"ID":"f5b832a4-7fec-4d2f-a400-1e890bb551b4","Type":"ContainerDied","Data":"fbc30e39090d337b317735bca1ab01e72dd73771d3c1be6ffc272cb539900eb5"}
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.574456 4918 scope.go:117] "RemoveContainer" containerID="3ad2812d254603ca69fe3bee0077005d9620b6a9600d3253e5c2fcb15a4bf12c"
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.579064 4918 generic.go:334] "Generic (PLEG): container finished" podID="3a9fdbd8-ec5d-4aa7-8097-a081455a27fa" containerID="9af6448e1c79b7f030c4cfd030c56cfd05da2e4eb805013daafe958d087eac2c" exitCode=0
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.579127 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7glsh" event={"ID":"3a9fdbd8-ec5d-4aa7-8097-a081455a27fa","Type":"ContainerDied","Data":"9af6448e1c79b7f030c4cfd030c56cfd05da2e4eb805013daafe958d087eac2c"}
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.581125 4918 generic.go:334] "Generic (PLEG): container finished" podID="8824182c-653f-4719-87ac-38d3c9c44f12" containerID="0a0030c412194e20579b75921e05df9f5dde86ed036595c8129758e3db03584e" exitCode=0
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.581183 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5nsgc" event={"ID":"8824182c-653f-4719-87ac-38d3c9c44f12","Type":"ContainerDied","Data":"0a0030c412194e20579b75921e05df9f5dde86ed036595c8129758e3db03584e"}
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.638303 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/028abe48-67cc-4c05-b8ae-cd5979b55787-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fv5bd\" (UID: \"028abe48-67cc-4c05-b8ae-cd5979b55787\") " pod="openshift-marketplace/marketplace-operator-79b997595-fv5bd"
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.638419 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/028abe48-67cc-4c05-b8ae-cd5979b55787-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fv5bd\" (UID: \"028abe48-67cc-4c05-b8ae-cd5979b55787\") " pod="openshift-marketplace/marketplace-operator-79b997595-fv5bd"
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.638450 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsbcj\" (UniqueName: \"kubernetes.io/projected/028abe48-67cc-4c05-b8ae-cd5979b55787-kube-api-access-fsbcj\") pod \"marketplace-operator-79b997595-fv5bd\" (UID: \"028abe48-67cc-4c05-b8ae-cd5979b55787\") " pod="openshift-marketplace/marketplace-operator-79b997595-fv5bd"
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.641896 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/028abe48-67cc-4c05-b8ae-cd5979b55787-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fv5bd\" (UID: \"028abe48-67cc-4c05-b8ae-cd5979b55787\") " pod="openshift-marketplace/marketplace-operator-79b997595-fv5bd"
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.654771 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/028abe48-67cc-4c05-b8ae-cd5979b55787-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fv5bd\" (UID: \"028abe48-67cc-4c05-b8ae-cd5979b55787\") " pod="openshift-marketplace/marketplace-operator-79b997595-fv5bd"
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.670608 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsbcj\" (UniqueName: \"kubernetes.io/projected/028abe48-67cc-4c05-b8ae-cd5979b55787-kube-api-access-fsbcj\") pod \"marketplace-operator-79b997595-fv5bd\" (UID: \"028abe48-67cc-4c05-b8ae-cd5979b55787\") " pod="openshift-marketplace/marketplace-operator-79b997595-fv5bd"
Mar 19 16:45:39 crc kubenswrapper[4918]: I0319 16:45:39.765321 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fv5bd"
Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.056464 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-k4qc7"
Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.100653 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7glsh"
Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.118213 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5nsgc"
Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.146595 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f5b832a4-7fec-4d2f-a400-1e890bb551b4-marketplace-operator-metrics\") pod \"f5b832a4-7fec-4d2f-a400-1e890bb551b4\" (UID: \"f5b832a4-7fec-4d2f-a400-1e890bb551b4\") "
Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.146637 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5b832a4-7fec-4d2f-a400-1e890bb551b4-marketplace-trusted-ca\") pod \"f5b832a4-7fec-4d2f-a400-1e890bb551b4\" (UID: \"f5b832a4-7fec-4d2f-a400-1e890bb551b4\") "
Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.146705 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgnqk\" (UniqueName: \"kubernetes.io/projected/f5b832a4-7fec-4d2f-a400-1e890bb551b4-kube-api-access-sgnqk\") pod \"f5b832a4-7fec-4d2f-a400-1e890bb551b4\" (UID: \"f5b832a4-7fec-4d2f-a400-1e890bb551b4\") "
Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.147580 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5b832a4-7fec-4d2f-a400-1e890bb551b4-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "f5b832a4-7fec-4d2f-a400-1e890bb551b4" (UID: "f5b832a4-7fec-4d2f-a400-1e890bb551b4"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.147656 4918 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5b832a4-7fec-4d2f-a400-1e890bb551b4-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.150689 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5b832a4-7fec-4d2f-a400-1e890bb551b4-kube-api-access-sgnqk" (OuterVolumeSpecName: "kube-api-access-sgnqk") pod "f5b832a4-7fec-4d2f-a400-1e890bb551b4" (UID: "f5b832a4-7fec-4d2f-a400-1e890bb551b4"). InnerVolumeSpecName "kube-api-access-sgnqk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.151214 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5b832a4-7fec-4d2f-a400-1e890bb551b4-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "f5b832a4-7fec-4d2f-a400-1e890bb551b4" (UID: "f5b832a4-7fec-4d2f-a400-1e890bb551b4"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.248193 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8824182c-653f-4719-87ac-38d3c9c44f12-catalog-content\") pod \"8824182c-653f-4719-87ac-38d3c9c44f12\" (UID: \"8824182c-653f-4719-87ac-38d3c9c44f12\") "
Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.255067 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmx82\" (UniqueName: \"kubernetes.io/projected/8824182c-653f-4719-87ac-38d3c9c44f12-kube-api-access-rmx82\") pod \"8824182c-653f-4719-87ac-38d3c9c44f12\" (UID: \"8824182c-653f-4719-87ac-38d3c9c44f12\") "
Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.255122 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a9fdbd8-ec5d-4aa7-8097-a081455a27fa-utilities\") pod \"3a9fdbd8-ec5d-4aa7-8097-a081455a27fa\" (UID: \"3a9fdbd8-ec5d-4aa7-8097-a081455a27fa\") "
Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.255177 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nlcn\" (UniqueName: \"kubernetes.io/projected/3a9fdbd8-ec5d-4aa7-8097-a081455a27fa-kube-api-access-6nlcn\") pod \"3a9fdbd8-ec5d-4aa7-8097-a081455a27fa\" (UID: \"3a9fdbd8-ec5d-4aa7-8097-a081455a27fa\") "
Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.255197 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a9fdbd8-ec5d-4aa7-8097-a081455a27fa-catalog-content\") pod \"3a9fdbd8-ec5d-4aa7-8097-a081455a27fa\" (UID: \"3a9fdbd8-ec5d-4aa7-8097-a081455a27fa\") "
Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.255230 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8824182c-653f-4719-87ac-38d3c9c44f12-utilities\") pod \"8824182c-653f-4719-87ac-38d3c9c44f12\" (UID: \"8824182c-653f-4719-87ac-38d3c9c44f12\") "
Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.258959 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a9fdbd8-ec5d-4aa7-8097-a081455a27fa-utilities" (OuterVolumeSpecName: "utilities") pod "3a9fdbd8-ec5d-4aa7-8097-a081455a27fa" (UID: "3a9fdbd8-ec5d-4aa7-8097-a081455a27fa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.259856 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8824182c-653f-4719-87ac-38d3c9c44f12-utilities" (OuterVolumeSpecName: "utilities") pod "8824182c-653f-4719-87ac-38d3c9c44f12" (UID: "8824182c-653f-4719-87ac-38d3c9c44f12"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.262673 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8824182c-653f-4719-87ac-38d3c9c44f12-kube-api-access-rmx82" (OuterVolumeSpecName: "kube-api-access-rmx82") pod "8824182c-653f-4719-87ac-38d3c9c44f12" (UID: "8824182c-653f-4719-87ac-38d3c9c44f12"). InnerVolumeSpecName "kube-api-access-rmx82". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.262740 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a9fdbd8-ec5d-4aa7-8097-a081455a27fa-kube-api-access-6nlcn" (OuterVolumeSpecName: "kube-api-access-6nlcn") pod "3a9fdbd8-ec5d-4aa7-8097-a081455a27fa" (UID: "3a9fdbd8-ec5d-4aa7-8097-a081455a27fa"). InnerVolumeSpecName "kube-api-access-6nlcn".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.263238 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgnqk\" (UniqueName: \"kubernetes.io/projected/f5b832a4-7fec-4d2f-a400-1e890bb551b4-kube-api-access-sgnqk\") on node \"crc\" DevicePath \"\"" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.263257 4918 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f5b832a4-7fec-4d2f-a400-1e890bb551b4-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.281378 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fv5bd"] Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.299035 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8824182c-653f-4719-87ac-38d3c9c44f12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8824182c-653f-4719-87ac-38d3c9c44f12" (UID: "8824182c-653f-4719-87ac-38d3c9c44f12"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.317050 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6hdb4" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.325648 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s2zbj" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.366353 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8824182c-653f-4719-87ac-38d3c9c44f12-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.366651 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmx82\" (UniqueName: \"kubernetes.io/projected/8824182c-653f-4719-87ac-38d3c9c44f12-kube-api-access-rmx82\") on node \"crc\" DevicePath \"\"" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.366776 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a9fdbd8-ec5d-4aa7-8097-a081455a27fa-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.366865 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nlcn\" (UniqueName: \"kubernetes.io/projected/3a9fdbd8-ec5d-4aa7-8097-a081455a27fa-kube-api-access-6nlcn\") on node \"crc\" DevicePath \"\"" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.366955 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8824182c-653f-4719-87ac-38d3c9c44f12-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.432812 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a9fdbd8-ec5d-4aa7-8097-a081455a27fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a9fdbd8-ec5d-4aa7-8097-a081455a27fa" (UID: "3a9fdbd8-ec5d-4aa7-8097-a081455a27fa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.468082 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4-catalog-content\") pod \"ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4\" (UID: \"ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4\") " Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.468141 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66b9142f-4eaf-41a0-9b13-dae083686eec-utilities\") pod \"66b9142f-4eaf-41a0-9b13-dae083686eec\" (UID: \"66b9142f-4eaf-41a0-9b13-dae083686eec\") " Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.468165 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66b9142f-4eaf-41a0-9b13-dae083686eec-catalog-content\") pod \"66b9142f-4eaf-41a0-9b13-dae083686eec\" (UID: \"66b9142f-4eaf-41a0-9b13-dae083686eec\") " Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.468228 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s2tf\" (UniqueName: \"kubernetes.io/projected/ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4-kube-api-access-5s2tf\") pod \"ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4\" (UID: \"ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4\") " Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.468257 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4-utilities\") pod \"ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4\" (UID: \"ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4\") " Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.468342 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8zbj\" 
(UniqueName: \"kubernetes.io/projected/66b9142f-4eaf-41a0-9b13-dae083686eec-kube-api-access-t8zbj\") pod \"66b9142f-4eaf-41a0-9b13-dae083686eec\" (UID: \"66b9142f-4eaf-41a0-9b13-dae083686eec\") " Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.468584 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a9fdbd8-ec5d-4aa7-8097-a081455a27fa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.468902 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66b9142f-4eaf-41a0-9b13-dae083686eec-utilities" (OuterVolumeSpecName: "utilities") pod "66b9142f-4eaf-41a0-9b13-dae083686eec" (UID: "66b9142f-4eaf-41a0-9b13-dae083686eec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.471580 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4-utilities" (OuterVolumeSpecName: "utilities") pod "ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4" (UID: "ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.473404 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4-kube-api-access-5s2tf" (OuterVolumeSpecName: "kube-api-access-5s2tf") pod "ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4" (UID: "ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4"). InnerVolumeSpecName "kube-api-access-5s2tf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.475081 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66b9142f-4eaf-41a0-9b13-dae083686eec-kube-api-access-t8zbj" (OuterVolumeSpecName: "kube-api-access-t8zbj") pod "66b9142f-4eaf-41a0-9b13-dae083686eec" (UID: "66b9142f-4eaf-41a0-9b13-dae083686eec"). InnerVolumeSpecName "kube-api-access-t8zbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.528024 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66b9142f-4eaf-41a0-9b13-dae083686eec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66b9142f-4eaf-41a0-9b13-dae083686eec" (UID: "66b9142f-4eaf-41a0-9b13-dae083686eec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.529064 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4" (UID: "ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.569497 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8zbj\" (UniqueName: \"kubernetes.io/projected/66b9142f-4eaf-41a0-9b13-dae083686eec-kube-api-access-t8zbj\") on node \"crc\" DevicePath \"\"" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.569580 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.569594 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66b9142f-4eaf-41a0-9b13-dae083686eec-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.569609 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66b9142f-4eaf-41a0-9b13-dae083686eec-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.569623 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s2tf\" (UniqueName: \"kubernetes.io/projected/ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4-kube-api-access-5s2tf\") on node \"crc\" DevicePath \"\"" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.569635 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.589928 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5nsgc" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.596394 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6hdb4" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.599329 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s2zbj" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.602617 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-k4qc7" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.603680 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5nsgc" event={"ID":"8824182c-653f-4719-87ac-38d3c9c44f12","Type":"ContainerDied","Data":"ff8e4ab1a28d5742d0155fe5f5279973224bf30b08b6516a3bd9872c7136f06b"} Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.603714 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6hdb4" event={"ID":"ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4","Type":"ContainerDied","Data":"0528cfd58a2100758dab6613ae9f5506be96ade583ebefc128dda6f2ee0d8e93"} Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.603727 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2zbj" event={"ID":"66b9142f-4eaf-41a0-9b13-dae083686eec","Type":"ContainerDied","Data":"e065d697832d51a2809ef9242e62343d931ddbd8c3b365284b8a396811867840"} Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.603794 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k4qc7" event={"ID":"f5b832a4-7fec-4d2f-a400-1e890bb551b4","Type":"ContainerDied","Data":"a5948eeb7e6d17622dbae1d313e27afd9cdb3cf2cf44c3641ef49c328e1daacd"} Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.603814 4918 scope.go:117] "RemoveContainer" containerID="0a0030c412194e20579b75921e05df9f5dde86ed036595c8129758e3db03584e" Mar 19 16:45:40 crc 
kubenswrapper[4918]: I0319 16:45:40.605822 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fv5bd" event={"ID":"028abe48-67cc-4c05-b8ae-cd5979b55787","Type":"ContainerStarted","Data":"33ebbbe2a5224e6bd42aba4a831da0179833746a382a686773e6c5f2194fd8e1"} Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.605885 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fv5bd" event={"ID":"028abe48-67cc-4c05-b8ae-cd5979b55787","Type":"ContainerStarted","Data":"b79b5d666466a5a8399dde976899d8b4cc777a5f60e6c3a6797acf6a8218449f"} Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.606216 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-fv5bd" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.609953 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7glsh" event={"ID":"3a9fdbd8-ec5d-4aa7-8097-a081455a27fa","Type":"ContainerDied","Data":"4286954389cf85ccaaa67dd0f87e2a031fb44302ea1354a78f1de188c1226df8"} Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.610019 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7glsh" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.610744 4918 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-fv5bd container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused" start-of-body= Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.610832 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-fv5bd" podUID="028abe48-67cc-4c05-b8ae-cd5979b55787" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.622087 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-fv5bd" podStartSLOduration=1.622073083 podStartE2EDuration="1.622073083s" podCreationTimestamp="2026-03-19 16:45:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:45:40.619956865 +0000 UTC m=+352.742156143" watchObservedRunningTime="2026-03-19 16:45:40.622073083 +0000 UTC m=+352.744272331" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.665125 4918 scope.go:117] "RemoveContainer" containerID="edf7ebf7859a002a68c93d6a3bfd5bab07826c0586181fde7d9359e9baa37df7" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.665259 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5nsgc"] Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.671469 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5nsgc"] Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 
16:45:40.692279 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s2zbj"] Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.702717 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s2zbj"] Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.702812 4918 scope.go:117] "RemoveContainer" containerID="3d3d3944cac444fb95dad3f5c3e158e3a9ef47f44c33e2b09f788035ba359b30" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.706922 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6hdb4"] Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.712861 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6hdb4"] Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.718447 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7glsh"] Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.721292 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7glsh"] Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.725507 4918 scope.go:117] "RemoveContainer" containerID="9c27a12e790321ff49c5ea34c3a756d01ed13ff046850d1f89a596fbdfc236f7" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.743635 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k4qc7"] Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.743702 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k4qc7"] Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.770502 4918 scope.go:117] "RemoveContainer" containerID="a1ae6d021391564a1ef8fbe0f610d4d4f2fcbefeb1a37d5931cedac59097e0af" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.794994 4918 scope.go:117] "RemoveContainer" 
containerID="a1184f0c2c04e5fc171b53c53d76e9a3db95a14c25fa1a687179efcec6de688e" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.816370 4918 scope.go:117] "RemoveContainer" containerID="ad5ea550adeccefa0e734a856a4077011856223de2e08f25efa1caa9cc492bb3" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.831634 4918 scope.go:117] "RemoveContainer" containerID="d97c9d4c710a4bc3bcdbcc26cccb937b22c6465057dfe074a222714e46e11dad" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.846718 4918 scope.go:117] "RemoveContainer" containerID="86a1ea060fa744e5670c416a1a11e9ad78ab1d157a8d3f65711eac9b49e105de" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.859642 4918 scope.go:117] "RemoveContainer" containerID="fbc30e39090d337b317735bca1ab01e72dd73771d3c1be6ffc272cb539900eb5" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.877147 4918 scope.go:117] "RemoveContainer" containerID="9af6448e1c79b7f030c4cfd030c56cfd05da2e4eb805013daafe958d087eac2c" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.889236 4918 scope.go:117] "RemoveContainer" containerID="cbcbd44531d348d0ec89b670f88d0757c51e794769ee83252578f03ef3c2ba3e" Mar 19 16:45:40 crc kubenswrapper[4918]: I0319 16:45:40.913877 4918 scope.go:117] "RemoveContainer" containerID="25e01fc3acb7af8838d288d6613ef10f52abf0e5bdcb50f724e1f3ba08e25df9" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.380430 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zzhsz"] Mar 19 16:45:41 crc kubenswrapper[4918]: E0319 16:45:41.380742 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66b9142f-4eaf-41a0-9b13-dae083686eec" containerName="extract-utilities" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.380759 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b9142f-4eaf-41a0-9b13-dae083686eec" containerName="extract-utilities" Mar 19 16:45:41 crc kubenswrapper[4918]: E0319 16:45:41.380777 4918 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4" containerName="registry-server" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.380785 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4" containerName="registry-server" Mar 19 16:45:41 crc kubenswrapper[4918]: E0319 16:45:41.380796 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a9fdbd8-ec5d-4aa7-8097-a081455a27fa" containerName="extract-content" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.380805 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a9fdbd8-ec5d-4aa7-8097-a081455a27fa" containerName="extract-content" Mar 19 16:45:41 crc kubenswrapper[4918]: E0319 16:45:41.380816 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8824182c-653f-4719-87ac-38d3c9c44f12" containerName="registry-server" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.380823 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="8824182c-653f-4719-87ac-38d3c9c44f12" containerName="registry-server" Mar 19 16:45:41 crc kubenswrapper[4918]: E0319 16:45:41.380834 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4" containerName="extract-utilities" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.380843 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4" containerName="extract-utilities" Mar 19 16:45:41 crc kubenswrapper[4918]: E0319 16:45:41.380854 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8824182c-653f-4719-87ac-38d3c9c44f12" containerName="extract-content" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.380861 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="8824182c-653f-4719-87ac-38d3c9c44f12" containerName="extract-content" Mar 19 16:45:41 crc kubenswrapper[4918]: E0319 16:45:41.380872 4918 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8824182c-653f-4719-87ac-38d3c9c44f12" containerName="extract-utilities" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.380879 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="8824182c-653f-4719-87ac-38d3c9c44f12" containerName="extract-utilities" Mar 19 16:45:41 crc kubenswrapper[4918]: E0319 16:45:41.380889 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5b832a4-7fec-4d2f-a400-1e890bb551b4" containerName="marketplace-operator" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.380897 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5b832a4-7fec-4d2f-a400-1e890bb551b4" containerName="marketplace-operator" Mar 19 16:45:41 crc kubenswrapper[4918]: E0319 16:45:41.380909 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66b9142f-4eaf-41a0-9b13-dae083686eec" containerName="extract-content" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.380918 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b9142f-4eaf-41a0-9b13-dae083686eec" containerName="extract-content" Mar 19 16:45:41 crc kubenswrapper[4918]: E0319 16:45:41.380928 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a9fdbd8-ec5d-4aa7-8097-a081455a27fa" containerName="extract-utilities" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.380935 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a9fdbd8-ec5d-4aa7-8097-a081455a27fa" containerName="extract-utilities" Mar 19 16:45:41 crc kubenswrapper[4918]: E0319 16:45:41.380948 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66b9142f-4eaf-41a0-9b13-dae083686eec" containerName="registry-server" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.380955 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b9142f-4eaf-41a0-9b13-dae083686eec" containerName="registry-server" Mar 19 16:45:41 crc kubenswrapper[4918]: E0319 16:45:41.380968 4918 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a9fdbd8-ec5d-4aa7-8097-a081455a27fa" containerName="registry-server" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.380975 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a9fdbd8-ec5d-4aa7-8097-a081455a27fa" containerName="registry-server" Mar 19 16:45:41 crc kubenswrapper[4918]: E0319 16:45:41.380986 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4" containerName="extract-content" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.380993 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4" containerName="extract-content" Mar 19 16:45:41 crc kubenswrapper[4918]: E0319 16:45:41.381004 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5b832a4-7fec-4d2f-a400-1e890bb551b4" containerName="marketplace-operator" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.381014 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5b832a4-7fec-4d2f-a400-1e890bb551b4" containerName="marketplace-operator" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.381119 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4" containerName="registry-server" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.381130 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="8824182c-653f-4719-87ac-38d3c9c44f12" containerName="registry-server" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.381144 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5b832a4-7fec-4d2f-a400-1e890bb551b4" containerName="marketplace-operator" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.381153 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a9fdbd8-ec5d-4aa7-8097-a081455a27fa" containerName="registry-server" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 
16:45:41.381167 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="66b9142f-4eaf-41a0-9b13-dae083686eec" containerName="registry-server" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.381178 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5b832a4-7fec-4d2f-a400-1e890bb551b4" containerName="marketplace-operator" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.382124 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zzhsz" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.384448 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.389354 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zzhsz"] Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.482556 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8522aab6-de35-499c-bfe1-55ff5c72fbc6-utilities\") pod \"redhat-marketplace-zzhsz\" (UID: \"8522aab6-de35-499c-bfe1-55ff5c72fbc6\") " pod="openshift-marketplace/redhat-marketplace-zzhsz" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.482844 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp7xh\" (UniqueName: \"kubernetes.io/projected/8522aab6-de35-499c-bfe1-55ff5c72fbc6-kube-api-access-vp7xh\") pod \"redhat-marketplace-zzhsz\" (UID: \"8522aab6-de35-499c-bfe1-55ff5c72fbc6\") " pod="openshift-marketplace/redhat-marketplace-zzhsz" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.482946 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8522aab6-de35-499c-bfe1-55ff5c72fbc6-catalog-content\") pod \"redhat-marketplace-zzhsz\" (UID: \"8522aab6-de35-499c-bfe1-55ff5c72fbc6\") " pod="openshift-marketplace/redhat-marketplace-zzhsz" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.583843 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8522aab6-de35-499c-bfe1-55ff5c72fbc6-utilities\") pod \"redhat-marketplace-zzhsz\" (UID: \"8522aab6-de35-499c-bfe1-55ff5c72fbc6\") " pod="openshift-marketplace/redhat-marketplace-zzhsz" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.583969 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp7xh\" (UniqueName: \"kubernetes.io/projected/8522aab6-de35-499c-bfe1-55ff5c72fbc6-kube-api-access-vp7xh\") pod \"redhat-marketplace-zzhsz\" (UID: \"8522aab6-de35-499c-bfe1-55ff5c72fbc6\") " pod="openshift-marketplace/redhat-marketplace-zzhsz" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.584025 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8522aab6-de35-499c-bfe1-55ff5c72fbc6-catalog-content\") pod \"redhat-marketplace-zzhsz\" (UID: \"8522aab6-de35-499c-bfe1-55ff5c72fbc6\") " pod="openshift-marketplace/redhat-marketplace-zzhsz" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.584941 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8522aab6-de35-499c-bfe1-55ff5c72fbc6-utilities\") pod \"redhat-marketplace-zzhsz\" (UID: \"8522aab6-de35-499c-bfe1-55ff5c72fbc6\") " pod="openshift-marketplace/redhat-marketplace-zzhsz" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.585166 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8522aab6-de35-499c-bfe1-55ff5c72fbc6-catalog-content\") pod \"redhat-marketplace-zzhsz\" (UID: \"8522aab6-de35-499c-bfe1-55ff5c72fbc6\") " pod="openshift-marketplace/redhat-marketplace-zzhsz" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.612941 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp7xh\" (UniqueName: \"kubernetes.io/projected/8522aab6-de35-499c-bfe1-55ff5c72fbc6-kube-api-access-vp7xh\") pod \"redhat-marketplace-zzhsz\" (UID: \"8522aab6-de35-499c-bfe1-55ff5c72fbc6\") " pod="openshift-marketplace/redhat-marketplace-zzhsz" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.627867 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-fv5bd" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.708090 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zzhsz" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.895898 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zzhsz"] Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.971471 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pvrk8"] Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.973219 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pvrk8" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.979225 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.982053 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pvrk8"] Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.991991 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/310a0fb9-d1a0-42e4-bf28-242a240c788b-catalog-content\") pod \"certified-operators-pvrk8\" (UID: \"310a0fb9-d1a0-42e4-bf28-242a240c788b\") " pod="openshift-marketplace/certified-operators-pvrk8" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.992063 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/310a0fb9-d1a0-42e4-bf28-242a240c788b-utilities\") pod \"certified-operators-pvrk8\" (UID: \"310a0fb9-d1a0-42e4-bf28-242a240c788b\") " pod="openshift-marketplace/certified-operators-pvrk8" Mar 19 16:45:41 crc kubenswrapper[4918]: I0319 16:45:41.992090 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgfdd\" (UniqueName: \"kubernetes.io/projected/310a0fb9-d1a0-42e4-bf28-242a240c788b-kube-api-access-xgfdd\") pod \"certified-operators-pvrk8\" (UID: \"310a0fb9-d1a0-42e4-bf28-242a240c788b\") " pod="openshift-marketplace/certified-operators-pvrk8" Mar 19 16:45:42 crc kubenswrapper[4918]: I0319 16:45:42.092981 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgfdd\" (UniqueName: \"kubernetes.io/projected/310a0fb9-d1a0-42e4-bf28-242a240c788b-kube-api-access-xgfdd\") pod \"certified-operators-pvrk8\" 
(UID: \"310a0fb9-d1a0-42e4-bf28-242a240c788b\") " pod="openshift-marketplace/certified-operators-pvrk8" Mar 19 16:45:42 crc kubenswrapper[4918]: I0319 16:45:42.093087 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/310a0fb9-d1a0-42e4-bf28-242a240c788b-catalog-content\") pod \"certified-operators-pvrk8\" (UID: \"310a0fb9-d1a0-42e4-bf28-242a240c788b\") " pod="openshift-marketplace/certified-operators-pvrk8" Mar 19 16:45:42 crc kubenswrapper[4918]: I0319 16:45:42.093123 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/310a0fb9-d1a0-42e4-bf28-242a240c788b-utilities\") pod \"certified-operators-pvrk8\" (UID: \"310a0fb9-d1a0-42e4-bf28-242a240c788b\") " pod="openshift-marketplace/certified-operators-pvrk8" Mar 19 16:45:42 crc kubenswrapper[4918]: I0319 16:45:42.093599 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/310a0fb9-d1a0-42e4-bf28-242a240c788b-utilities\") pod \"certified-operators-pvrk8\" (UID: \"310a0fb9-d1a0-42e4-bf28-242a240c788b\") " pod="openshift-marketplace/certified-operators-pvrk8" Mar 19 16:45:42 crc kubenswrapper[4918]: I0319 16:45:42.095156 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/310a0fb9-d1a0-42e4-bf28-242a240c788b-catalog-content\") pod \"certified-operators-pvrk8\" (UID: \"310a0fb9-d1a0-42e4-bf28-242a240c788b\") " pod="openshift-marketplace/certified-operators-pvrk8" Mar 19 16:45:42 crc kubenswrapper[4918]: I0319 16:45:42.112901 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgfdd\" (UniqueName: \"kubernetes.io/projected/310a0fb9-d1a0-42e4-bf28-242a240c788b-kube-api-access-xgfdd\") pod \"certified-operators-pvrk8\" (UID: \"310a0fb9-d1a0-42e4-bf28-242a240c788b\") " 
pod="openshift-marketplace/certified-operators-pvrk8" Mar 19 16:45:42 crc kubenswrapper[4918]: I0319 16:45:42.314996 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pvrk8" Mar 19 16:45:42 crc kubenswrapper[4918]: I0319 16:45:42.595613 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a9fdbd8-ec5d-4aa7-8097-a081455a27fa" path="/var/lib/kubelet/pods/3a9fdbd8-ec5d-4aa7-8097-a081455a27fa/volumes" Mar 19 16:45:42 crc kubenswrapper[4918]: I0319 16:45:42.597807 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66b9142f-4eaf-41a0-9b13-dae083686eec" path="/var/lib/kubelet/pods/66b9142f-4eaf-41a0-9b13-dae083686eec/volumes" Mar 19 16:45:42 crc kubenswrapper[4918]: I0319 16:45:42.599681 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8824182c-653f-4719-87ac-38d3c9c44f12" path="/var/lib/kubelet/pods/8824182c-653f-4719-87ac-38d3c9c44f12/volumes" Mar 19 16:45:42 crc kubenswrapper[4918]: I0319 16:45:42.601263 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4" path="/var/lib/kubelet/pods/ba3b6b2d-3e37-4f1c-95f8-557fc4b379c4/volumes" Mar 19 16:45:42 crc kubenswrapper[4918]: I0319 16:45:42.601917 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5b832a4-7fec-4d2f-a400-1e890bb551b4" path="/var/lib/kubelet/pods/f5b832a4-7fec-4d2f-a400-1e890bb551b4/volumes" Mar 19 16:45:42 crc kubenswrapper[4918]: I0319 16:45:42.642582 4918 generic.go:334] "Generic (PLEG): container finished" podID="8522aab6-de35-499c-bfe1-55ff5c72fbc6" containerID="a86a3f457ad935a6ef35193da702a00f1b152b025d6663b14fdddd941fb38fef" exitCode=0 Mar 19 16:45:42 crc kubenswrapper[4918]: I0319 16:45:42.642722 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zzhsz" 
event={"ID":"8522aab6-de35-499c-bfe1-55ff5c72fbc6","Type":"ContainerDied","Data":"a86a3f457ad935a6ef35193da702a00f1b152b025d6663b14fdddd941fb38fef"} Mar 19 16:45:42 crc kubenswrapper[4918]: I0319 16:45:42.642770 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zzhsz" event={"ID":"8522aab6-de35-499c-bfe1-55ff5c72fbc6","Type":"ContainerStarted","Data":"65acdb8c27a26dac3054378a9636218b2cf4d5def9dd175ad162439634571375"} Mar 19 16:45:42 crc kubenswrapper[4918]: I0319 16:45:42.768811 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pvrk8"] Mar 19 16:45:42 crc kubenswrapper[4918]: W0319 16:45:42.775602 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod310a0fb9_d1a0_42e4_bf28_242a240c788b.slice/crio-59d29a73639566620d3414b1f4e2765551c918af0d9279b4988eabbd9c0442e1 WatchSource:0}: Error finding container 59d29a73639566620d3414b1f4e2765551c918af0d9279b4988eabbd9c0442e1: Status 404 returned error can't find the container with id 59d29a73639566620d3414b1f4e2765551c918af0d9279b4988eabbd9c0442e1 Mar 19 16:45:43 crc kubenswrapper[4918]: I0319 16:45:43.653324 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pvrk8" event={"ID":"310a0fb9-d1a0-42e4-bf28-242a240c788b","Type":"ContainerStarted","Data":"5e0093a5fd7ebae3a22c2714221fcd1ed53045996c5c2af0a244dd3c0628d703"} Mar 19 16:45:43 crc kubenswrapper[4918]: I0319 16:45:43.653588 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pvrk8" event={"ID":"310a0fb9-d1a0-42e4-bf28-242a240c788b","Type":"ContainerStarted","Data":"59d29a73639566620d3414b1f4e2765551c918af0d9279b4988eabbd9c0442e1"} Mar 19 16:45:43 crc kubenswrapper[4918]: I0319 16:45:43.774488 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lmkmp"] 
Mar 19 16:45:43 crc kubenswrapper[4918]: I0319 16:45:43.775426 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lmkmp" Mar 19 16:45:43 crc kubenswrapper[4918]: I0319 16:45:43.777555 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 19 16:45:43 crc kubenswrapper[4918]: I0319 16:45:43.796767 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lmkmp"] Mar 19 16:45:43 crc kubenswrapper[4918]: I0319 16:45:43.818576 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30a1a963-759d-404c-8e73-0df8c5a73a59-utilities\") pod \"redhat-operators-lmkmp\" (UID: \"30a1a963-759d-404c-8e73-0df8c5a73a59\") " pod="openshift-marketplace/redhat-operators-lmkmp" Mar 19 16:45:43 crc kubenswrapper[4918]: I0319 16:45:43.818698 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-488db\" (UniqueName: \"kubernetes.io/projected/30a1a963-759d-404c-8e73-0df8c5a73a59-kube-api-access-488db\") pod \"redhat-operators-lmkmp\" (UID: \"30a1a963-759d-404c-8e73-0df8c5a73a59\") " pod="openshift-marketplace/redhat-operators-lmkmp" Mar 19 16:45:43 crc kubenswrapper[4918]: I0319 16:45:43.818736 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30a1a963-759d-404c-8e73-0df8c5a73a59-catalog-content\") pod \"redhat-operators-lmkmp\" (UID: \"30a1a963-759d-404c-8e73-0df8c5a73a59\") " pod="openshift-marketplace/redhat-operators-lmkmp" Mar 19 16:45:43 crc kubenswrapper[4918]: I0319 16:45:43.920162 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/30a1a963-759d-404c-8e73-0df8c5a73a59-utilities\") pod \"redhat-operators-lmkmp\" (UID: \"30a1a963-759d-404c-8e73-0df8c5a73a59\") " pod="openshift-marketplace/redhat-operators-lmkmp" Mar 19 16:45:43 crc kubenswrapper[4918]: I0319 16:45:43.920279 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-488db\" (UniqueName: \"kubernetes.io/projected/30a1a963-759d-404c-8e73-0df8c5a73a59-kube-api-access-488db\") pod \"redhat-operators-lmkmp\" (UID: \"30a1a963-759d-404c-8e73-0df8c5a73a59\") " pod="openshift-marketplace/redhat-operators-lmkmp" Mar 19 16:45:43 crc kubenswrapper[4918]: I0319 16:45:43.920314 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30a1a963-759d-404c-8e73-0df8c5a73a59-catalog-content\") pod \"redhat-operators-lmkmp\" (UID: \"30a1a963-759d-404c-8e73-0df8c5a73a59\") " pod="openshift-marketplace/redhat-operators-lmkmp" Mar 19 16:45:43 crc kubenswrapper[4918]: I0319 16:45:43.921011 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30a1a963-759d-404c-8e73-0df8c5a73a59-catalog-content\") pod \"redhat-operators-lmkmp\" (UID: \"30a1a963-759d-404c-8e73-0df8c5a73a59\") " pod="openshift-marketplace/redhat-operators-lmkmp" Mar 19 16:45:43 crc kubenswrapper[4918]: I0319 16:45:43.921292 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30a1a963-759d-404c-8e73-0df8c5a73a59-utilities\") pod \"redhat-operators-lmkmp\" (UID: \"30a1a963-759d-404c-8e73-0df8c5a73a59\") " pod="openshift-marketplace/redhat-operators-lmkmp" Mar 19 16:45:43 crc kubenswrapper[4918]: I0319 16:45:43.953216 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-488db\" (UniqueName: 
\"kubernetes.io/projected/30a1a963-759d-404c-8e73-0df8c5a73a59-kube-api-access-488db\") pod \"redhat-operators-lmkmp\" (UID: \"30a1a963-759d-404c-8e73-0df8c5a73a59\") " pod="openshift-marketplace/redhat-operators-lmkmp" Mar 19 16:45:44 crc kubenswrapper[4918]: I0319 16:45:44.112215 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lmkmp" Mar 19 16:45:44 crc kubenswrapper[4918]: I0319 16:45:44.376195 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-z2gzl"] Mar 19 16:45:44 crc kubenswrapper[4918]: I0319 16:45:44.377471 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z2gzl" Mar 19 16:45:44 crc kubenswrapper[4918]: I0319 16:45:44.379641 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 19 16:45:44 crc kubenswrapper[4918]: I0319 16:45:44.399310 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z2gzl"] Mar 19 16:45:44 crc kubenswrapper[4918]: I0319 16:45:44.427803 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trhpj\" (UniqueName: \"kubernetes.io/projected/37f55ac3-175f-483c-83f7-125e90cee899-kube-api-access-trhpj\") pod \"community-operators-z2gzl\" (UID: \"37f55ac3-175f-483c-83f7-125e90cee899\") " pod="openshift-marketplace/community-operators-z2gzl" Mar 19 16:45:44 crc kubenswrapper[4918]: I0319 16:45:44.427849 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37f55ac3-175f-483c-83f7-125e90cee899-catalog-content\") pod \"community-operators-z2gzl\" (UID: \"37f55ac3-175f-483c-83f7-125e90cee899\") " pod="openshift-marketplace/community-operators-z2gzl" Mar 19 16:45:44 
crc kubenswrapper[4918]: I0319 16:45:44.427885 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37f55ac3-175f-483c-83f7-125e90cee899-utilities\") pod \"community-operators-z2gzl\" (UID: \"37f55ac3-175f-483c-83f7-125e90cee899\") " pod="openshift-marketplace/community-operators-z2gzl" Mar 19 16:45:44 crc kubenswrapper[4918]: I0319 16:45:44.444884 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lmkmp"] Mar 19 16:45:44 crc kubenswrapper[4918]: I0319 16:45:44.529416 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37f55ac3-175f-483c-83f7-125e90cee899-utilities\") pod \"community-operators-z2gzl\" (UID: \"37f55ac3-175f-483c-83f7-125e90cee899\") " pod="openshift-marketplace/community-operators-z2gzl" Mar 19 16:45:44 crc kubenswrapper[4918]: I0319 16:45:44.529508 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trhpj\" (UniqueName: \"kubernetes.io/projected/37f55ac3-175f-483c-83f7-125e90cee899-kube-api-access-trhpj\") pod \"community-operators-z2gzl\" (UID: \"37f55ac3-175f-483c-83f7-125e90cee899\") " pod="openshift-marketplace/community-operators-z2gzl" Mar 19 16:45:44 crc kubenswrapper[4918]: I0319 16:45:44.529548 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37f55ac3-175f-483c-83f7-125e90cee899-catalog-content\") pod \"community-operators-z2gzl\" (UID: \"37f55ac3-175f-483c-83f7-125e90cee899\") " pod="openshift-marketplace/community-operators-z2gzl" Mar 19 16:45:44 crc kubenswrapper[4918]: I0319 16:45:44.530388 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37f55ac3-175f-483c-83f7-125e90cee899-catalog-content\") 
pod \"community-operators-z2gzl\" (UID: \"37f55ac3-175f-483c-83f7-125e90cee899\") " pod="openshift-marketplace/community-operators-z2gzl" Mar 19 16:45:44 crc kubenswrapper[4918]: I0319 16:45:44.530563 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37f55ac3-175f-483c-83f7-125e90cee899-utilities\") pod \"community-operators-z2gzl\" (UID: \"37f55ac3-175f-483c-83f7-125e90cee899\") " pod="openshift-marketplace/community-operators-z2gzl" Mar 19 16:45:44 crc kubenswrapper[4918]: I0319 16:45:44.549164 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trhpj\" (UniqueName: \"kubernetes.io/projected/37f55ac3-175f-483c-83f7-125e90cee899-kube-api-access-trhpj\") pod \"community-operators-z2gzl\" (UID: \"37f55ac3-175f-483c-83f7-125e90cee899\") " pod="openshift-marketplace/community-operators-z2gzl" Mar 19 16:45:44 crc kubenswrapper[4918]: I0319 16:45:44.660461 4918 generic.go:334] "Generic (PLEG): container finished" podID="310a0fb9-d1a0-42e4-bf28-242a240c788b" containerID="5e0093a5fd7ebae3a22c2714221fcd1ed53045996c5c2af0a244dd3c0628d703" exitCode=0 Mar 19 16:45:44 crc kubenswrapper[4918]: I0319 16:45:44.660510 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pvrk8" event={"ID":"310a0fb9-d1a0-42e4-bf28-242a240c788b","Type":"ContainerDied","Data":"5e0093a5fd7ebae3a22c2714221fcd1ed53045996c5c2af0a244dd3c0628d703"} Mar 19 16:45:44 crc kubenswrapper[4918]: I0319 16:45:44.663938 4918 generic.go:334] "Generic (PLEG): container finished" podID="8522aab6-de35-499c-bfe1-55ff5c72fbc6" containerID="9a7c7ff0350237263a521c8d7c410c97f62550cafff135580f019c62f13fb4be" exitCode=0 Mar 19 16:45:44 crc kubenswrapper[4918]: I0319 16:45:44.664003 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zzhsz" 
event={"ID":"8522aab6-de35-499c-bfe1-55ff5c72fbc6","Type":"ContainerDied","Data":"9a7c7ff0350237263a521c8d7c410c97f62550cafff135580f019c62f13fb4be"} Mar 19 16:45:44 crc kubenswrapper[4918]: I0319 16:45:44.666290 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmkmp" event={"ID":"30a1a963-759d-404c-8e73-0df8c5a73a59","Type":"ContainerStarted","Data":"21612de402ba7f45f14b24c8f9e61e2e0caf9929a6c81967109c8c41457744af"} Mar 19 16:45:44 crc kubenswrapper[4918]: I0319 16:45:44.697021 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z2gzl" Mar 19 16:45:44 crc kubenswrapper[4918]: I0319 16:45:44.901981 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z2gzl"] Mar 19 16:45:44 crc kubenswrapper[4918]: W0319 16:45:44.913793 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37f55ac3_175f_483c_83f7_125e90cee899.slice/crio-19cc6313c73365e3b6f40dffeffc3cab41721d8c007ea45093bc61469d0db5e9 WatchSource:0}: Error finding container 19cc6313c73365e3b6f40dffeffc3cab41721d8c007ea45093bc61469d0db5e9: Status 404 returned error can't find the container with id 19cc6313c73365e3b6f40dffeffc3cab41721d8c007ea45093bc61469d0db5e9 Mar 19 16:45:45 crc kubenswrapper[4918]: I0319 16:45:45.673367 4918 generic.go:334] "Generic (PLEG): container finished" podID="310a0fb9-d1a0-42e4-bf28-242a240c788b" containerID="00f4a77c98b9c8f8b51ab489044f34eff5f440a271572905b69b3d6c4a019b47" exitCode=0 Mar 19 16:45:45 crc kubenswrapper[4918]: I0319 16:45:45.673428 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pvrk8" event={"ID":"310a0fb9-d1a0-42e4-bf28-242a240c788b","Type":"ContainerDied","Data":"00f4a77c98b9c8f8b51ab489044f34eff5f440a271572905b69b3d6c4a019b47"} Mar 19 16:45:45 crc kubenswrapper[4918]: I0319 
16:45:45.677121 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zzhsz" event={"ID":"8522aab6-de35-499c-bfe1-55ff5c72fbc6","Type":"ContainerStarted","Data":"c978ac7a4b3698bec25df708600482c2bd9b266373a86a5e85a347708daf687d"} Mar 19 16:45:45 crc kubenswrapper[4918]: I0319 16:45:45.681972 4918 generic.go:334] "Generic (PLEG): container finished" podID="30a1a963-759d-404c-8e73-0df8c5a73a59" containerID="7eff9d332a4351c2bfbba86175ac4f193c63c7467b8b3d03bd33ad5916ae6550" exitCode=0 Mar 19 16:45:45 crc kubenswrapper[4918]: I0319 16:45:45.682048 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmkmp" event={"ID":"30a1a963-759d-404c-8e73-0df8c5a73a59","Type":"ContainerDied","Data":"7eff9d332a4351c2bfbba86175ac4f193c63c7467b8b3d03bd33ad5916ae6550"} Mar 19 16:45:45 crc kubenswrapper[4918]: I0319 16:45:45.685041 4918 generic.go:334] "Generic (PLEG): container finished" podID="37f55ac3-175f-483c-83f7-125e90cee899" containerID="69e593a4670b0b75304eeea9101f855ff18a4dbca6a9d3a8d8a49108194731fc" exitCode=0 Mar 19 16:45:45 crc kubenswrapper[4918]: I0319 16:45:45.685077 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z2gzl" event={"ID":"37f55ac3-175f-483c-83f7-125e90cee899","Type":"ContainerDied","Data":"69e593a4670b0b75304eeea9101f855ff18a4dbca6a9d3a8d8a49108194731fc"} Mar 19 16:45:45 crc kubenswrapper[4918]: I0319 16:45:45.685097 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z2gzl" event={"ID":"37f55ac3-175f-483c-83f7-125e90cee899","Type":"ContainerStarted","Data":"19cc6313c73365e3b6f40dffeffc3cab41721d8c007ea45093bc61469d0db5e9"} Mar 19 16:45:45 crc kubenswrapper[4918]: I0319 16:45:45.712773 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zzhsz" podStartSLOduration=2.127395265 
podStartE2EDuration="4.712757583s" podCreationTimestamp="2026-03-19 16:45:41 +0000 UTC" firstStartedPulling="2026-03-19 16:45:42.648837692 +0000 UTC m=+354.771036940" lastFinishedPulling="2026-03-19 16:45:45.23419997 +0000 UTC m=+357.356399258" observedRunningTime="2026-03-19 16:45:45.712231399 +0000 UTC m=+357.834430657" watchObservedRunningTime="2026-03-19 16:45:45.712757583 +0000 UTC m=+357.834956831" Mar 19 16:45:46 crc kubenswrapper[4918]: I0319 16:45:46.691928 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pvrk8" event={"ID":"310a0fb9-d1a0-42e4-bf28-242a240c788b","Type":"ContainerStarted","Data":"e8b5fbe6a4db3ee87b57d42587beea8890e093dde9d9a099dfc44386adabd1b3"} Mar 19 16:45:46 crc kubenswrapper[4918]: I0319 16:45:46.694960 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z2gzl" event={"ID":"37f55ac3-175f-483c-83f7-125e90cee899","Type":"ContainerStarted","Data":"78f231d3ac951273bd23892bfb0edd189b6acde218dc55cf92f613da61ddf9cd"} Mar 19 16:45:46 crc kubenswrapper[4918]: I0319 16:45:46.714152 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pvrk8" podStartSLOduration=4.019827464 podStartE2EDuration="5.714132775s" podCreationTimestamp="2026-03-19 16:45:41 +0000 UTC" firstStartedPulling="2026-03-19 16:45:44.662389315 +0000 UTC m=+356.784588563" lastFinishedPulling="2026-03-19 16:45:46.356694626 +0000 UTC m=+358.478893874" observedRunningTime="2026-03-19 16:45:46.710059032 +0000 UTC m=+358.832258290" watchObservedRunningTime="2026-03-19 16:45:46.714132775 +0000 UTC m=+358.836332023" Mar 19 16:45:47 crc kubenswrapper[4918]: I0319 16:45:47.701631 4918 generic.go:334] "Generic (PLEG): container finished" podID="37f55ac3-175f-483c-83f7-125e90cee899" containerID="78f231d3ac951273bd23892bfb0edd189b6acde218dc55cf92f613da61ddf9cd" exitCode=0 Mar 19 16:45:47 crc kubenswrapper[4918]: I0319 
16:45:47.701706 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z2gzl" event={"ID":"37f55ac3-175f-483c-83f7-125e90cee899","Type":"ContainerDied","Data":"78f231d3ac951273bd23892bfb0edd189b6acde218dc55cf92f613da61ddf9cd"} Mar 19 16:45:47 crc kubenswrapper[4918]: I0319 16:45:47.703940 4918 generic.go:334] "Generic (PLEG): container finished" podID="30a1a963-759d-404c-8e73-0df8c5a73a59" containerID="633f2af81d8c586147803f4dd6d34989a7e1c34ee5de103da5d172e5f3e94b15" exitCode=0 Mar 19 16:45:47 crc kubenswrapper[4918]: I0319 16:45:47.703991 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmkmp" event={"ID":"30a1a963-759d-404c-8e73-0df8c5a73a59","Type":"ContainerDied","Data":"633f2af81d8c586147803f4dd6d34989a7e1c34ee5de103da5d172e5f3e94b15"} Mar 19 16:45:48 crc kubenswrapper[4918]: I0319 16:45:48.722617 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z2gzl" event={"ID":"37f55ac3-175f-483c-83f7-125e90cee899","Type":"ContainerStarted","Data":"2ef7959ade3f37d3c53e7c2a36c0e0639f9014ea296b2b798b5060f0d4321a2e"} Mar 19 16:45:48 crc kubenswrapper[4918]: I0319 16:45:48.726654 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmkmp" event={"ID":"30a1a963-759d-404c-8e73-0df8c5a73a59","Type":"ContainerStarted","Data":"adf0d076f86ce5013cac351e51d6db004fac024c52fffe4aee53016bb3a3fa12"} Mar 19 16:45:48 crc kubenswrapper[4918]: I0319 16:45:48.747801 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-z2gzl" podStartSLOduration=2.376136815 podStartE2EDuration="4.747778085s" podCreationTimestamp="2026-03-19 16:45:44 +0000 UTC" firstStartedPulling="2026-03-19 16:45:45.686320351 +0000 UTC m=+357.808519589" lastFinishedPulling="2026-03-19 16:45:48.057961611 +0000 UTC m=+360.180160859" observedRunningTime="2026-03-19 
16:45:48.746764876 +0000 UTC m=+360.868964134" watchObservedRunningTime="2026-03-19 16:45:48.747778085 +0000 UTC m=+360.869977373" Mar 19 16:45:48 crc kubenswrapper[4918]: I0319 16:45:48.779726 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lmkmp" podStartSLOduration=3.345791994 podStartE2EDuration="5.779706538s" podCreationTimestamp="2026-03-19 16:45:43 +0000 UTC" firstStartedPulling="2026-03-19 16:45:45.684350117 +0000 UTC m=+357.806549365" lastFinishedPulling="2026-03-19 16:45:48.118264661 +0000 UTC m=+360.240463909" observedRunningTime="2026-03-19 16:45:48.774565166 +0000 UTC m=+360.896764434" watchObservedRunningTime="2026-03-19 16:45:48.779706538 +0000 UTC m=+360.901905806" Mar 19 16:45:51 crc kubenswrapper[4918]: I0319 16:45:51.709403 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zzhsz" Mar 19 16:45:51 crc kubenswrapper[4918]: I0319 16:45:51.710381 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zzhsz" Mar 19 16:45:51 crc kubenswrapper[4918]: I0319 16:45:51.780931 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zzhsz" Mar 19 16:45:52 crc kubenswrapper[4918]: I0319 16:45:52.315207 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pvrk8" Mar 19 16:45:52 crc kubenswrapper[4918]: I0319 16:45:52.315296 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pvrk8" Mar 19 16:45:52 crc kubenswrapper[4918]: I0319 16:45:52.363116 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pvrk8" Mar 19 16:45:52 crc kubenswrapper[4918]: I0319 16:45:52.809989 4918 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zzhsz" Mar 19 16:45:52 crc kubenswrapper[4918]: I0319 16:45:52.831655 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pvrk8" Mar 19 16:45:54 crc kubenswrapper[4918]: I0319 16:45:54.112508 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lmkmp" Mar 19 16:45:54 crc kubenswrapper[4918]: I0319 16:45:54.112601 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lmkmp" Mar 19 16:45:54 crc kubenswrapper[4918]: I0319 16:45:54.698064 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-z2gzl" Mar 19 16:45:54 crc kubenswrapper[4918]: I0319 16:45:54.698415 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-z2gzl" Mar 19 16:45:54 crc kubenswrapper[4918]: I0319 16:45:54.746468 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-z2gzl" Mar 19 16:45:54 crc kubenswrapper[4918]: I0319 16:45:54.814577 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-z2gzl" Mar 19 16:45:55 crc kubenswrapper[4918]: I0319 16:45:55.164565 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lmkmp" podUID="30a1a963-759d-404c-8e73-0df8c5a73a59" containerName="registry-server" probeResult="failure" output=< Mar 19 16:45:55 crc kubenswrapper[4918]: timeout: failed to connect service ":50051" within 1s Mar 19 16:45:55 crc kubenswrapper[4918]: > Mar 19 16:45:57 crc kubenswrapper[4918]: I0319 16:45:57.900152 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-image-registry/image-registry-66df7c8f76-x7x85" Mar 19 16:45:57 crc kubenswrapper[4918]: I0319 16:45:57.953766 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-khrx9"] Mar 19 16:46:00 crc kubenswrapper[4918]: I0319 16:46:00.132163 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565646-fxttn"] Mar 19 16:46:00 crc kubenswrapper[4918]: I0319 16:46:00.133625 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565646-fxttn" Mar 19 16:46:00 crc kubenswrapper[4918]: I0319 16:46:00.141858 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 16:46:00 crc kubenswrapper[4918]: I0319 16:46:00.141994 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 16:46:00 crc kubenswrapper[4918]: I0319 16:46:00.142074 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 16:46:00 crc kubenswrapper[4918]: I0319 16:46:00.144512 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565646-fxttn"] Mar 19 16:46:00 crc kubenswrapper[4918]: I0319 16:46:00.280253 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkm6t\" (UniqueName: \"kubernetes.io/projected/a6426340-a102-45d1-b1d8-0b347430c764-kube-api-access-wkm6t\") pod \"auto-csr-approver-29565646-fxttn\" (UID: \"a6426340-a102-45d1-b1d8-0b347430c764\") " pod="openshift-infra/auto-csr-approver-29565646-fxttn" Mar 19 16:46:00 crc kubenswrapper[4918]: I0319 16:46:00.381299 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkm6t\" (UniqueName: 
\"kubernetes.io/projected/a6426340-a102-45d1-b1d8-0b347430c764-kube-api-access-wkm6t\") pod \"auto-csr-approver-29565646-fxttn\" (UID: \"a6426340-a102-45d1-b1d8-0b347430c764\") " pod="openshift-infra/auto-csr-approver-29565646-fxttn" Mar 19 16:46:00 crc kubenswrapper[4918]: I0319 16:46:00.412331 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkm6t\" (UniqueName: \"kubernetes.io/projected/a6426340-a102-45d1-b1d8-0b347430c764-kube-api-access-wkm6t\") pod \"auto-csr-approver-29565646-fxttn\" (UID: \"a6426340-a102-45d1-b1d8-0b347430c764\") " pod="openshift-infra/auto-csr-approver-29565646-fxttn" Mar 19 16:46:00 crc kubenswrapper[4918]: I0319 16:46:00.454261 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565646-fxttn" Mar 19 16:46:00 crc kubenswrapper[4918]: I0319 16:46:00.728740 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565646-fxttn"] Mar 19 16:46:00 crc kubenswrapper[4918]: I0319 16:46:00.812992 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565646-fxttn" event={"ID":"a6426340-a102-45d1-b1d8-0b347430c764","Type":"ContainerStarted","Data":"d81c92288102e50572525a9bcde3b621b58d0f8445270a1a6ae0a1e02d8f7a41"} Mar 19 16:46:02 crc kubenswrapper[4918]: I0319 16:46:02.827125 4918 generic.go:334] "Generic (PLEG): container finished" podID="a6426340-a102-45d1-b1d8-0b347430c764" containerID="2333a985f057bcdb5f4dc1c789cdd72e1fbd69cd342e6ba9b32b4c0342e5d040" exitCode=0 Mar 19 16:46:02 crc kubenswrapper[4918]: I0319 16:46:02.827241 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565646-fxttn" event={"ID":"a6426340-a102-45d1-b1d8-0b347430c764","Type":"ContainerDied","Data":"2333a985f057bcdb5f4dc1c789cdd72e1fbd69cd342e6ba9b32b4c0342e5d040"} Mar 19 16:46:04 crc kubenswrapper[4918]: I0319 16:46:04.156164 4918 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565646-fxttn" Mar 19 16:46:04 crc kubenswrapper[4918]: I0319 16:46:04.193729 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lmkmp" Mar 19 16:46:04 crc kubenswrapper[4918]: I0319 16:46:04.238294 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkm6t\" (UniqueName: \"kubernetes.io/projected/a6426340-a102-45d1-b1d8-0b347430c764-kube-api-access-wkm6t\") pod \"a6426340-a102-45d1-b1d8-0b347430c764\" (UID: \"a6426340-a102-45d1-b1d8-0b347430c764\") " Mar 19 16:46:04 crc kubenswrapper[4918]: I0319 16:46:04.244413 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lmkmp" Mar 19 16:46:04 crc kubenswrapper[4918]: I0319 16:46:04.247001 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6426340-a102-45d1-b1d8-0b347430c764-kube-api-access-wkm6t" (OuterVolumeSpecName: "kube-api-access-wkm6t") pod "a6426340-a102-45d1-b1d8-0b347430c764" (UID: "a6426340-a102-45d1-b1d8-0b347430c764"). InnerVolumeSpecName "kube-api-access-wkm6t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:46:04 crc kubenswrapper[4918]: I0319 16:46:04.340272 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkm6t\" (UniqueName: \"kubernetes.io/projected/a6426340-a102-45d1-b1d8-0b347430c764-kube-api-access-wkm6t\") on node \"crc\" DevicePath \"\"" Mar 19 16:46:04 crc kubenswrapper[4918]: I0319 16:46:04.850661 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565646-fxttn" event={"ID":"a6426340-a102-45d1-b1d8-0b347430c764","Type":"ContainerDied","Data":"d81c92288102e50572525a9bcde3b621b58d0f8445270a1a6ae0a1e02d8f7a41"} Mar 19 16:46:04 crc kubenswrapper[4918]: I0319 16:46:04.850719 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d81c92288102e50572525a9bcde3b621b58d0f8445270a1a6ae0a1e02d8f7a41" Mar 19 16:46:04 crc kubenswrapper[4918]: I0319 16:46:04.850688 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565646-fxttn" Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.001258 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" podUID="e9417c6d-34fd-465b-b780-b88ee938f824" containerName="registry" containerID="cri-o://a4c65e089a7972ebe6325923ad2eef8005de36fab74abb8a5ce77f9506f3f8a8" gracePeriod=30 Mar 19 16:46:23 crc kubenswrapper[4918]: E0319 16:46:23.156160 4918 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9417c6d_34fd_465b_b780_b88ee938f824.slice/crio-a4c65e089a7972ebe6325923ad2eef8005de36fab74abb8a5ce77f9506f3f8a8.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9417c6d_34fd_465b_b780_b88ee938f824.slice/crio-conmon-a4c65e089a7972ebe6325923ad2eef8005de36fab74abb8a5ce77f9506f3f8a8.scope\": RecentStats: unable to find data in memory cache]" Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.438378 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.610568 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e9417c6d-34fd-465b-b780-b88ee938f824-ca-trust-extracted\") pod \"e9417c6d-34fd-465b-b780-b88ee938f824\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.610636 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e9417c6d-34fd-465b-b780-b88ee938f824-installation-pull-secrets\") pod \"e9417c6d-34fd-465b-b780-b88ee938f824\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.610693 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e9417c6d-34fd-465b-b780-b88ee938f824-registry-certificates\") pod \"e9417c6d-34fd-465b-b780-b88ee938f824\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.610748 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9417c6d-34fd-465b-b780-b88ee938f824-registry-tls\") pod \"e9417c6d-34fd-465b-b780-b88ee938f824\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.610806 4918 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9417c6d-34fd-465b-b780-b88ee938f824-trusted-ca\") pod \"e9417c6d-34fd-465b-b780-b88ee938f824\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.610863 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llzxz\" (UniqueName: \"kubernetes.io/projected/e9417c6d-34fd-465b-b780-b88ee938f824-kube-api-access-llzxz\") pod \"e9417c6d-34fd-465b-b780-b88ee938f824\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.611157 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"e9417c6d-34fd-465b-b780-b88ee938f824\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.611195 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9417c6d-34fd-465b-b780-b88ee938f824-bound-sa-token\") pod \"e9417c6d-34fd-465b-b780-b88ee938f824\" (UID: \"e9417c6d-34fd-465b-b780-b88ee938f824\") " Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.612857 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9417c6d-34fd-465b-b780-b88ee938f824-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e9417c6d-34fd-465b-b780-b88ee938f824" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.613354 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9417c6d-34fd-465b-b780-b88ee938f824-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e9417c6d-34fd-465b-b780-b88ee938f824" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.620498 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9417c6d-34fd-465b-b780-b88ee938f824-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e9417c6d-34fd-465b-b780-b88ee938f824" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.620734 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9417c6d-34fd-465b-b780-b88ee938f824-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e9417c6d-34fd-465b-b780-b88ee938f824" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.621774 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9417c6d-34fd-465b-b780-b88ee938f824-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e9417c6d-34fd-465b-b780-b88ee938f824" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.625607 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "e9417c6d-34fd-465b-b780-b88ee938f824" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.626913 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9417c6d-34fd-465b-b780-b88ee938f824-kube-api-access-llzxz" (OuterVolumeSpecName: "kube-api-access-llzxz") pod "e9417c6d-34fd-465b-b780-b88ee938f824" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824"). InnerVolumeSpecName "kube-api-access-llzxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.635671 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9417c6d-34fd-465b-b780-b88ee938f824-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e9417c6d-34fd-465b-b780-b88ee938f824" (UID: "e9417c6d-34fd-465b-b780-b88ee938f824"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.713395 4918 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e9417c6d-34fd-465b-b780-b88ee938f824-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.713887 4918 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e9417c6d-34fd-465b-b780-b88ee938f824-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.714059 4918 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e9417c6d-34fd-465b-b780-b88ee938f824-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.714096 4918 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9417c6d-34fd-465b-b780-b88ee938f824-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.714115 4918 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9417c6d-34fd-465b-b780-b88ee938f824-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.714134 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llzxz\" (UniqueName: \"kubernetes.io/projected/e9417c6d-34fd-465b-b780-b88ee938f824-kube-api-access-llzxz\") on node \"crc\" DevicePath \"\"" Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.714151 4918 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9417c6d-34fd-465b-b780-b88ee938f824-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 16:46:23 crc 
kubenswrapper[4918]: I0319 16:46:23.969133 4918 generic.go:334] "Generic (PLEG): container finished" podID="e9417c6d-34fd-465b-b780-b88ee938f824" containerID="a4c65e089a7972ebe6325923ad2eef8005de36fab74abb8a5ce77f9506f3f8a8" exitCode=0 Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.969206 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.969249 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" event={"ID":"e9417c6d-34fd-465b-b780-b88ee938f824","Type":"ContainerDied","Data":"a4c65e089a7972ebe6325923ad2eef8005de36fab74abb8a5ce77f9506f3f8a8"} Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.969824 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-khrx9" event={"ID":"e9417c6d-34fd-465b-b780-b88ee938f824","Type":"ContainerDied","Data":"d5f6e5100f49d9af21f45374ef41a705d684176cd304ef438a32b718dd285378"} Mar 19 16:46:23 crc kubenswrapper[4918]: I0319 16:46:23.969850 4918 scope.go:117] "RemoveContainer" containerID="a4c65e089a7972ebe6325923ad2eef8005de36fab74abb8a5ce77f9506f3f8a8" Mar 19 16:46:24 crc kubenswrapper[4918]: I0319 16:46:24.001362 4918 scope.go:117] "RemoveContainer" containerID="a4c65e089a7972ebe6325923ad2eef8005de36fab74abb8a5ce77f9506f3f8a8" Mar 19 16:46:24 crc kubenswrapper[4918]: E0319 16:46:24.001952 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4c65e089a7972ebe6325923ad2eef8005de36fab74abb8a5ce77f9506f3f8a8\": container with ID starting with a4c65e089a7972ebe6325923ad2eef8005de36fab74abb8a5ce77f9506f3f8a8 not found: ID does not exist" containerID="a4c65e089a7972ebe6325923ad2eef8005de36fab74abb8a5ce77f9506f3f8a8" Mar 19 16:46:24 crc kubenswrapper[4918]: I0319 16:46:24.004310 4918 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4c65e089a7972ebe6325923ad2eef8005de36fab74abb8a5ce77f9506f3f8a8"} err="failed to get container status \"a4c65e089a7972ebe6325923ad2eef8005de36fab74abb8a5ce77f9506f3f8a8\": rpc error: code = NotFound desc = could not find container \"a4c65e089a7972ebe6325923ad2eef8005de36fab74abb8a5ce77f9506f3f8a8\": container with ID starting with a4c65e089a7972ebe6325923ad2eef8005de36fab74abb8a5ce77f9506f3f8a8 not found: ID does not exist" Mar 19 16:46:24 crc kubenswrapper[4918]: I0319 16:46:24.018868 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-khrx9"] Mar 19 16:46:24 crc kubenswrapper[4918]: I0319 16:46:24.031337 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-khrx9"] Mar 19 16:46:24 crc kubenswrapper[4918]: I0319 16:46:24.604313 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9417c6d-34fd-465b-b780-b88ee938f824" path="/var/lib/kubelet/pods/e9417c6d-34fd-465b-b780-b88ee938f824/volumes" Mar 19 16:46:58 crc kubenswrapper[4918]: I0319 16:46:58.212454 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 16:46:58 crc kubenswrapper[4918]: I0319 16:46:58.213370 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 16:47:28 crc kubenswrapper[4918]: I0319 16:47:28.211979 4918 patch_prober.go:28] interesting 
pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 16:47:28 crc kubenswrapper[4918]: I0319 16:47:28.212581 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 16:47:58 crc kubenswrapper[4918]: I0319 16:47:58.212335 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 16:47:58 crc kubenswrapper[4918]: I0319 16:47:58.213127 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 16:47:58 crc kubenswrapper[4918]: I0319 16:47:58.213235 4918 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 16:47:58 crc kubenswrapper[4918]: I0319 16:47:58.214390 4918 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"860bd4886eb08f73670eccdb27b89ffff79615ee22a0d3ebbad5577184aa8a7e"} pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Mar 19 16:47:58 crc kubenswrapper[4918]: I0319 16:47:58.214482 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" containerID="cri-o://860bd4886eb08f73670eccdb27b89ffff79615ee22a0d3ebbad5577184aa8a7e" gracePeriod=600 Mar 19 16:47:58 crc kubenswrapper[4918]: I0319 16:47:58.630172 4918 generic.go:334] "Generic (PLEG): container finished" podID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerID="860bd4886eb08f73670eccdb27b89ffff79615ee22a0d3ebbad5577184aa8a7e" exitCode=0 Mar 19 16:47:58 crc kubenswrapper[4918]: I0319 16:47:58.630259 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerDied","Data":"860bd4886eb08f73670eccdb27b89ffff79615ee22a0d3ebbad5577184aa8a7e"} Mar 19 16:47:58 crc kubenswrapper[4918]: I0319 16:47:58.630537 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerStarted","Data":"c18ec4217b3a3ee71c0f992036fec96e3dee59c09db0e6a1b2184c49948e8e26"} Mar 19 16:47:58 crc kubenswrapper[4918]: I0319 16:47:58.630566 4918 scope.go:117] "RemoveContainer" containerID="4d840761dfd614dd8a3c1473b1242185e860c0c959af7cb2cf7f9c58ed3dceb0" Mar 19 16:48:00 crc kubenswrapper[4918]: I0319 16:48:00.131063 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565648-6jrtk"] Mar 19 16:48:00 crc kubenswrapper[4918]: E0319 16:48:00.131699 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6426340-a102-45d1-b1d8-0b347430c764" containerName="oc" Mar 19 16:48:00 crc kubenswrapper[4918]: I0319 16:48:00.131717 4918 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="a6426340-a102-45d1-b1d8-0b347430c764" containerName="oc" Mar 19 16:48:00 crc kubenswrapper[4918]: E0319 16:48:00.131729 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9417c6d-34fd-465b-b780-b88ee938f824" containerName="registry" Mar 19 16:48:00 crc kubenswrapper[4918]: I0319 16:48:00.131736 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9417c6d-34fd-465b-b780-b88ee938f824" containerName="registry" Mar 19 16:48:00 crc kubenswrapper[4918]: I0319 16:48:00.131842 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6426340-a102-45d1-b1d8-0b347430c764" containerName="oc" Mar 19 16:48:00 crc kubenswrapper[4918]: I0319 16:48:00.131860 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9417c6d-34fd-465b-b780-b88ee938f824" containerName="registry" Mar 19 16:48:00 crc kubenswrapper[4918]: I0319 16:48:00.132305 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565648-6jrtk" Mar 19 16:48:00 crc kubenswrapper[4918]: I0319 16:48:00.136111 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 16:48:00 crc kubenswrapper[4918]: I0319 16:48:00.136112 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 16:48:00 crc kubenswrapper[4918]: I0319 16:48:00.136188 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 16:48:00 crc kubenswrapper[4918]: I0319 16:48:00.143481 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqkw7\" (UniqueName: \"kubernetes.io/projected/dcff5a82-0812-414f-8918-7db313699c5e-kube-api-access-kqkw7\") pod \"auto-csr-approver-29565648-6jrtk\" (UID: \"dcff5a82-0812-414f-8918-7db313699c5e\") " 
pod="openshift-infra/auto-csr-approver-29565648-6jrtk" Mar 19 16:48:00 crc kubenswrapper[4918]: I0319 16:48:00.152407 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565648-6jrtk"] Mar 19 16:48:00 crc kubenswrapper[4918]: I0319 16:48:00.244871 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqkw7\" (UniqueName: \"kubernetes.io/projected/dcff5a82-0812-414f-8918-7db313699c5e-kube-api-access-kqkw7\") pod \"auto-csr-approver-29565648-6jrtk\" (UID: \"dcff5a82-0812-414f-8918-7db313699c5e\") " pod="openshift-infra/auto-csr-approver-29565648-6jrtk" Mar 19 16:48:00 crc kubenswrapper[4918]: I0319 16:48:00.267774 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqkw7\" (UniqueName: \"kubernetes.io/projected/dcff5a82-0812-414f-8918-7db313699c5e-kube-api-access-kqkw7\") pod \"auto-csr-approver-29565648-6jrtk\" (UID: \"dcff5a82-0812-414f-8918-7db313699c5e\") " pod="openshift-infra/auto-csr-approver-29565648-6jrtk" Mar 19 16:48:00 crc kubenswrapper[4918]: I0319 16:48:00.460136 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565648-6jrtk" Mar 19 16:48:00 crc kubenswrapper[4918]: I0319 16:48:00.735737 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565648-6jrtk"] Mar 19 16:48:00 crc kubenswrapper[4918]: W0319 16:48:00.739922 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcff5a82_0812_414f_8918_7db313699c5e.slice/crio-8c49fe601e542f985f97ede49d8c0605e25bc5ef06ddfc6664f60d38ef9ce2f9 WatchSource:0}: Error finding container 8c49fe601e542f985f97ede49d8c0605e25bc5ef06ddfc6664f60d38ef9ce2f9: Status 404 returned error can't find the container with id 8c49fe601e542f985f97ede49d8c0605e25bc5ef06ddfc6664f60d38ef9ce2f9 Mar 19 16:48:00 crc kubenswrapper[4918]: I0319 16:48:00.743855 4918 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 16:48:01 crc kubenswrapper[4918]: I0319 16:48:01.655899 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565648-6jrtk" event={"ID":"dcff5a82-0812-414f-8918-7db313699c5e","Type":"ContainerStarted","Data":"8c49fe601e542f985f97ede49d8c0605e25bc5ef06ddfc6664f60d38ef9ce2f9"} Mar 19 16:48:02 crc kubenswrapper[4918]: I0319 16:48:02.665006 4918 generic.go:334] "Generic (PLEG): container finished" podID="dcff5a82-0812-414f-8918-7db313699c5e" containerID="e7e5d8d2f12c8db1eb7a2642acd884b688a04a15efb6c098866fb480ba2c15cd" exitCode=0 Mar 19 16:48:02 crc kubenswrapper[4918]: I0319 16:48:02.665085 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565648-6jrtk" event={"ID":"dcff5a82-0812-414f-8918-7db313699c5e","Type":"ContainerDied","Data":"e7e5d8d2f12c8db1eb7a2642acd884b688a04a15efb6c098866fb480ba2c15cd"} Mar 19 16:48:04 crc kubenswrapper[4918]: I0319 16:48:04.043363 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565648-6jrtk" Mar 19 16:48:04 crc kubenswrapper[4918]: I0319 16:48:04.198507 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqkw7\" (UniqueName: \"kubernetes.io/projected/dcff5a82-0812-414f-8918-7db313699c5e-kube-api-access-kqkw7\") pod \"dcff5a82-0812-414f-8918-7db313699c5e\" (UID: \"dcff5a82-0812-414f-8918-7db313699c5e\") " Mar 19 16:48:04 crc kubenswrapper[4918]: I0319 16:48:04.205825 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcff5a82-0812-414f-8918-7db313699c5e-kube-api-access-kqkw7" (OuterVolumeSpecName: "kube-api-access-kqkw7") pod "dcff5a82-0812-414f-8918-7db313699c5e" (UID: "dcff5a82-0812-414f-8918-7db313699c5e"). InnerVolumeSpecName "kube-api-access-kqkw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:48:04 crc kubenswrapper[4918]: I0319 16:48:04.300211 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqkw7\" (UniqueName: \"kubernetes.io/projected/dcff5a82-0812-414f-8918-7db313699c5e-kube-api-access-kqkw7\") on node \"crc\" DevicePath \"\"" Mar 19 16:48:04 crc kubenswrapper[4918]: I0319 16:48:04.680910 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565648-6jrtk" event={"ID":"dcff5a82-0812-414f-8918-7db313699c5e","Type":"ContainerDied","Data":"8c49fe601e542f985f97ede49d8c0605e25bc5ef06ddfc6664f60d38ef9ce2f9"} Mar 19 16:48:04 crc kubenswrapper[4918]: I0319 16:48:04.680953 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c49fe601e542f985f97ede49d8c0605e25bc5ef06ddfc6664f60d38ef9ce2f9" Mar 19 16:48:04 crc kubenswrapper[4918]: I0319 16:48:04.680996 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565648-6jrtk" Mar 19 16:48:04 crc kubenswrapper[4918]: E0319 16:48:04.729979 4918 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcff5a82_0812_414f_8918_7db313699c5e.slice\": RecentStats: unable to find data in memory cache]" Mar 19 16:48:05 crc kubenswrapper[4918]: I0319 16:48:05.122715 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565642-wjgcp"] Mar 19 16:48:05 crc kubenswrapper[4918]: I0319 16:48:05.131644 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565642-wjgcp"] Mar 19 16:48:06 crc kubenswrapper[4918]: I0319 16:48:06.599668 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="490c710f-78b8-41a4-b4bc-4eeffdde7a5d" path="/var/lib/kubelet/pods/490c710f-78b8-41a4-b4bc-4eeffdde7a5d/volumes" Mar 19 16:48:49 crc kubenswrapper[4918]: I0319 16:48:49.382346 4918 scope.go:117] "RemoveContainer" containerID="e9fe3eb56d309db503acd4817def0be0efeb8d8d1a1463333f4a86bc82b3d90c" Mar 19 16:48:49 crc kubenswrapper[4918]: I0319 16:48:49.409604 4918 scope.go:117] "RemoveContainer" containerID="f2e796adf0785bd5590b7388cfa305a7840be9acc6d030eec8f3958bfb276e6f" Mar 19 16:48:49 crc kubenswrapper[4918]: I0319 16:48:49.433924 4918 scope.go:117] "RemoveContainer" containerID="c87f4c1206fb07b4f1360b38faf71bb75a719a809aa3384559ea7e1d6b46d707" Mar 19 16:49:49 crc kubenswrapper[4918]: I0319 16:49:49.492642 4918 scope.go:117] "RemoveContainer" containerID="b9476c8c22fb672b08a1def9b9d4fd0d8e6e3a1361958274a31c95abdd29b83a" Mar 19 16:49:58 crc kubenswrapper[4918]: I0319 16:49:58.212257 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 16:49:58 crc kubenswrapper[4918]: I0319 16:49:58.212970 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 16:50:00 crc kubenswrapper[4918]: I0319 16:50:00.150884 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565650-qv7bd"] Mar 19 16:50:00 crc kubenswrapper[4918]: E0319 16:50:00.151246 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcff5a82-0812-414f-8918-7db313699c5e" containerName="oc" Mar 19 16:50:00 crc kubenswrapper[4918]: I0319 16:50:00.151270 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcff5a82-0812-414f-8918-7db313699c5e" containerName="oc" Mar 19 16:50:00 crc kubenswrapper[4918]: I0319 16:50:00.151435 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcff5a82-0812-414f-8918-7db313699c5e" containerName="oc" Mar 19 16:50:00 crc kubenswrapper[4918]: I0319 16:50:00.152074 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565650-qv7bd" Mar 19 16:50:00 crc kubenswrapper[4918]: I0319 16:50:00.155316 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 16:50:00 crc kubenswrapper[4918]: I0319 16:50:00.156343 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 16:50:00 crc kubenswrapper[4918]: I0319 16:50:00.156721 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 16:50:00 crc kubenswrapper[4918]: I0319 16:50:00.174022 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565650-qv7bd"] Mar 19 16:50:00 crc kubenswrapper[4918]: I0319 16:50:00.197023 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qrsf\" (UniqueName: \"kubernetes.io/projected/ec881118-0ccc-40c5-aecd-4e1fb8691288-kube-api-access-7qrsf\") pod \"auto-csr-approver-29565650-qv7bd\" (UID: \"ec881118-0ccc-40c5-aecd-4e1fb8691288\") " pod="openshift-infra/auto-csr-approver-29565650-qv7bd" Mar 19 16:50:00 crc kubenswrapper[4918]: I0319 16:50:00.299398 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qrsf\" (UniqueName: \"kubernetes.io/projected/ec881118-0ccc-40c5-aecd-4e1fb8691288-kube-api-access-7qrsf\") pod \"auto-csr-approver-29565650-qv7bd\" (UID: \"ec881118-0ccc-40c5-aecd-4e1fb8691288\") " pod="openshift-infra/auto-csr-approver-29565650-qv7bd" Mar 19 16:50:00 crc kubenswrapper[4918]: I0319 16:50:00.324613 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qrsf\" (UniqueName: \"kubernetes.io/projected/ec881118-0ccc-40c5-aecd-4e1fb8691288-kube-api-access-7qrsf\") pod \"auto-csr-approver-29565650-qv7bd\" (UID: \"ec881118-0ccc-40c5-aecd-4e1fb8691288\") " 
pod="openshift-infra/auto-csr-approver-29565650-qv7bd" Mar 19 16:50:00 crc kubenswrapper[4918]: I0319 16:50:00.480735 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565650-qv7bd" Mar 19 16:50:00 crc kubenswrapper[4918]: I0319 16:50:00.734344 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565650-qv7bd"] Mar 19 16:50:01 crc kubenswrapper[4918]: I0319 16:50:01.484747 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565650-qv7bd" event={"ID":"ec881118-0ccc-40c5-aecd-4e1fb8691288","Type":"ContainerStarted","Data":"b7caf702d10bd276aee49c8b62a9f3faa35440c90fc73b9c4d9bbce97ca19911"} Mar 19 16:50:02 crc kubenswrapper[4918]: I0319 16:50:02.495600 4918 generic.go:334] "Generic (PLEG): container finished" podID="ec881118-0ccc-40c5-aecd-4e1fb8691288" containerID="31a9104c699d65221949aa6a3dddec1ec55176353c8887000abb5b39d8efa122" exitCode=0 Mar 19 16:50:02 crc kubenswrapper[4918]: I0319 16:50:02.495704 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565650-qv7bd" event={"ID":"ec881118-0ccc-40c5-aecd-4e1fb8691288","Type":"ContainerDied","Data":"31a9104c699d65221949aa6a3dddec1ec55176353c8887000abb5b39d8efa122"} Mar 19 16:50:03 crc kubenswrapper[4918]: I0319 16:50:03.751147 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565650-qv7bd" Mar 19 16:50:03 crc kubenswrapper[4918]: I0319 16:50:03.848469 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qrsf\" (UniqueName: \"kubernetes.io/projected/ec881118-0ccc-40c5-aecd-4e1fb8691288-kube-api-access-7qrsf\") pod \"ec881118-0ccc-40c5-aecd-4e1fb8691288\" (UID: \"ec881118-0ccc-40c5-aecd-4e1fb8691288\") " Mar 19 16:50:03 crc kubenswrapper[4918]: I0319 16:50:03.858416 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec881118-0ccc-40c5-aecd-4e1fb8691288-kube-api-access-7qrsf" (OuterVolumeSpecName: "kube-api-access-7qrsf") pod "ec881118-0ccc-40c5-aecd-4e1fb8691288" (UID: "ec881118-0ccc-40c5-aecd-4e1fb8691288"). InnerVolumeSpecName "kube-api-access-7qrsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:50:03 crc kubenswrapper[4918]: I0319 16:50:03.950002 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qrsf\" (UniqueName: \"kubernetes.io/projected/ec881118-0ccc-40c5-aecd-4e1fb8691288-kube-api-access-7qrsf\") on node \"crc\" DevicePath \"\"" Mar 19 16:50:04 crc kubenswrapper[4918]: I0319 16:50:04.515303 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565650-qv7bd" event={"ID":"ec881118-0ccc-40c5-aecd-4e1fb8691288","Type":"ContainerDied","Data":"b7caf702d10bd276aee49c8b62a9f3faa35440c90fc73b9c4d9bbce97ca19911"} Mar 19 16:50:04 crc kubenswrapper[4918]: I0319 16:50:04.515690 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7caf702d10bd276aee49c8b62a9f3faa35440c90fc73b9c4d9bbce97ca19911" Mar 19 16:50:04 crc kubenswrapper[4918]: I0319 16:50:04.515424 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565650-qv7bd" Mar 19 16:50:04 crc kubenswrapper[4918]: I0319 16:50:04.820778 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565644-5c8pr"] Mar 19 16:50:04 crc kubenswrapper[4918]: I0319 16:50:04.826849 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565644-5c8pr"] Mar 19 16:50:06 crc kubenswrapper[4918]: I0319 16:50:06.594377 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f75fcd6-ecca-47f1-84dd-c3521d1f9583" path="/var/lib/kubelet/pods/0f75fcd6-ecca-47f1-84dd-c3521d1f9583/volumes" Mar 19 16:50:28 crc kubenswrapper[4918]: I0319 16:50:28.213950 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 16:50:28 crc kubenswrapper[4918]: I0319 16:50:28.216234 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 16:50:58 crc kubenswrapper[4918]: I0319 16:50:58.213280 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 16:50:58 crc kubenswrapper[4918]: I0319 16:50:58.213836 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" 
podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 16:50:58 crc kubenswrapper[4918]: I0319 16:50:58.213901 4918 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 16:50:58 crc kubenswrapper[4918]: I0319 16:50:58.214621 4918 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c18ec4217b3a3ee71c0f992036fec96e3dee59c09db0e6a1b2184c49948e8e26"} pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 16:50:58 crc kubenswrapper[4918]: I0319 16:50:58.214697 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" containerID="cri-o://c18ec4217b3a3ee71c0f992036fec96e3dee59c09db0e6a1b2184c49948e8e26" gracePeriod=600 Mar 19 16:50:58 crc kubenswrapper[4918]: I0319 16:50:58.868411 4918 generic.go:334] "Generic (PLEG): container finished" podID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerID="c18ec4217b3a3ee71c0f992036fec96e3dee59c09db0e6a1b2184c49948e8e26" exitCode=0 Mar 19 16:50:58 crc kubenswrapper[4918]: I0319 16:50:58.868479 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerDied","Data":"c18ec4217b3a3ee71c0f992036fec96e3dee59c09db0e6a1b2184c49948e8e26"} Mar 19 16:50:58 crc kubenswrapper[4918]: I0319 16:50:58.869113 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerStarted","Data":"565e47777606eec4fea1871e422ec1703bb3c3550c00f538a28da566b1063407"} Mar 19 16:50:58 crc kubenswrapper[4918]: I0319 16:50:58.869162 4918 scope.go:117] "RemoveContainer" containerID="860bd4886eb08f73670eccdb27b89ffff79615ee22a0d3ebbad5577184aa8a7e" Mar 19 16:51:13 crc kubenswrapper[4918]: I0319 16:51:13.042916 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9"] Mar 19 16:51:13 crc kubenswrapper[4918]: E0319 16:51:13.043735 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec881118-0ccc-40c5-aecd-4e1fb8691288" containerName="oc" Mar 19 16:51:13 crc kubenswrapper[4918]: I0319 16:51:13.043751 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec881118-0ccc-40c5-aecd-4e1fb8691288" containerName="oc" Mar 19 16:51:13 crc kubenswrapper[4918]: I0319 16:51:13.043907 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec881118-0ccc-40c5-aecd-4e1fb8691288" containerName="oc" Mar 19 16:51:13 crc kubenswrapper[4918]: I0319 16:51:13.044817 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9" Mar 19 16:51:13 crc kubenswrapper[4918]: I0319 16:51:13.047460 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 19 16:51:13 crc kubenswrapper[4918]: I0319 16:51:13.057929 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9"] Mar 19 16:51:13 crc kubenswrapper[4918]: I0319 16:51:13.094630 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csmm6\" (UniqueName: \"kubernetes.io/projected/584f93a6-d456-4bc0-89f1-71eef948d233-kube-api-access-csmm6\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9\" (UID: \"584f93a6-d456-4bc0-89f1-71eef948d233\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9" Mar 19 16:51:13 crc kubenswrapper[4918]: I0319 16:51:13.094808 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/584f93a6-d456-4bc0-89f1-71eef948d233-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9\" (UID: \"584f93a6-d456-4bc0-89f1-71eef948d233\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9" Mar 19 16:51:13 crc kubenswrapper[4918]: I0319 16:51:13.094916 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/584f93a6-d456-4bc0-89f1-71eef948d233-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9\" (UID: \"584f93a6-d456-4bc0-89f1-71eef948d233\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9" Mar 19 16:51:13 crc kubenswrapper[4918]: 
I0319 16:51:13.195641 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csmm6\" (UniqueName: \"kubernetes.io/projected/584f93a6-d456-4bc0-89f1-71eef948d233-kube-api-access-csmm6\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9\" (UID: \"584f93a6-d456-4bc0-89f1-71eef948d233\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9" Mar 19 16:51:13 crc kubenswrapper[4918]: I0319 16:51:13.195716 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/584f93a6-d456-4bc0-89f1-71eef948d233-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9\" (UID: \"584f93a6-d456-4bc0-89f1-71eef948d233\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9" Mar 19 16:51:13 crc kubenswrapper[4918]: I0319 16:51:13.195761 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/584f93a6-d456-4bc0-89f1-71eef948d233-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9\" (UID: \"584f93a6-d456-4bc0-89f1-71eef948d233\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9" Mar 19 16:51:13 crc kubenswrapper[4918]: I0319 16:51:13.196200 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/584f93a6-d456-4bc0-89f1-71eef948d233-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9\" (UID: \"584f93a6-d456-4bc0-89f1-71eef948d233\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9" Mar 19 16:51:13 crc kubenswrapper[4918]: I0319 16:51:13.196372 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/584f93a6-d456-4bc0-89f1-71eef948d233-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9\" (UID: \"584f93a6-d456-4bc0-89f1-71eef948d233\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9" Mar 19 16:51:13 crc kubenswrapper[4918]: I0319 16:51:13.228776 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csmm6\" (UniqueName: \"kubernetes.io/projected/584f93a6-d456-4bc0-89f1-71eef948d233-kube-api-access-csmm6\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9\" (UID: \"584f93a6-d456-4bc0-89f1-71eef948d233\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9" Mar 19 16:51:13 crc kubenswrapper[4918]: I0319 16:51:13.395662 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9" Mar 19 16:51:13 crc kubenswrapper[4918]: I0319 16:51:13.632359 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9"] Mar 19 16:51:13 crc kubenswrapper[4918]: I0319 16:51:13.976874 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9" event={"ID":"584f93a6-d456-4bc0-89f1-71eef948d233","Type":"ContainerStarted","Data":"51ff1e902e1c3d916349efdc6cb1557d672df9e015fb0327c4523d4af5953db7"} Mar 19 16:51:13 crc kubenswrapper[4918]: I0319 16:51:13.977221 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9" event={"ID":"584f93a6-d456-4bc0-89f1-71eef948d233","Type":"ContainerStarted","Data":"7a886af3389b93f8c7f92e568d30501915c0fbc802f127e604f08575f6ca7125"} Mar 19 16:51:14 crc kubenswrapper[4918]: I0319 16:51:14.984728 4918 
generic.go:334] "Generic (PLEG): container finished" podID="584f93a6-d456-4bc0-89f1-71eef948d233" containerID="51ff1e902e1c3d916349efdc6cb1557d672df9e015fb0327c4523d4af5953db7" exitCode=0 Mar 19 16:51:14 crc kubenswrapper[4918]: I0319 16:51:14.984798 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9" event={"ID":"584f93a6-d456-4bc0-89f1-71eef948d233","Type":"ContainerDied","Data":"51ff1e902e1c3d916349efdc6cb1557d672df9e015fb0327c4523d4af5953db7"} Mar 19 16:51:21 crc kubenswrapper[4918]: I0319 16:51:21.035186 4918 generic.go:334] "Generic (PLEG): container finished" podID="584f93a6-d456-4bc0-89f1-71eef948d233" containerID="6b51dc8ee93a67bcd43ec2b5d1d07311740cd7dd2db17b22ba06a1ebd34762de" exitCode=0 Mar 19 16:51:21 crc kubenswrapper[4918]: I0319 16:51:21.035270 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9" event={"ID":"584f93a6-d456-4bc0-89f1-71eef948d233","Type":"ContainerDied","Data":"6b51dc8ee93a67bcd43ec2b5d1d07311740cd7dd2db17b22ba06a1ebd34762de"} Mar 19 16:51:22 crc kubenswrapper[4918]: I0319 16:51:22.044929 4918 generic.go:334] "Generic (PLEG): container finished" podID="584f93a6-d456-4bc0-89f1-71eef948d233" containerID="ab00e63a3b5a12f9ec1f6a2a3af5e508b71e6082c93398962611a07fd7dd6484" exitCode=0 Mar 19 16:51:22 crc kubenswrapper[4918]: I0319 16:51:22.045041 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9" event={"ID":"584f93a6-d456-4bc0-89f1-71eef948d233","Type":"ContainerDied","Data":"ab00e63a3b5a12f9ec1f6a2a3af5e508b71e6082c93398962611a07fd7dd6484"} Mar 19 16:51:23 crc kubenswrapper[4918]: I0319 16:51:23.281000 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9" Mar 19 16:51:23 crc kubenswrapper[4918]: I0319 16:51:23.342453 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/584f93a6-d456-4bc0-89f1-71eef948d233-util\") pod \"584f93a6-d456-4bc0-89f1-71eef948d233\" (UID: \"584f93a6-d456-4bc0-89f1-71eef948d233\") " Mar 19 16:51:23 crc kubenswrapper[4918]: I0319 16:51:23.342801 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csmm6\" (UniqueName: \"kubernetes.io/projected/584f93a6-d456-4bc0-89f1-71eef948d233-kube-api-access-csmm6\") pod \"584f93a6-d456-4bc0-89f1-71eef948d233\" (UID: \"584f93a6-d456-4bc0-89f1-71eef948d233\") " Mar 19 16:51:23 crc kubenswrapper[4918]: I0319 16:51:23.342872 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/584f93a6-d456-4bc0-89f1-71eef948d233-bundle\") pod \"584f93a6-d456-4bc0-89f1-71eef948d233\" (UID: \"584f93a6-d456-4bc0-89f1-71eef948d233\") " Mar 19 16:51:23 crc kubenswrapper[4918]: I0319 16:51:23.345594 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/584f93a6-d456-4bc0-89f1-71eef948d233-bundle" (OuterVolumeSpecName: "bundle") pod "584f93a6-d456-4bc0-89f1-71eef948d233" (UID: "584f93a6-d456-4bc0-89f1-71eef948d233"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:51:23 crc kubenswrapper[4918]: I0319 16:51:23.351508 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/584f93a6-d456-4bc0-89f1-71eef948d233-kube-api-access-csmm6" (OuterVolumeSpecName: "kube-api-access-csmm6") pod "584f93a6-d456-4bc0-89f1-71eef948d233" (UID: "584f93a6-d456-4bc0-89f1-71eef948d233"). InnerVolumeSpecName "kube-api-access-csmm6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:51:23 crc kubenswrapper[4918]: I0319 16:51:23.356161 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/584f93a6-d456-4bc0-89f1-71eef948d233-util" (OuterVolumeSpecName: "util") pod "584f93a6-d456-4bc0-89f1-71eef948d233" (UID: "584f93a6-d456-4bc0-89f1-71eef948d233"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:51:23 crc kubenswrapper[4918]: I0319 16:51:23.444620 4918 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/584f93a6-d456-4bc0-89f1-71eef948d233-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 16:51:23 crc kubenswrapper[4918]: I0319 16:51:23.444676 4918 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/584f93a6-d456-4bc0-89f1-71eef948d233-util\") on node \"crc\" DevicePath \"\"" Mar 19 16:51:23 crc kubenswrapper[4918]: I0319 16:51:23.444704 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csmm6\" (UniqueName: \"kubernetes.io/projected/584f93a6-d456-4bc0-89f1-71eef948d233-kube-api-access-csmm6\") on node \"crc\" DevicePath \"\"" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.065640 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9" event={"ID":"584f93a6-d456-4bc0-89f1-71eef948d233","Type":"ContainerDied","Data":"7a886af3389b93f8c7f92e568d30501915c0fbc802f127e604f08575f6ca7125"} Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.066065 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a886af3389b93f8c7f92e568d30501915c0fbc802f127e604f08575f6ca7125" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.066329 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.078751 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-g7pf8"] Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.079148 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="ovn-controller" containerID="cri-o://bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3" gracePeriod=30 Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.079219 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="nbdb" containerID="cri-o://320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc" gracePeriod=30 Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.079271 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707" gracePeriod=30 Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.079317 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="kube-rbac-proxy-node" containerID="cri-o://341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6" gracePeriod=30 Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.079350 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" podUID="571f6589-a451-476a-9066-d348b85a81ac" 
containerName="ovn-acl-logging" containerID="cri-o://579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50" gracePeriod=30 Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.079251 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="northd" containerID="cri-o://8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90" gracePeriod=30 Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.079659 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="sbdb" containerID="cri-o://b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274" gracePeriod=30 Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.145354 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="ovnkube-controller" containerID="cri-o://e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e" gracePeriod=30 Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.478620 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g7pf8_571f6589-a451-476a-9066-d348b85a81ac/ovn-acl-logging/0.log" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.480550 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g7pf8_571f6589-a451-476a-9066-d348b85a81ac/ovn-controller/0.log" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.483095 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.550056 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-888rm"] Mar 19 16:51:24 crc kubenswrapper[4918]: E0319 16:51:24.550364 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.550393 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 16:51:24 crc kubenswrapper[4918]: E0319 16:51:24.550410 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="ovnkube-controller" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.550423 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="ovnkube-controller" Mar 19 16:51:24 crc kubenswrapper[4918]: E0319 16:51:24.550446 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="kube-rbac-proxy-node" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.550458 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="kube-rbac-proxy-node" Mar 19 16:51:24 crc kubenswrapper[4918]: E0319 16:51:24.550478 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="ovn-acl-logging" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.550491 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="ovn-acl-logging" Mar 19 16:51:24 crc kubenswrapper[4918]: E0319 16:51:24.550504 4918 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="northd" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.550515 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="northd" Mar 19 16:51:24 crc kubenswrapper[4918]: E0319 16:51:24.550557 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="584f93a6-d456-4bc0-89f1-71eef948d233" containerName="pull" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.550568 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="584f93a6-d456-4bc0-89f1-71eef948d233" containerName="pull" Mar 19 16:51:24 crc kubenswrapper[4918]: E0319 16:51:24.550586 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="nbdb" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.550598 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="nbdb" Mar 19 16:51:24 crc kubenswrapper[4918]: E0319 16:51:24.550617 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="584f93a6-d456-4bc0-89f1-71eef948d233" containerName="util" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.550628 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="584f93a6-d456-4bc0-89f1-71eef948d233" containerName="util" Mar 19 16:51:24 crc kubenswrapper[4918]: E0319 16:51:24.550647 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="sbdb" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.550659 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="sbdb" Mar 19 16:51:24 crc kubenswrapper[4918]: E0319 16:51:24.550673 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="ovn-controller" Mar 19 16:51:24 crc kubenswrapper[4918]: 
I0319 16:51:24.550684 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="ovn-controller" Mar 19 16:51:24 crc kubenswrapper[4918]: E0319 16:51:24.550699 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="kubecfg-setup" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.550713 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="kubecfg-setup" Mar 19 16:51:24 crc kubenswrapper[4918]: E0319 16:51:24.550731 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="584f93a6-d456-4bc0-89f1-71eef948d233" containerName="extract" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.550743 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="584f93a6-d456-4bc0-89f1-71eef948d233" containerName="extract" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.550895 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.550920 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="ovn-controller" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.550939 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="ovn-acl-logging" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.550956 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="kube-rbac-proxy-node" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.550972 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="ovnkube-controller" Mar 19 16:51:24 crc kubenswrapper[4918]: 
I0319 16:51:24.550986 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="nbdb" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.551003 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="584f93a6-d456-4bc0-89f1-71eef948d233" containerName="extract" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.551020 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="northd" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.551034 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="571f6589-a451-476a-9066-d348b85a81ac" containerName="sbdb" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.554259 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.578945 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-node-log\") pod \"571f6589-a451-476a-9066-d348b85a81ac\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.579000 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/571f6589-a451-476a-9066-d348b85a81ac-env-overrides\") pod \"571f6589-a451-476a-9066-d348b85a81ac\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.579045 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l48p7\" (UniqueName: \"kubernetes.io/projected/571f6589-a451-476a-9066-d348b85a81ac-kube-api-access-l48p7\") pod \"571f6589-a451-476a-9066-d348b85a81ac\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " Mar 19 16:51:24 
crc kubenswrapper[4918]: I0319 16:51:24.579068 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-systemd-units\") pod \"571f6589-a451-476a-9066-d348b85a81ac\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.579106 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-var-lib-openvswitch\") pod \"571f6589-a451-476a-9066-d348b85a81ac\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.579130 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-run-ovn-kubernetes\") pod \"571f6589-a451-476a-9066-d348b85a81ac\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.579152 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-cni-bin\") pod \"571f6589-a451-476a-9066-d348b85a81ac\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.579172 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-kubelet\") pod \"571f6589-a451-476a-9066-d348b85a81ac\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.579195 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-run-ovn\") pod \"571f6589-a451-476a-9066-d348b85a81ac\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.579223 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/571f6589-a451-476a-9066-d348b85a81ac-ovnkube-config\") pod \"571f6589-a451-476a-9066-d348b85a81ac\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.579246 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-etc-openvswitch\") pod \"571f6589-a451-476a-9066-d348b85a81ac\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.579270 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/571f6589-a451-476a-9066-d348b85a81ac-ovn-node-metrics-cert\") pod \"571f6589-a451-476a-9066-d348b85a81ac\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.579289 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-run-netns\") pod \"571f6589-a451-476a-9066-d348b85a81ac\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.579319 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-slash\") pod \"571f6589-a451-476a-9066-d348b85a81ac\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 
16:51:24.579341 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/571f6589-a451-476a-9066-d348b85a81ac-ovnkube-script-lib\") pod \"571f6589-a451-476a-9066-d348b85a81ac\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.579362 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-cni-netd\") pod \"571f6589-a451-476a-9066-d348b85a81ac\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.579384 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-var-lib-cni-networks-ovn-kubernetes\") pod \"571f6589-a451-476a-9066-d348b85a81ac\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.579414 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-run-openvswitch\") pod \"571f6589-a451-476a-9066-d348b85a81ac\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.579444 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-run-systemd\") pod \"571f6589-a451-476a-9066-d348b85a81ac\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.579480 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-log-socket\") pod \"571f6589-a451-476a-9066-d348b85a81ac\" (UID: \"571f6589-a451-476a-9066-d348b85a81ac\") " Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.579754 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-log-socket" (OuterVolumeSpecName: "log-socket") pod "571f6589-a451-476a-9066-d348b85a81ac" (UID: "571f6589-a451-476a-9066-d348b85a81ac"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.579801 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "571f6589-a451-476a-9066-d348b85a81ac" (UID: "571f6589-a451-476a-9066-d348b85a81ac"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.579847 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "571f6589-a451-476a-9066-d348b85a81ac" (UID: "571f6589-a451-476a-9066-d348b85a81ac"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.579934 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "571f6589-a451-476a-9066-d348b85a81ac" (UID: "571f6589-a451-476a-9066-d348b85a81ac"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.579986 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-slash" (OuterVolumeSpecName: "host-slash") pod "571f6589-a451-476a-9066-d348b85a81ac" (UID: "571f6589-a451-476a-9066-d348b85a81ac"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.579951 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "571f6589-a451-476a-9066-d348b85a81ac" (UID: "571f6589-a451-476a-9066-d348b85a81ac"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.580011 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "571f6589-a451-476a-9066-d348b85a81ac" (UID: "571f6589-a451-476a-9066-d348b85a81ac"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.580040 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "571f6589-a451-476a-9066-d348b85a81ac" (UID: "571f6589-a451-476a-9066-d348b85a81ac"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.580035 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "571f6589-a451-476a-9066-d348b85a81ac" (UID: "571f6589-a451-476a-9066-d348b85a81ac"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.580027 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-node-log" (OuterVolumeSpecName: "node-log") pod "571f6589-a451-476a-9066-d348b85a81ac" (UID: "571f6589-a451-476a-9066-d348b85a81ac"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.580082 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "571f6589-a451-476a-9066-d348b85a81ac" (UID: "571f6589-a451-476a-9066-d348b85a81ac"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.580084 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "571f6589-a451-476a-9066-d348b85a81ac" (UID: "571f6589-a451-476a-9066-d348b85a81ac"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.580089 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "571f6589-a451-476a-9066-d348b85a81ac" (UID: "571f6589-a451-476a-9066-d348b85a81ac"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.580038 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "571f6589-a451-476a-9066-d348b85a81ac" (UID: "571f6589-a451-476a-9066-d348b85a81ac"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.580544 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/571f6589-a451-476a-9066-d348b85a81ac-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "571f6589-a451-476a-9066-d348b85a81ac" (UID: "571f6589-a451-476a-9066-d348b85a81ac"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.580875 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/571f6589-a451-476a-9066-d348b85a81ac-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "571f6589-a451-476a-9066-d348b85a81ac" (UID: "571f6589-a451-476a-9066-d348b85a81ac"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.581483 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/571f6589-a451-476a-9066-d348b85a81ac-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "571f6589-a451-476a-9066-d348b85a81ac" (UID: "571f6589-a451-476a-9066-d348b85a81ac"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.587625 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/571f6589-a451-476a-9066-d348b85a81ac-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "571f6589-a451-476a-9066-d348b85a81ac" (UID: "571f6589-a451-476a-9066-d348b85a81ac"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.588678 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/571f6589-a451-476a-9066-d348b85a81ac-kube-api-access-l48p7" (OuterVolumeSpecName: "kube-api-access-l48p7") pod "571f6589-a451-476a-9066-d348b85a81ac" (UID: "571f6589-a451-476a-9066-d348b85a81ac"). InnerVolumeSpecName "kube-api-access-l48p7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.608345 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "571f6589-a451-476a-9066-d348b85a81ac" (UID: "571f6589-a451-476a-9066-d348b85a81ac"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.681148 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4a443d3c-04bd-4528-a34c-a711ff575bdc-ovn-node-metrics-cert\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.681216 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-etc-openvswitch\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.681243 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-host-run-ovn-kubernetes\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.681270 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-node-log\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.681321 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-host-cni-bin\") pod \"ovnkube-node-888rm\" (UID: 
\"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.681358 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-host-kubelet\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.681381 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-systemd-units\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.681406 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-run-systemd\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.681427 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-host-cni-netd\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.681453 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.681482 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv9v6\" (UniqueName: \"kubernetes.io/projected/4a443d3c-04bd-4528-a34c-a711ff575bdc-kube-api-access-kv9v6\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.681506 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4a443d3c-04bd-4528-a34c-a711ff575bdc-ovnkube-script-lib\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.681551 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-var-lib-openvswitch\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.681579 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-run-openvswitch\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.681604 4918 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-run-ovn\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.681628 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-host-slash\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.681647 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-log-socket\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.681669 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4a443d3c-04bd-4528-a34c-a711ff575bdc-env-overrides\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.681693 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4a443d3c-04bd-4528-a34c-a711ff575bdc-ovnkube-config\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.682369 4918 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-host-run-netns\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.682664 4918 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.682705 4918 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.682731 4918 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.682754 4918 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.682778 4918 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.682811 4918 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/571f6589-a451-476a-9066-d348b85a81ac-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.682837 4918 
reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.682861 4918 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/571f6589-a451-476a-9066-d348b85a81ac-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.682880 4918 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.682898 4918 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-slash\") on node \"crc\" DevicePath \"\"" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.682920 4918 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/571f6589-a451-476a-9066-d348b85a81ac-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.682938 4918 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.682958 4918 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.682979 4918 reconciler_common.go:293] "Volume detached for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.683000 4918 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.683019 4918 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-log-socket\") on node \"crc\" DevicePath \"\"" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.683038 4918 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-node-log\") on node \"crc\" DevicePath \"\"" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.683055 4918 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/571f6589-a451-476a-9066-d348b85a81ac-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.683075 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l48p7\" (UniqueName: \"kubernetes.io/projected/571f6589-a451-476a-9066-d348b85a81ac-kube-api-access-l48p7\") on node \"crc\" DevicePath \"\"" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.683094 4918 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/571f6589-a451-476a-9066-d348b85a81ac-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.784083 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-host-run-netns\") 
pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.784149 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4a443d3c-04bd-4528-a34c-a711ff575bdc-ovn-node-metrics-cert\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.784174 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-etc-openvswitch\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.784193 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-host-run-ovn-kubernetes\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.784216 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-node-log\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.784239 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-host-cni-bin\") pod \"ovnkube-node-888rm\" (UID: 
\"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.784267 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-etc-openvswitch\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.784270 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-host-kubelet\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.784310 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-host-kubelet\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.784330 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-systemd-units\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.784346 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-host-run-ovn-kubernetes\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" 
Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.784356 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-host-cni-netd\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.784379 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-node-log\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.784384 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-run-systemd\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.784408 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-host-cni-bin\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.784412 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.784436 4918 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-systemd-units\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.784440 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv9v6\" (UniqueName: \"kubernetes.io/projected/4a443d3c-04bd-4528-a34c-a711ff575bdc-kube-api-access-kv9v6\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.784464 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4a443d3c-04bd-4528-a34c-a711ff575bdc-ovnkube-script-lib\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.784489 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-var-lib-openvswitch\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.784578 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-run-openvswitch\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.784613 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-run-ovn\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.784637 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-host-slash\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.784659 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-log-socket\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.784679 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4a443d3c-04bd-4528-a34c-a711ff575bdc-env-overrides\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.784705 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4a443d3c-04bd-4528-a34c-a711ff575bdc-ovnkube-config\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.785708 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/4a443d3c-04bd-4528-a34c-a711ff575bdc-ovnkube-config\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.785775 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-var-lib-openvswitch\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.785821 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-run-openvswitch\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.785807 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-host-cni-netd\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.785859 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-run-ovn\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.785912 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-log-socket\") pod \"ovnkube-node-888rm\" (UID: 
\"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.785931 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-host-slash\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.786012 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-run-systemd\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.786088 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.784235 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a443d3c-04bd-4528-a34c-a711ff575bdc-host-run-netns\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.786470 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4a443d3c-04bd-4528-a34c-a711ff575bdc-env-overrides\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.787332 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4a443d3c-04bd-4528-a34c-a711ff575bdc-ovnkube-script-lib\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.791733 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4a443d3c-04bd-4528-a34c-a711ff575bdc-ovn-node-metrics-cert\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.817210 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv9v6\" (UniqueName: \"kubernetes.io/projected/4a443d3c-04bd-4528-a34c-a711ff575bdc-kube-api-access-kv9v6\") pod \"ovnkube-node-888rm\" (UID: \"4a443d3c-04bd-4528-a34c-a711ff575bdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:24 crc kubenswrapper[4918]: I0319 16:51:24.875922 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.072747 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m2sxj_c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3/kube-multus/0.log" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.072966 4918 generic.go:334] "Generic (PLEG): container finished" podID="c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3" containerID="782db47f818546c7d6da4119af00b714e772fafbd3679b80801b98111cab00b0" exitCode=2 Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.073063 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-m2sxj" event={"ID":"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3","Type":"ContainerDied","Data":"782db47f818546c7d6da4119af00b714e772fafbd3679b80801b98111cab00b0"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.074188 4918 scope.go:117] "RemoveContainer" containerID="782db47f818546c7d6da4119af00b714e772fafbd3679b80801b98111cab00b0" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.083437 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g7pf8_571f6589-a451-476a-9066-d348b85a81ac/ovn-acl-logging/0.log" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.084675 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g7pf8_571f6589-a451-476a-9066-d348b85a81ac/ovn-controller/0.log" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.089928 4918 generic.go:334] "Generic (PLEG): container finished" podID="571f6589-a451-476a-9066-d348b85a81ac" containerID="e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e" exitCode=0 Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.089986 4918 generic.go:334] "Generic (PLEG): container finished" podID="571f6589-a451-476a-9066-d348b85a81ac" containerID="b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274" exitCode=0 
Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090007 4918 generic.go:334] "Generic (PLEG): container finished" podID="571f6589-a451-476a-9066-d348b85a81ac" containerID="320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc" exitCode=0 Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090024 4918 generic.go:334] "Generic (PLEG): container finished" podID="571f6589-a451-476a-9066-d348b85a81ac" containerID="8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90" exitCode=0 Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090038 4918 generic.go:334] "Generic (PLEG): container finished" podID="571f6589-a451-476a-9066-d348b85a81ac" containerID="5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707" exitCode=0 Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090051 4918 generic.go:334] "Generic (PLEG): container finished" podID="571f6589-a451-476a-9066-d348b85a81ac" containerID="341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6" exitCode=0 Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090063 4918 generic.go:334] "Generic (PLEG): container finished" podID="571f6589-a451-476a-9066-d348b85a81ac" containerID="579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50" exitCode=143 Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090077 4918 generic.go:334] "Generic (PLEG): container finished" podID="571f6589-a451-476a-9066-d348b85a81ac" containerID="bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3" exitCode=143 Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090175 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" event={"ID":"571f6589-a451-476a-9066-d348b85a81ac","Type":"ContainerDied","Data":"e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090211 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090301 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" event={"ID":"571f6589-a451-476a-9066-d348b85a81ac","Type":"ContainerDied","Data":"b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090322 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" event={"ID":"571f6589-a451-476a-9066-d348b85a81ac","Type":"ContainerDied","Data":"320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090336 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" event={"ID":"571f6589-a451-476a-9066-d348b85a81ac","Type":"ContainerDied","Data":"8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090355 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" event={"ID":"571f6589-a451-476a-9066-d348b85a81ac","Type":"ContainerDied","Data":"5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090371 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" event={"ID":"571f6589-a451-476a-9066-d348b85a81ac","Type":"ContainerDied","Data":"341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090391 4918 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090408 4918 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090416 4918 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090428 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" event={"ID":"571f6589-a451-476a-9066-d348b85a81ac","Type":"ContainerDied","Data":"579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090442 4918 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090452 4918 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090461 4918 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090469 4918 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090477 4918 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707"} Mar 19 
16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090485 4918 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090493 4918 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090500 4918 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090507 4918 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090535 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" event={"ID":"571f6589-a451-476a-9066-d348b85a81ac","Type":"ContainerDied","Data":"bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090549 4918 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090559 4918 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090567 4918 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090575 4918 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090600 4918 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090608 4918 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090641 4918 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090653 4918 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090660 4918 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090673 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g7pf8" event={"ID":"571f6589-a451-476a-9066-d348b85a81ac","Type":"ContainerDied","Data":"92b50162d8ee8d283018f0ba26228837511a2a9caa6269acd14bd7dd2f97120c"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090698 4918 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090707 4918 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090715 4918 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090723 4918 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090732 4918 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090740 4918 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090748 4918 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090756 4918 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090784 4918 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.090810 4918 scope.go:117] "RemoveContainer" containerID="e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.096493 4918 generic.go:334] "Generic (PLEG): container finished" podID="4a443d3c-04bd-4528-a34c-a711ff575bdc" containerID="2a9b22607f514fc1368a3b0a5f14e78caa1a9e0c7b9f678788963f421be610c0" exitCode=0 Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.096574 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-888rm" event={"ID":"4a443d3c-04bd-4528-a34c-a711ff575bdc","Type":"ContainerDied","Data":"2a9b22607f514fc1368a3b0a5f14e78caa1a9e0c7b9f678788963f421be610c0"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.096613 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-888rm" event={"ID":"4a443d3c-04bd-4528-a34c-a711ff575bdc","Type":"ContainerStarted","Data":"0f265a801238d0775ea05c47b67c23330cc4a6327b07dce6c84f28b844af134a"} Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.135713 4918 scope.go:117] "RemoveContainer" containerID="b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.172036 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-g7pf8"] Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.177881 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-g7pf8"] Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.179735 4918 scope.go:117] "RemoveContainer" containerID="320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 
16:51:25.200486 4918 scope.go:117] "RemoveContainer" containerID="8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.222793 4918 scope.go:117] "RemoveContainer" containerID="5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.240257 4918 scope.go:117] "RemoveContainer" containerID="341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.264495 4918 scope.go:117] "RemoveContainer" containerID="579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.296283 4918 scope.go:117] "RemoveContainer" containerID="bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.333631 4918 scope.go:117] "RemoveContainer" containerID="174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.349065 4918 scope.go:117] "RemoveContainer" containerID="e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e" Mar 19 16:51:25 crc kubenswrapper[4918]: E0319 16:51:25.349481 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e\": container with ID starting with e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e not found: ID does not exist" containerID="e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.349572 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e"} err="failed to get container status 
\"e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e\": rpc error: code = NotFound desc = could not find container \"e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e\": container with ID starting with e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.349615 4918 scope.go:117] "RemoveContainer" containerID="b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274" Mar 19 16:51:25 crc kubenswrapper[4918]: E0319 16:51:25.350038 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274\": container with ID starting with b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274 not found: ID does not exist" containerID="b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.350094 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274"} err="failed to get container status \"b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274\": rpc error: code = NotFound desc = could not find container \"b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274\": container with ID starting with b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274 not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.350129 4918 scope.go:117] "RemoveContainer" containerID="320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc" Mar 19 16:51:25 crc kubenswrapper[4918]: E0319 16:51:25.350468 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc\": container with ID starting with 320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc not found: ID does not exist" containerID="320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.350507 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc"} err="failed to get container status \"320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc\": rpc error: code = NotFound desc = could not find container \"320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc\": container with ID starting with 320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.350557 4918 scope.go:117] "RemoveContainer" containerID="8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90" Mar 19 16:51:25 crc kubenswrapper[4918]: E0319 16:51:25.350993 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90\": container with ID starting with 8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90 not found: ID does not exist" containerID="8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.351018 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90"} err="failed to get container status \"8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90\": rpc error: code = NotFound desc = could not find container \"8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90\": container with ID 
starting with 8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90 not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.351034 4918 scope.go:117] "RemoveContainer" containerID="5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707" Mar 19 16:51:25 crc kubenswrapper[4918]: E0319 16:51:25.351380 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707\": container with ID starting with 5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707 not found: ID does not exist" containerID="5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.351401 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707"} err="failed to get container status \"5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707\": rpc error: code = NotFound desc = could not find container \"5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707\": container with ID starting with 5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707 not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.351414 4918 scope.go:117] "RemoveContainer" containerID="341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6" Mar 19 16:51:25 crc kubenswrapper[4918]: E0319 16:51:25.351884 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6\": container with ID starting with 341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6 not found: ID does not exist" containerID="341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6" Mar 19 
16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.351910 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6"} err="failed to get container status \"341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6\": rpc error: code = NotFound desc = could not find container \"341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6\": container with ID starting with 341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6 not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.351933 4918 scope.go:117] "RemoveContainer" containerID="579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50" Mar 19 16:51:25 crc kubenswrapper[4918]: E0319 16:51:25.352237 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50\": container with ID starting with 579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50 not found: ID does not exist" containerID="579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.352284 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50"} err="failed to get container status \"579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50\": rpc error: code = NotFound desc = could not find container \"579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50\": container with ID starting with 579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50 not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.352315 4918 scope.go:117] "RemoveContainer" 
containerID="bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3" Mar 19 16:51:25 crc kubenswrapper[4918]: E0319 16:51:25.352785 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3\": container with ID starting with bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3 not found: ID does not exist" containerID="bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.352808 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3"} err="failed to get container status \"bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3\": rpc error: code = NotFound desc = could not find container \"bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3\": container with ID starting with bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3 not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.352823 4918 scope.go:117] "RemoveContainer" containerID="174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f" Mar 19 16:51:25 crc kubenswrapper[4918]: E0319 16:51:25.353130 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f\": container with ID starting with 174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f not found: ID does not exist" containerID="174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.353173 4918 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f"} err="failed to get container status \"174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f\": rpc error: code = NotFound desc = could not find container \"174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f\": container with ID starting with 174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.353204 4918 scope.go:117] "RemoveContainer" containerID="e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.353508 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e"} err="failed to get container status \"e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e\": rpc error: code = NotFound desc = could not find container \"e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e\": container with ID starting with e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.353544 4918 scope.go:117] "RemoveContainer" containerID="b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.353873 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274"} err="failed to get container status \"b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274\": rpc error: code = NotFound desc = could not find container \"b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274\": container with ID starting with b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274 not found: ID does not 
exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.353892 4918 scope.go:117] "RemoveContainer" containerID="320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.354172 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc"} err="failed to get container status \"320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc\": rpc error: code = NotFound desc = could not find container \"320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc\": container with ID starting with 320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.354213 4918 scope.go:117] "RemoveContainer" containerID="8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.354493 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90"} err="failed to get container status \"8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90\": rpc error: code = NotFound desc = could not find container \"8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90\": container with ID starting with 8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90 not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.354531 4918 scope.go:117] "RemoveContainer" containerID="5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.354840 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707"} err="failed to get container status 
\"5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707\": rpc error: code = NotFound desc = could not find container \"5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707\": container with ID starting with 5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707 not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.354859 4918 scope.go:117] "RemoveContainer" containerID="341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.355220 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6"} err="failed to get container status \"341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6\": rpc error: code = NotFound desc = could not find container \"341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6\": container with ID starting with 341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6 not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.355241 4918 scope.go:117] "RemoveContainer" containerID="579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.355501 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50"} err="failed to get container status \"579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50\": rpc error: code = NotFound desc = could not find container \"579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50\": container with ID starting with 579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50 not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.355549 4918 scope.go:117] "RemoveContainer" 
containerID="bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.355889 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3"} err="failed to get container status \"bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3\": rpc error: code = NotFound desc = could not find container \"bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3\": container with ID starting with bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3 not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.355924 4918 scope.go:117] "RemoveContainer" containerID="174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.356306 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f"} err="failed to get container status \"174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f\": rpc error: code = NotFound desc = could not find container \"174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f\": container with ID starting with 174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.356337 4918 scope.go:117] "RemoveContainer" containerID="e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.356616 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e"} err="failed to get container status \"e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e\": rpc error: code = NotFound desc = could 
not find container \"e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e\": container with ID starting with e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.356652 4918 scope.go:117] "RemoveContainer" containerID="b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.357012 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274"} err="failed to get container status \"b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274\": rpc error: code = NotFound desc = could not find container \"b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274\": container with ID starting with b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274 not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.357034 4918 scope.go:117] "RemoveContainer" containerID="320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.357375 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc"} err="failed to get container status \"320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc\": rpc error: code = NotFound desc = could not find container \"320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc\": container with ID starting with 320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.357435 4918 scope.go:117] "RemoveContainer" containerID="8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 
16:51:25.357844 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90"} err="failed to get container status \"8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90\": rpc error: code = NotFound desc = could not find container \"8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90\": container with ID starting with 8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90 not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.357872 4918 scope.go:117] "RemoveContainer" containerID="5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.358330 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707"} err="failed to get container status \"5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707\": rpc error: code = NotFound desc = could not find container \"5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707\": container with ID starting with 5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707 not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.358367 4918 scope.go:117] "RemoveContainer" containerID="341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.358715 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6"} err="failed to get container status \"341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6\": rpc error: code = NotFound desc = could not find container \"341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6\": container with ID starting with 
341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6 not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.358753 4918 scope.go:117] "RemoveContainer" containerID="579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.359197 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50"} err="failed to get container status \"579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50\": rpc error: code = NotFound desc = could not find container \"579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50\": container with ID starting with 579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50 not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.359235 4918 scope.go:117] "RemoveContainer" containerID="bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.359494 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3"} err="failed to get container status \"bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3\": rpc error: code = NotFound desc = could not find container \"bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3\": container with ID starting with bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3 not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.359560 4918 scope.go:117] "RemoveContainer" containerID="174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.359939 4918 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f"} err="failed to get container status \"174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f\": rpc error: code = NotFound desc = could not find container \"174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f\": container with ID starting with 174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.359962 4918 scope.go:117] "RemoveContainer" containerID="e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.360264 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e"} err="failed to get container status \"e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e\": rpc error: code = NotFound desc = could not find container \"e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e\": container with ID starting with e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.360295 4918 scope.go:117] "RemoveContainer" containerID="b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.360656 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274"} err="failed to get container status \"b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274\": rpc error: code = NotFound desc = could not find container \"b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274\": container with ID starting with b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274 not found: ID does not 
exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.360689 4918 scope.go:117] "RemoveContainer" containerID="320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.360990 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc"} err="failed to get container status \"320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc\": rpc error: code = NotFound desc = could not find container \"320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc\": container with ID starting with 320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.361017 4918 scope.go:117] "RemoveContainer" containerID="8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.361309 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90"} err="failed to get container status \"8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90\": rpc error: code = NotFound desc = could not find container \"8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90\": container with ID starting with 8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90 not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.361349 4918 scope.go:117] "RemoveContainer" containerID="5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.361668 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707"} err="failed to get container status 
\"5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707\": rpc error: code = NotFound desc = could not find container \"5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707\": container with ID starting with 5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707 not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.361691 4918 scope.go:117] "RemoveContainer" containerID="341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.361982 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6"} err="failed to get container status \"341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6\": rpc error: code = NotFound desc = could not find container \"341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6\": container with ID starting with 341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6 not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.362023 4918 scope.go:117] "RemoveContainer" containerID="579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.362277 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50"} err="failed to get container status \"579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50\": rpc error: code = NotFound desc = could not find container \"579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50\": container with ID starting with 579dd8a5bb1304350027f67086243553c89e51677371700c430a0d7f00c5ad50 not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.362302 4918 scope.go:117] "RemoveContainer" 
containerID="bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.362613 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3"} err="failed to get container status \"bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3\": rpc error: code = NotFound desc = could not find container \"bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3\": container with ID starting with bcb44a4dc92ed9ffad3f7b6eb2c0d26de38b60178ed1c5ed92ec7d9a0cc751e3 not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.362640 4918 scope.go:117] "RemoveContainer" containerID="174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.362919 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f"} err="failed to get container status \"174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f\": rpc error: code = NotFound desc = could not find container \"174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f\": container with ID starting with 174b491f21f8749c3439f1c7da7fd5ab546b3ba8fa510e6e24f90d058e97668f not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.362943 4918 scope.go:117] "RemoveContainer" containerID="e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.363294 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e"} err="failed to get container status \"e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e\": rpc error: code = NotFound desc = could 
not find container \"e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e\": container with ID starting with e13b5a0985eb3c126ac7d11a13a535c6b9f532bde70ef6b8d2a5f5a8bdef9c5e not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.363318 4918 scope.go:117] "RemoveContainer" containerID="b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.363630 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274"} err="failed to get container status \"b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274\": rpc error: code = NotFound desc = could not find container \"b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274\": container with ID starting with b2f98e5e2341c26cdc2a8461f78c1f39da1c9b37d0f674073ea236536d98c274 not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.363673 4918 scope.go:117] "RemoveContainer" containerID="320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.363965 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc"} err="failed to get container status \"320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc\": rpc error: code = NotFound desc = could not find container \"320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc\": container with ID starting with 320c200d2dc42deb210382f9210dd629d7cbba4a9545b5fdf62cc806b475fadc not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.363990 4918 scope.go:117] "RemoveContainer" containerID="8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 
16:51:25.364263 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90"} err="failed to get container status \"8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90\": rpc error: code = NotFound desc = could not find container \"8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90\": container with ID starting with 8dbecd653ae3baac877f6570e342cf17bf833564e64e6efdb5ebd13645ee0f90 not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.364302 4918 scope.go:117] "RemoveContainer" containerID="5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.364729 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707"} err="failed to get container status \"5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707\": rpc error: code = NotFound desc = could not find container \"5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707\": container with ID starting with 5abba75dc14db52f602b13e6ac19249272c367843b5abfc4a05b49429d0a6707 not found: ID does not exist" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.364751 4918 scope.go:117] "RemoveContainer" containerID="341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6" Mar 19 16:51:25 crc kubenswrapper[4918]: I0319 16:51:25.365072 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6"} err="failed to get container status \"341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6\": rpc error: code = NotFound desc = could not find container \"341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6\": container with ID starting with 
341133ed338fffb9b1fb9b0044652a1597a0f7ccfdc1b07736297dfdc8e47fe6 not found: ID does not exist" Mar 19 16:51:26 crc kubenswrapper[4918]: I0319 16:51:26.109615 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-888rm" event={"ID":"4a443d3c-04bd-4528-a34c-a711ff575bdc","Type":"ContainerStarted","Data":"56380310403f7b1f8888046ff25c1cf9e4538aa417aa2fe92b946f226c028094"} Mar 19 16:51:26 crc kubenswrapper[4918]: I0319 16:51:26.109673 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-888rm" event={"ID":"4a443d3c-04bd-4528-a34c-a711ff575bdc","Type":"ContainerStarted","Data":"bb3bd09a6a163940a326567ea2518d580983050c73525fe6e53f014995965206"} Mar 19 16:51:26 crc kubenswrapper[4918]: I0319 16:51:26.109699 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-888rm" event={"ID":"4a443d3c-04bd-4528-a34c-a711ff575bdc","Type":"ContainerStarted","Data":"7878bb8fd8934e9ad5c45904f80f1379d63dee26eb62b3155322e96cf4d702e6"} Mar 19 16:51:26 crc kubenswrapper[4918]: I0319 16:51:26.109718 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-888rm" event={"ID":"4a443d3c-04bd-4528-a34c-a711ff575bdc","Type":"ContainerStarted","Data":"6e937a63aade6071a8c0e46b8dbb0f0c1e9cd6fabd63563538694eb0985f7882"} Mar 19 16:51:26 crc kubenswrapper[4918]: I0319 16:51:26.109734 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-888rm" event={"ID":"4a443d3c-04bd-4528-a34c-a711ff575bdc","Type":"ContainerStarted","Data":"fb14e6e4bb9ec0c68d7a4fb2f4cc530996bbe21a53298b14315a73a8291304e3"} Mar 19 16:51:26 crc kubenswrapper[4918]: I0319 16:51:26.109754 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-888rm" 
event={"ID":"4a443d3c-04bd-4528-a34c-a711ff575bdc","Type":"ContainerStarted","Data":"e3bbce8a31f94fbe0aab406ff36ef99d897e1628650ff87399f720bd0ecf48ae"} Mar 19 16:51:26 crc kubenswrapper[4918]: I0319 16:51:26.112015 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m2sxj_c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3/kube-multus/0.log" Mar 19 16:51:26 crc kubenswrapper[4918]: I0319 16:51:26.112087 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-m2sxj" event={"ID":"c60e5863-5ae9-4d4a-81d0-ec1a1e57e7d3","Type":"ContainerStarted","Data":"734d8c49c383d8921e49615ae1bd91c4564463d9dcfd277ee66b0f643670ac80"} Mar 19 16:51:26 crc kubenswrapper[4918]: I0319 16:51:26.595305 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="571f6589-a451-476a-9066-d348b85a81ac" path="/var/lib/kubelet/pods/571f6589-a451-476a-9066-d348b85a81ac/volumes" Mar 19 16:51:29 crc kubenswrapper[4918]: I0319 16:51:29.144321 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-888rm" event={"ID":"4a443d3c-04bd-4528-a34c-a711ff575bdc","Type":"ContainerStarted","Data":"71d8a4f24be20545fc690776be40ff17e067fc0c5450b9fa784a81ce3b299d9a"} Mar 19 16:51:31 crc kubenswrapper[4918]: I0319 16:51:31.160956 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-888rm" event={"ID":"4a443d3c-04bd-4528-a34c-a711ff575bdc","Type":"ContainerStarted","Data":"6ed96f718374fa68753c0378884c6585a1a537eebf8498826ea46fb09e00d30b"} Mar 19 16:51:31 crc kubenswrapper[4918]: I0319 16:51:31.161482 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:31 crc kubenswrapper[4918]: I0319 16:51:31.161495 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:31 crc kubenswrapper[4918]: I0319 16:51:31.161504 4918 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:31 crc kubenswrapper[4918]: I0319 16:51:31.226160 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:31 crc kubenswrapper[4918]: I0319 16:51:31.245205 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:31 crc kubenswrapper[4918]: I0319 16:51:31.295952 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-888rm" podStartSLOduration=7.295935597 podStartE2EDuration="7.295935597s" podCreationTimestamp="2026-03-19 16:51:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:51:31.211982488 +0000 UTC m=+703.334181736" watchObservedRunningTime="2026-03-19 16:51:31.295935597 +0000 UTC m=+703.418134845" Mar 19 16:51:36 crc kubenswrapper[4918]: I0319 16:51:36.764297 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-4j7mv"] Mar 19 16:51:36 crc kubenswrapper[4918]: I0319 16:51:36.765502 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-4j7mv" Mar 19 16:51:36 crc kubenswrapper[4918]: I0319 16:51:36.767572 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 19 16:51:36 crc kubenswrapper[4918]: I0319 16:51:36.774391 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 19 16:51:36 crc kubenswrapper[4918]: I0319 16:51:36.780003 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-4j7mv"] Mar 19 16:51:36 crc kubenswrapper[4918]: I0319 16:51:36.780216 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-dwgrr" Mar 19 16:51:36 crc kubenswrapper[4918]: I0319 16:51:36.861083 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sk8c\" (UniqueName: \"kubernetes.io/projected/7474b087-0f69-4aaa-a238-d3a0c1cee280-kube-api-access-7sk8c\") pod \"obo-prometheus-operator-8ff7d675-4j7mv\" (UID: \"7474b087-0f69-4aaa-a238-d3a0c1cee280\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-4j7mv" Mar 19 16:51:36 crc kubenswrapper[4918]: I0319 16:51:36.962462 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sk8c\" (UniqueName: \"kubernetes.io/projected/7474b087-0f69-4aaa-a238-d3a0c1cee280-kube-api-access-7sk8c\") pod \"obo-prometheus-operator-8ff7d675-4j7mv\" (UID: \"7474b087-0f69-4aaa-a238-d3a0c1cee280\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-4j7mv" Mar 19 16:51:36 crc kubenswrapper[4918]: I0319 16:51:36.991383 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sk8c\" (UniqueName: \"kubernetes.io/projected/7474b087-0f69-4aaa-a238-d3a0c1cee280-kube-api-access-7sk8c\") pod 
\"obo-prometheus-operator-8ff7d675-4j7mv\" (UID: \"7474b087-0f69-4aaa-a238-d3a0c1cee280\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-4j7mv" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.086179 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-4j7mv" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.150802 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8489576f5f-zlq67"] Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.151434 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8489576f5f-zlq67" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.153103 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-bhdtx" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.153104 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.174044 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8489576f5f-8w9fg"] Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.174687 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8489576f5f-8w9fg" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.186901 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8489576f5f-8w9fg"] Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.223455 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8489576f5f-zlq67"] Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.270172 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca575a9e-34c3-4601-9d7c-a0033202f67c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8489576f5f-8w9fg\" (UID: \"ca575a9e-34c3-4601-9d7c-a0033202f67c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8489576f5f-8w9fg" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.270224 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e20191af-404b-4afc-ba99-628844e9cf89-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8489576f5f-zlq67\" (UID: \"e20191af-404b-4afc-ba99-628844e9cf89\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8489576f5f-zlq67" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.270250 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca575a9e-34c3-4601-9d7c-a0033202f67c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8489576f5f-8w9fg\" (UID: \"ca575a9e-34c3-4601-9d7c-a0033202f67c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8489576f5f-8w9fg" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 
16:51:37.270672 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e20191af-404b-4afc-ba99-628844e9cf89-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8489576f5f-zlq67\" (UID: \"e20191af-404b-4afc-ba99-628844e9cf89\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8489576f5f-zlq67" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.371462 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e20191af-404b-4afc-ba99-628844e9cf89-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8489576f5f-zlq67\" (UID: \"e20191af-404b-4afc-ba99-628844e9cf89\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8489576f5f-zlq67" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.371594 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca575a9e-34c3-4601-9d7c-a0033202f67c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8489576f5f-8w9fg\" (UID: \"ca575a9e-34c3-4601-9d7c-a0033202f67c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8489576f5f-8w9fg" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.371632 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e20191af-404b-4afc-ba99-628844e9cf89-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8489576f5f-zlq67\" (UID: \"e20191af-404b-4afc-ba99-628844e9cf89\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8489576f5f-zlq67" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.371660 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/ca575a9e-34c3-4601-9d7c-a0033202f67c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8489576f5f-8w9fg\" (UID: \"ca575a9e-34c3-4601-9d7c-a0033202f67c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8489576f5f-8w9fg" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.379490 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e20191af-404b-4afc-ba99-628844e9cf89-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8489576f5f-zlq67\" (UID: \"e20191af-404b-4afc-ba99-628844e9cf89\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8489576f5f-zlq67" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.379551 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca575a9e-34c3-4601-9d7c-a0033202f67c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8489576f5f-8w9fg\" (UID: \"ca575a9e-34c3-4601-9d7c-a0033202f67c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8489576f5f-8w9fg" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.381984 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e20191af-404b-4afc-ba99-628844e9cf89-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8489576f5f-zlq67\" (UID: \"e20191af-404b-4afc-ba99-628844e9cf89\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8489576f5f-zlq67" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.394037 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca575a9e-34c3-4601-9d7c-a0033202f67c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8489576f5f-8w9fg\" (UID: \"ca575a9e-34c3-4601-9d7c-a0033202f67c\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-8489576f5f-8w9fg" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.398817 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-4j7mv"] Mar 19 16:51:37 crc kubenswrapper[4918]: W0319 16:51:37.404055 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7474b087_0f69_4aaa_a238_d3a0c1cee280.slice/crio-a4174ccd90710f9e0db3782ea79f23e69aa54d33ad9ae8aeb062031bb04c6354 WatchSource:0}: Error finding container a4174ccd90710f9e0db3782ea79f23e69aa54d33ad9ae8aeb062031bb04c6354: Status 404 returned error can't find the container with id a4174ccd90710f9e0db3782ea79f23e69aa54d33ad9ae8aeb062031bb04c6354 Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.462145 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-lt8g5"] Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.462842 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-lt8g5" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.464745 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-gdwzj" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.464887 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.470400 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8489576f5f-zlq67" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.474827 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-lt8g5"] Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.493996 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8489576f5f-8w9fg" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.574740 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/c04d1ac3-0ecd-4455-a83d-8b2468d2b7c2-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-lt8g5\" (UID: \"c04d1ac3-0ecd-4455-a83d-8b2468d2b7c2\") " pod="openshift-operators/observability-operator-6dd7dd855f-lt8g5" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.574836 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdg55\" (UniqueName: \"kubernetes.io/projected/c04d1ac3-0ecd-4455-a83d-8b2468d2b7c2-kube-api-access-zdg55\") pod \"observability-operator-6dd7dd855f-lt8g5\" (UID: \"c04d1ac3-0ecd-4455-a83d-8b2468d2b7c2\") " pod="openshift-operators/observability-operator-6dd7dd855f-lt8g5" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.676062 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/c04d1ac3-0ecd-4455-a83d-8b2468d2b7c2-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-lt8g5\" (UID: \"c04d1ac3-0ecd-4455-a83d-8b2468d2b7c2\") " pod="openshift-operators/observability-operator-6dd7dd855f-lt8g5" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.676200 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zdg55\" (UniqueName: \"kubernetes.io/projected/c04d1ac3-0ecd-4455-a83d-8b2468d2b7c2-kube-api-access-zdg55\") pod \"observability-operator-6dd7dd855f-lt8g5\" (UID: \"c04d1ac3-0ecd-4455-a83d-8b2468d2b7c2\") " pod="openshift-operators/observability-operator-6dd7dd855f-lt8g5" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.680664 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/c04d1ac3-0ecd-4455-a83d-8b2468d2b7c2-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-lt8g5\" (UID: \"c04d1ac3-0ecd-4455-a83d-8b2468d2b7c2\") " pod="openshift-operators/observability-operator-6dd7dd855f-lt8g5" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.698703 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdg55\" (UniqueName: \"kubernetes.io/projected/c04d1ac3-0ecd-4455-a83d-8b2468d2b7c2-kube-api-access-zdg55\") pod \"observability-operator-6dd7dd855f-lt8g5\" (UID: \"c04d1ac3-0ecd-4455-a83d-8b2468d2b7c2\") " pod="openshift-operators/observability-operator-6dd7dd855f-lt8g5" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.779256 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8489576f5f-zlq67"] Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.780926 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-lt8g5" Mar 19 16:51:37 crc kubenswrapper[4918]: W0319 16:51:37.800790 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode20191af_404b_4afc_ba99_628844e9cf89.slice/crio-580799405df392c9b7c1733fec8c4de87d6845a14099ddc47743a01cb1d2901c WatchSource:0}: Error finding container 580799405df392c9b7c1733fec8c4de87d6845a14099ddc47743a01cb1d2901c: Status 404 returned error can't find the container with id 580799405df392c9b7c1733fec8c4de87d6845a14099ddc47743a01cb1d2901c Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.837726 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8489576f5f-8w9fg"] Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.914239 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-6879549846-2tnz2"] Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.914902 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-6879549846-2tnz2" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.919074 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-lq49g" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.919434 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-service-cert" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.973600 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-6879549846-2tnz2"] Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.980987 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5ba838ce-ff3e-4419-b143-b42b064b28fd-apiservice-cert\") pod \"perses-operator-6879549846-2tnz2\" (UID: \"5ba838ce-ff3e-4419-b143-b42b064b28fd\") " pod="openshift-operators/perses-operator-6879549846-2tnz2" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.981026 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgrnl\" (UniqueName: \"kubernetes.io/projected/5ba838ce-ff3e-4419-b143-b42b064b28fd-kube-api-access-wgrnl\") pod \"perses-operator-6879549846-2tnz2\" (UID: \"5ba838ce-ff3e-4419-b143-b42b064b28fd\") " pod="openshift-operators/perses-operator-6879549846-2tnz2" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.981066 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/5ba838ce-ff3e-4419-b143-b42b064b28fd-openshift-service-ca\") pod \"perses-operator-6879549846-2tnz2\" (UID: \"5ba838ce-ff3e-4419-b143-b42b064b28fd\") " pod="openshift-operators/perses-operator-6879549846-2tnz2" Mar 19 16:51:37 crc kubenswrapper[4918]: I0319 16:51:37.981084 
4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5ba838ce-ff3e-4419-b143-b42b064b28fd-webhook-cert\") pod \"perses-operator-6879549846-2tnz2\" (UID: \"5ba838ce-ff3e-4419-b143-b42b064b28fd\") " pod="openshift-operators/perses-operator-6879549846-2tnz2" Mar 19 16:51:38 crc kubenswrapper[4918]: I0319 16:51:38.035044 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-lt8g5"] Mar 19 16:51:38 crc kubenswrapper[4918]: W0319 16:51:38.043060 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc04d1ac3_0ecd_4455_a83d_8b2468d2b7c2.slice/crio-5e04d428df91fcb912634ef82d1eaadcb252167638bdd7e94341a822e9807be9 WatchSource:0}: Error finding container 5e04d428df91fcb912634ef82d1eaadcb252167638bdd7e94341a822e9807be9: Status 404 returned error can't find the container with id 5e04d428df91fcb912634ef82d1eaadcb252167638bdd7e94341a822e9807be9 Mar 19 16:51:38 crc kubenswrapper[4918]: I0319 16:51:38.083806 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/5ba838ce-ff3e-4419-b143-b42b064b28fd-openshift-service-ca\") pod \"perses-operator-6879549846-2tnz2\" (UID: \"5ba838ce-ff3e-4419-b143-b42b064b28fd\") " pod="openshift-operators/perses-operator-6879549846-2tnz2" Mar 19 16:51:38 crc kubenswrapper[4918]: I0319 16:51:38.083857 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5ba838ce-ff3e-4419-b143-b42b064b28fd-webhook-cert\") pod \"perses-operator-6879549846-2tnz2\" (UID: \"5ba838ce-ff3e-4419-b143-b42b064b28fd\") " pod="openshift-operators/perses-operator-6879549846-2tnz2" Mar 19 16:51:38 crc kubenswrapper[4918]: I0319 16:51:38.083940 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5ba838ce-ff3e-4419-b143-b42b064b28fd-apiservice-cert\") pod \"perses-operator-6879549846-2tnz2\" (UID: \"5ba838ce-ff3e-4419-b143-b42b064b28fd\") " pod="openshift-operators/perses-operator-6879549846-2tnz2" Mar 19 16:51:38 crc kubenswrapper[4918]: I0319 16:51:38.083962 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgrnl\" (UniqueName: \"kubernetes.io/projected/5ba838ce-ff3e-4419-b143-b42b064b28fd-kube-api-access-wgrnl\") pod \"perses-operator-6879549846-2tnz2\" (UID: \"5ba838ce-ff3e-4419-b143-b42b064b28fd\") " pod="openshift-operators/perses-operator-6879549846-2tnz2" Mar 19 16:51:38 crc kubenswrapper[4918]: I0319 16:51:38.102606 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/5ba838ce-ff3e-4419-b143-b42b064b28fd-openshift-service-ca\") pod \"perses-operator-6879549846-2tnz2\" (UID: \"5ba838ce-ff3e-4419-b143-b42b064b28fd\") " pod="openshift-operators/perses-operator-6879549846-2tnz2" Mar 19 16:51:38 crc kubenswrapper[4918]: I0319 16:51:38.104061 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5ba838ce-ff3e-4419-b143-b42b064b28fd-webhook-cert\") pod \"perses-operator-6879549846-2tnz2\" (UID: \"5ba838ce-ff3e-4419-b143-b42b064b28fd\") " pod="openshift-operators/perses-operator-6879549846-2tnz2" Mar 19 16:51:38 crc kubenswrapper[4918]: I0319 16:51:38.118815 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgrnl\" (UniqueName: \"kubernetes.io/projected/5ba838ce-ff3e-4419-b143-b42b064b28fd-kube-api-access-wgrnl\") pod \"perses-operator-6879549846-2tnz2\" (UID: \"5ba838ce-ff3e-4419-b143-b42b064b28fd\") " pod="openshift-operators/perses-operator-6879549846-2tnz2" Mar 19 16:51:38 crc kubenswrapper[4918]: I0319 
16:51:38.122334 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5ba838ce-ff3e-4419-b143-b42b064b28fd-apiservice-cert\") pod \"perses-operator-6879549846-2tnz2\" (UID: \"5ba838ce-ff3e-4419-b143-b42b064b28fd\") " pod="openshift-operators/perses-operator-6879549846-2tnz2" Mar 19 16:51:38 crc kubenswrapper[4918]: I0319 16:51:38.198436 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8489576f5f-8w9fg" event={"ID":"ca575a9e-34c3-4601-9d7c-a0033202f67c","Type":"ContainerStarted","Data":"649d4a918dd70e3b6bb11abf9cbaf053ef24af4d40b3fc5d356663f085f82b47"} Mar 19 16:51:38 crc kubenswrapper[4918]: I0319 16:51:38.199384 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8489576f5f-zlq67" event={"ID":"e20191af-404b-4afc-ba99-628844e9cf89","Type":"ContainerStarted","Data":"580799405df392c9b7c1733fec8c4de87d6845a14099ddc47743a01cb1d2901c"} Mar 19 16:51:38 crc kubenswrapper[4918]: I0319 16:51:38.200297 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-4j7mv" event={"ID":"7474b087-0f69-4aaa-a238-d3a0c1cee280","Type":"ContainerStarted","Data":"a4174ccd90710f9e0db3782ea79f23e69aa54d33ad9ae8aeb062031bb04c6354"} Mar 19 16:51:38 crc kubenswrapper[4918]: I0319 16:51:38.201219 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-lt8g5" event={"ID":"c04d1ac3-0ecd-4455-a83d-8b2468d2b7c2","Type":"ContainerStarted","Data":"5e04d428df91fcb912634ef82d1eaadcb252167638bdd7e94341a822e9807be9"} Mar 19 16:51:38 crc kubenswrapper[4918]: I0319 16:51:38.245025 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-6879549846-2tnz2" Mar 19 16:51:38 crc kubenswrapper[4918]: I0319 16:51:38.493594 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-6879549846-2tnz2"] Mar 19 16:51:38 crc kubenswrapper[4918]: W0319 16:51:38.494296 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ba838ce_ff3e_4419_b143_b42b064b28fd.slice/crio-8f243911ca9c3c41f8ef69697dfc9e6ae216056f31c06636fe45eb7b8af930c5 WatchSource:0}: Error finding container 8f243911ca9c3c41f8ef69697dfc9e6ae216056f31c06636fe45eb7b8af930c5: Status 404 returned error can't find the container with id 8f243911ca9c3c41f8ef69697dfc9e6ae216056f31c06636fe45eb7b8af930c5 Mar 19 16:51:39 crc kubenswrapper[4918]: I0319 16:51:39.213949 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-6879549846-2tnz2" event={"ID":"5ba838ce-ff3e-4419-b143-b42b064b28fd","Type":"ContainerStarted","Data":"8f243911ca9c3c41f8ef69697dfc9e6ae216056f31c06636fe45eb7b8af930c5"} Mar 19 16:51:48 crc kubenswrapper[4918]: I0319 16:51:48.286378 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8489576f5f-zlq67" event={"ID":"e20191af-404b-4afc-ba99-628844e9cf89","Type":"ContainerStarted","Data":"b008709ec9ac36df149c9899a93ee9fb0110461dc6a3544b16e5cdf643c464b3"} Mar 19 16:51:48 crc kubenswrapper[4918]: I0319 16:51:48.287580 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-6879549846-2tnz2" event={"ID":"5ba838ce-ff3e-4419-b143-b42b064b28fd","Type":"ContainerStarted","Data":"0605c75d73989d226cf020d6adcc9fd202030d10f282b1141163f3afa76c5d15"} Mar 19 16:51:48 crc kubenswrapper[4918]: I0319 16:51:48.287886 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operators/perses-operator-6879549846-2tnz2" Mar 19 16:51:48 crc kubenswrapper[4918]: I0319 16:51:48.289268 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-lt8g5" event={"ID":"c04d1ac3-0ecd-4455-a83d-8b2468d2b7c2","Type":"ContainerStarted","Data":"ff24bfe0db50882ebea0e839932ea46140d9e58d35bf1dea560b104f1f677938"} Mar 19 16:51:48 crc kubenswrapper[4918]: I0319 16:51:48.289460 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-6dd7dd855f-lt8g5" Mar 19 16:51:48 crc kubenswrapper[4918]: I0319 16:51:48.290621 4918 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-lt8g5 container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.17:8081/healthz\": dial tcp 10.217.0.17:8081: connect: connection refused" start-of-body= Mar 19 16:51:48 crc kubenswrapper[4918]: I0319 16:51:48.290668 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-6dd7dd855f-lt8g5" podUID="c04d1ac3-0ecd-4455-a83d-8b2468d2b7c2" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.17:8081/healthz\": dial tcp 10.217.0.17:8081: connect: connection refused" Mar 19 16:51:48 crc kubenswrapper[4918]: I0319 16:51:48.291894 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8489576f5f-8w9fg" event={"ID":"ca575a9e-34c3-4601-9d7c-a0033202f67c","Type":"ContainerStarted","Data":"07e8150ac08370c2715de358e9fcaac7b3fcae5d99d86cf78d7fe88b940b7875"} Mar 19 16:51:48 crc kubenswrapper[4918]: I0319 16:51:48.307350 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8489576f5f-zlq67" podStartSLOduration=1.1442741459999999 podStartE2EDuration="11.307332745s" 
podCreationTimestamp="2026-03-19 16:51:37 +0000 UTC" firstStartedPulling="2026-03-19 16:51:37.822714756 +0000 UTC m=+709.944914004" lastFinishedPulling="2026-03-19 16:51:47.985773355 +0000 UTC m=+720.107972603" observedRunningTime="2026-03-19 16:51:48.307299234 +0000 UTC m=+720.429498512" watchObservedRunningTime="2026-03-19 16:51:48.307332745 +0000 UTC m=+720.429531993" Mar 19 16:51:48 crc kubenswrapper[4918]: I0319 16:51:48.360577 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-6dd7dd855f-lt8g5" podStartSLOduration=1.410636284 podStartE2EDuration="11.360543769s" podCreationTimestamp="2026-03-19 16:51:37 +0000 UTC" firstStartedPulling="2026-03-19 16:51:38.059514386 +0000 UTC m=+710.181713634" lastFinishedPulling="2026-03-19 16:51:48.009421871 +0000 UTC m=+720.131621119" observedRunningTime="2026-03-19 16:51:48.354726759 +0000 UTC m=+720.476926007" watchObservedRunningTime="2026-03-19 16:51:48.360543769 +0000 UTC m=+720.482743027" Mar 19 16:51:48 crc kubenswrapper[4918]: I0319 16:51:48.373724 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8489576f5f-8w9fg" podStartSLOduration=1.243311441 podStartE2EDuration="11.373704238s" podCreationTimestamp="2026-03-19 16:51:37 +0000 UTC" firstStartedPulling="2026-03-19 16:51:37.850462983 +0000 UTC m=+709.972662231" lastFinishedPulling="2026-03-19 16:51:47.98085578 +0000 UTC m=+720.103055028" observedRunningTime="2026-03-19 16:51:48.369583686 +0000 UTC m=+720.491782934" watchObservedRunningTime="2026-03-19 16:51:48.373704238 +0000 UTC m=+720.495903486" Mar 19 16:51:48 crc kubenswrapper[4918]: I0319 16:51:48.394569 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-6879549846-2tnz2" podStartSLOduration=1.8817220959999998 podStartE2EDuration="11.394553628s" podCreationTimestamp="2026-03-19 16:51:37 +0000 
UTC" firstStartedPulling="2026-03-19 16:51:38.49591924 +0000 UTC m=+710.618118488" lastFinishedPulling="2026-03-19 16:51:48.008750772 +0000 UTC m=+720.130950020" observedRunningTime="2026-03-19 16:51:48.391547757 +0000 UTC m=+720.513747005" watchObservedRunningTime="2026-03-19 16:51:48.394553628 +0000 UTC m=+720.516752876" Mar 19 16:51:49 crc kubenswrapper[4918]: I0319 16:51:49.298732 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-4j7mv" event={"ID":"7474b087-0f69-4aaa-a238-d3a0c1cee280","Type":"ContainerStarted","Data":"0fc82c2bcf1c625658cb76270ac0a1ab21e242faf3e3251bf8b00c9ba1f94ce4"} Mar 19 16:51:49 crc kubenswrapper[4918]: I0319 16:51:49.300352 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-6dd7dd855f-lt8g5" Mar 19 16:51:49 crc kubenswrapper[4918]: I0319 16:51:49.324263 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-8ff7d675-4j7mv" podStartSLOduration=2.7467209649999997 podStartE2EDuration="13.324245191s" podCreationTimestamp="2026-03-19 16:51:36 +0000 UTC" firstStartedPulling="2026-03-19 16:51:37.407393865 +0000 UTC m=+709.529593113" lastFinishedPulling="2026-03-19 16:51:47.984918091 +0000 UTC m=+720.107117339" observedRunningTime="2026-03-19 16:51:49.319723768 +0000 UTC m=+721.441923016" watchObservedRunningTime="2026-03-19 16:51:49.324245191 +0000 UTC m=+721.446444439" Mar 19 16:51:49 crc kubenswrapper[4918]: I0319 16:51:49.564705 4918 scope.go:117] "RemoveContainer" containerID="1a8629774e93e4581b9954cfe6afcd278d1928f9fd7274d68dc6f279e8e0afe5" Mar 19 16:51:54 crc kubenswrapper[4918]: I0319 16:51:54.906816 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-888rm" Mar 19 16:51:56 crc kubenswrapper[4918]: I0319 16:51:56.322564 4918 dynamic_cafile_content.go:123] "Loaded a new CA 
Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 16:51:56 crc kubenswrapper[4918]: I0319 16:51:56.660494 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-nrk5x"] Mar 19 16:51:56 crc kubenswrapper[4918]: I0319 16:51:56.661293 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-nrk5x" Mar 19 16:51:56 crc kubenswrapper[4918]: W0319 16:51:56.662673 4918 reflector.go:561] object-"cert-manager"/"cert-manager-dockercfg-dczqc": failed to list *v1.Secret: secrets "cert-manager-dockercfg-dczqc" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "cert-manager": no relationship found between node 'crc' and this object Mar 19 16:51:56 crc kubenswrapper[4918]: W0319 16:51:56.662710 4918 reflector.go:561] object-"cert-manager"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "cert-manager": no relationship found between node 'crc' and this object Mar 19 16:51:56 crc kubenswrapper[4918]: E0319 16:51:56.662732 4918 reflector.go:158] "Unhandled Error" err="object-\"cert-manager\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"cert-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 19 16:51:56 crc kubenswrapper[4918]: E0319 16:51:56.662730 4918 reflector.go:158] "Unhandled Error" err="object-\"cert-manager\"/\"cert-manager-dockercfg-dczqc\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cert-manager-dockercfg-dczqc\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" 
in API group \"\" in the namespace \"cert-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 19 16:51:56 crc kubenswrapper[4918]: W0319 16:51:56.664195 4918 reflector.go:561] object-"cert-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "cert-manager": no relationship found between node 'crc' and this object Mar 19 16:51:56 crc kubenswrapper[4918]: E0319 16:51:56.664328 4918 reflector.go:158] "Unhandled Error" err="object-\"cert-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"cert-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 19 16:51:56 crc kubenswrapper[4918]: I0319 16:51:56.668418 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-4fns8"] Mar 19 16:51:56 crc kubenswrapper[4918]: I0319 16:51:56.678796 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-4fns8" Mar 19 16:51:56 crc kubenswrapper[4918]: I0319 16:51:56.681899 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-nrk5x"] Mar 19 16:51:56 crc kubenswrapper[4918]: I0319 16:51:56.681936 4918 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-9k4br" Mar 19 16:51:56 crc kubenswrapper[4918]: I0319 16:51:56.686014 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-4fns8"] Mar 19 16:51:56 crc kubenswrapper[4918]: I0319 16:51:56.690503 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-6frc2"] Mar 19 16:51:56 crc kubenswrapper[4918]: I0319 16:51:56.691279 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-6frc2" Mar 19 16:51:56 crc kubenswrapper[4918]: I0319 16:51:56.693452 4918 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-r9ts8" Mar 19 16:51:56 crc kubenswrapper[4918]: I0319 16:51:56.713651 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-6frc2"] Mar 19 16:51:56 crc kubenswrapper[4918]: I0319 16:51:56.728403 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxv8f\" (UniqueName: \"kubernetes.io/projected/308372f5-f777-42d6-a57d-76ea6928f45e-kube-api-access-hxv8f\") pod \"cert-manager-cainjector-cf98fcc89-4fns8\" (UID: \"308372f5-f777-42d6-a57d-76ea6928f45e\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-4fns8" Mar 19 16:51:56 crc kubenswrapper[4918]: I0319 16:51:56.728482 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2hsv\" (UniqueName: 
\"kubernetes.io/projected/acfafa9a-b5e2-4385-8820-dcc726d38058-kube-api-access-d2hsv\") pod \"cert-manager-858654f9db-nrk5x\" (UID: \"acfafa9a-b5e2-4385-8820-dcc726d38058\") " pod="cert-manager/cert-manager-858654f9db-nrk5x" Mar 19 16:51:56 crc kubenswrapper[4918]: I0319 16:51:56.728545 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbptb\" (UniqueName: \"kubernetes.io/projected/e7570f30-edef-4862-bc00-e72ce89ff640-kube-api-access-rbptb\") pod \"cert-manager-webhook-687f57d79b-6frc2\" (UID: \"e7570f30-edef-4862-bc00-e72ce89ff640\") " pod="cert-manager/cert-manager-webhook-687f57d79b-6frc2" Mar 19 16:51:56 crc kubenswrapper[4918]: I0319 16:51:56.830059 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2hsv\" (UniqueName: \"kubernetes.io/projected/acfafa9a-b5e2-4385-8820-dcc726d38058-kube-api-access-d2hsv\") pod \"cert-manager-858654f9db-nrk5x\" (UID: \"acfafa9a-b5e2-4385-8820-dcc726d38058\") " pod="cert-manager/cert-manager-858654f9db-nrk5x" Mar 19 16:51:56 crc kubenswrapper[4918]: I0319 16:51:56.830125 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbptb\" (UniqueName: \"kubernetes.io/projected/e7570f30-edef-4862-bc00-e72ce89ff640-kube-api-access-rbptb\") pod \"cert-manager-webhook-687f57d79b-6frc2\" (UID: \"e7570f30-edef-4862-bc00-e72ce89ff640\") " pod="cert-manager/cert-manager-webhook-687f57d79b-6frc2" Mar 19 16:51:56 crc kubenswrapper[4918]: I0319 16:51:56.830160 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxv8f\" (UniqueName: \"kubernetes.io/projected/308372f5-f777-42d6-a57d-76ea6928f45e-kube-api-access-hxv8f\") pod \"cert-manager-cainjector-cf98fcc89-4fns8\" (UID: \"308372f5-f777-42d6-a57d-76ea6928f45e\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-4fns8" Mar 19 16:51:57 crc kubenswrapper[4918]: I0319 16:51:57.468989 4918 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 19 16:51:58 crc kubenswrapper[4918]: I0319 16:51:58.042940 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 19 16:51:58 crc kubenswrapper[4918]: I0319 16:51:58.052429 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxv8f\" (UniqueName: \"kubernetes.io/projected/308372f5-f777-42d6-a57d-76ea6928f45e-kube-api-access-hxv8f\") pod \"cert-manager-cainjector-cf98fcc89-4fns8\" (UID: \"308372f5-f777-42d6-a57d-76ea6928f45e\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-4fns8" Mar 19 16:51:58 crc kubenswrapper[4918]: I0319 16:51:58.053974 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbptb\" (UniqueName: \"kubernetes.io/projected/e7570f30-edef-4862-bc00-e72ce89ff640-kube-api-access-rbptb\") pod \"cert-manager-webhook-687f57d79b-6frc2\" (UID: \"e7570f30-edef-4862-bc00-e72ce89ff640\") " pod="cert-manager/cert-manager-webhook-687f57d79b-6frc2" Mar 19 16:51:58 crc kubenswrapper[4918]: I0319 16:51:58.056571 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2hsv\" (UniqueName: \"kubernetes.io/projected/acfafa9a-b5e2-4385-8820-dcc726d38058-kube-api-access-d2hsv\") pod \"cert-manager-858654f9db-nrk5x\" (UID: \"acfafa9a-b5e2-4385-8820-dcc726d38058\") " pod="cert-manager/cert-manager-858654f9db-nrk5x" Mar 19 16:51:58 crc kubenswrapper[4918]: I0319 16:51:58.165480 4918 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-dczqc" Mar 19 16:51:58 crc kubenswrapper[4918]: I0319 16:51:58.178549 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-nrk5x" Mar 19 16:51:58 crc kubenswrapper[4918]: I0319 16:51:58.193128 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-4fns8" Mar 19 16:51:58 crc kubenswrapper[4918]: I0319 16:51:58.227212 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-6frc2" Mar 19 16:51:58 crc kubenswrapper[4918]: I0319 16:51:58.247818 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-6879549846-2tnz2" Mar 19 16:51:58 crc kubenswrapper[4918]: I0319 16:51:58.606268 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-nrk5x"] Mar 19 16:51:58 crc kubenswrapper[4918]: I0319 16:51:58.633280 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-4fns8"] Mar 19 16:51:58 crc kubenswrapper[4918]: I0319 16:51:58.695109 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-6frc2"] Mar 19 16:51:59 crc kubenswrapper[4918]: I0319 16:51:59.380512 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-nrk5x" event={"ID":"acfafa9a-b5e2-4385-8820-dcc726d38058","Type":"ContainerStarted","Data":"e805e8ff1a714d6d807320a7121458d1d304a71b22ad27f754161e7bb5517d2f"} Mar 19 16:51:59 crc kubenswrapper[4918]: I0319 16:51:59.381681 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-6frc2" event={"ID":"e7570f30-edef-4862-bc00-e72ce89ff640","Type":"ContainerStarted","Data":"ea30974adc21af6bb471441651d582642017ae4eebac014ff9d16c408287c218"} Mar 19 16:51:59 crc kubenswrapper[4918]: I0319 16:51:59.382681 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-4fns8" event={"ID":"308372f5-f777-42d6-a57d-76ea6928f45e","Type":"ContainerStarted","Data":"94fc6a5aef709106e70664aacf8631cfe756134ee4909522c543cd763939ee5a"} Mar 19 16:52:00 crc 
kubenswrapper[4918]: I0319 16:52:00.128434 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565652-wcs4m"] Mar 19 16:52:00 crc kubenswrapper[4918]: I0319 16:52:00.129292 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565652-wcs4m" Mar 19 16:52:00 crc kubenswrapper[4918]: I0319 16:52:00.131050 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 16:52:00 crc kubenswrapper[4918]: I0319 16:52:00.131540 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 16:52:00 crc kubenswrapper[4918]: I0319 16:52:00.131883 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 16:52:00 crc kubenswrapper[4918]: I0319 16:52:00.149082 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565652-wcs4m"] Mar 19 16:52:00 crc kubenswrapper[4918]: I0319 16:52:00.182350 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc2x4\" (UniqueName: \"kubernetes.io/projected/31ec09cf-288d-4ac1-ab4a-9027d53ae433-kube-api-access-zc2x4\") pod \"auto-csr-approver-29565652-wcs4m\" (UID: \"31ec09cf-288d-4ac1-ab4a-9027d53ae433\") " pod="openshift-infra/auto-csr-approver-29565652-wcs4m" Mar 19 16:52:00 crc kubenswrapper[4918]: I0319 16:52:00.283922 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc2x4\" (UniqueName: \"kubernetes.io/projected/31ec09cf-288d-4ac1-ab4a-9027d53ae433-kube-api-access-zc2x4\") pod \"auto-csr-approver-29565652-wcs4m\" (UID: \"31ec09cf-288d-4ac1-ab4a-9027d53ae433\") " pod="openshift-infra/auto-csr-approver-29565652-wcs4m" Mar 19 16:52:00 crc kubenswrapper[4918]: I0319 16:52:00.324347 4918 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zc2x4\" (UniqueName: \"kubernetes.io/projected/31ec09cf-288d-4ac1-ab4a-9027d53ae433-kube-api-access-zc2x4\") pod \"auto-csr-approver-29565652-wcs4m\" (UID: \"31ec09cf-288d-4ac1-ab4a-9027d53ae433\") " pod="openshift-infra/auto-csr-approver-29565652-wcs4m" Mar 19 16:52:00 crc kubenswrapper[4918]: I0319 16:52:00.455026 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565652-wcs4m" Mar 19 16:52:01 crc kubenswrapper[4918]: I0319 16:52:01.215657 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565652-wcs4m"] Mar 19 16:52:02 crc kubenswrapper[4918]: W0319 16:52:02.550710 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31ec09cf_288d_4ac1_ab4a_9027d53ae433.slice/crio-4d8e40d021538fd749ffa988077af19335ff82ecd92e5fce92a8d8f0d1a33706 WatchSource:0}: Error finding container 4d8e40d021538fd749ffa988077af19335ff82ecd92e5fce92a8d8f0d1a33706: Status 404 returned error can't find the container with id 4d8e40d021538fd749ffa988077af19335ff82ecd92e5fce92a8d8f0d1a33706 Mar 19 16:52:03 crc kubenswrapper[4918]: I0319 16:52:03.416658 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565652-wcs4m" event={"ID":"31ec09cf-288d-4ac1-ab4a-9027d53ae433","Type":"ContainerStarted","Data":"4d8e40d021538fd749ffa988077af19335ff82ecd92e5fce92a8d8f0d1a33706"} Mar 19 16:52:04 crc kubenswrapper[4918]: I0319 16:52:04.429969 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565652-wcs4m" event={"ID":"31ec09cf-288d-4ac1-ab4a-9027d53ae433","Type":"ContainerStarted","Data":"88237b91668b89f0a6d53af7b3d7c0223f45c777225b06d07a111ac2339f6426"} Mar 19 16:52:04 crc kubenswrapper[4918]: I0319 16:52:04.432344 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-webhook-687f57d79b-6frc2" event={"ID":"e7570f30-edef-4862-bc00-e72ce89ff640","Type":"ContainerStarted","Data":"23745c7aa0759e55bc8411266115d28c6d184ccee80a67ae76669250c25df911"} Mar 19 16:52:04 crc kubenswrapper[4918]: I0319 16:52:04.432421 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-6frc2" Mar 19 16:52:04 crc kubenswrapper[4918]: I0319 16:52:04.433875 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-4fns8" event={"ID":"308372f5-f777-42d6-a57d-76ea6928f45e","Type":"ContainerStarted","Data":"64d14a0e5666f32342900c373e0a1965ebfe03429addff5d12b85224f269ffd8"} Mar 19 16:52:04 crc kubenswrapper[4918]: I0319 16:52:04.444929 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565652-wcs4m" podStartSLOduration=3.242065045 podStartE2EDuration="4.444914584s" podCreationTimestamp="2026-03-19 16:52:00 +0000 UTC" firstStartedPulling="2026-03-19 16:52:02.657717303 +0000 UTC m=+734.779916551" lastFinishedPulling="2026-03-19 16:52:03.860566842 +0000 UTC m=+735.982766090" observedRunningTime="2026-03-19 16:52:04.443393163 +0000 UTC m=+736.565592411" watchObservedRunningTime="2026-03-19 16:52:04.444914584 +0000 UTC m=+736.567113832" Mar 19 16:52:04 crc kubenswrapper[4918]: I0319 16:52:04.476900 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-4fns8" podStartSLOduration=3.261756895 podStartE2EDuration="8.476885588s" podCreationTimestamp="2026-03-19 16:51:56 +0000 UTC" firstStartedPulling="2026-03-19 16:51:58.644273147 +0000 UTC m=+730.766472395" lastFinishedPulling="2026-03-19 16:52:03.85940184 +0000 UTC m=+735.981601088" observedRunningTime="2026-03-19 16:52:04.458977268 +0000 UTC m=+736.581176516" watchObservedRunningTime="2026-03-19 16:52:04.476885588 +0000 UTC m=+736.599084836" Mar 
19 16:52:05 crc kubenswrapper[4918]: I0319 16:52:05.443481 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-nrk5x" event={"ID":"acfafa9a-b5e2-4385-8820-dcc726d38058","Type":"ContainerStarted","Data":"821390a7ea9ed64b960a36701695e5d0e8b3dec7b78c7017a996dde9695b6133"} Mar 19 16:52:05 crc kubenswrapper[4918]: I0319 16:52:05.445652 4918 generic.go:334] "Generic (PLEG): container finished" podID="31ec09cf-288d-4ac1-ab4a-9027d53ae433" containerID="88237b91668b89f0a6d53af7b3d7c0223f45c777225b06d07a111ac2339f6426" exitCode=0 Mar 19 16:52:05 crc kubenswrapper[4918]: I0319 16:52:05.445704 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565652-wcs4m" event={"ID":"31ec09cf-288d-4ac1-ab4a-9027d53ae433","Type":"ContainerDied","Data":"88237b91668b89f0a6d53af7b3d7c0223f45c777225b06d07a111ac2339f6426"} Mar 19 16:52:05 crc kubenswrapper[4918]: I0319 16:52:05.464668 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-6frc2" podStartSLOduration=4.395870116 podStartE2EDuration="9.464650868s" podCreationTimestamp="2026-03-19 16:51:56 +0000 UTC" firstStartedPulling="2026-03-19 16:51:58.700250547 +0000 UTC m=+730.822449795" lastFinishedPulling="2026-03-19 16:52:03.769031299 +0000 UTC m=+735.891230547" observedRunningTime="2026-03-19 16:52:04.478005029 +0000 UTC m=+736.600204277" watchObservedRunningTime="2026-03-19 16:52:05.464650868 +0000 UTC m=+737.586850126" Mar 19 16:52:05 crc kubenswrapper[4918]: I0319 16:52:05.469203 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-nrk5x" podStartSLOduration=2.816432514 podStartE2EDuration="9.469186132s" podCreationTimestamp="2026-03-19 16:51:56 +0000 UTC" firstStartedPulling="2026-03-19 16:51:58.626644286 +0000 UTC m=+730.748843534" lastFinishedPulling="2026-03-19 16:52:05.279397894 +0000 UTC m=+737.401597152" 
observedRunningTime="2026-03-19 16:52:05.460393151 +0000 UTC m=+737.582592409" watchObservedRunningTime="2026-03-19 16:52:05.469186132 +0000 UTC m=+737.591385390" Mar 19 16:52:06 crc kubenswrapper[4918]: I0319 16:52:06.664174 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565652-wcs4m" Mar 19 16:52:06 crc kubenswrapper[4918]: I0319 16:52:06.773225 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc2x4\" (UniqueName: \"kubernetes.io/projected/31ec09cf-288d-4ac1-ab4a-9027d53ae433-kube-api-access-zc2x4\") pod \"31ec09cf-288d-4ac1-ab4a-9027d53ae433\" (UID: \"31ec09cf-288d-4ac1-ab4a-9027d53ae433\") " Mar 19 16:52:06 crc kubenswrapper[4918]: I0319 16:52:06.778486 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31ec09cf-288d-4ac1-ab4a-9027d53ae433-kube-api-access-zc2x4" (OuterVolumeSpecName: "kube-api-access-zc2x4") pod "31ec09cf-288d-4ac1-ab4a-9027d53ae433" (UID: "31ec09cf-288d-4ac1-ab4a-9027d53ae433"). InnerVolumeSpecName "kube-api-access-zc2x4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:52:06 crc kubenswrapper[4918]: I0319 16:52:06.874353 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc2x4\" (UniqueName: \"kubernetes.io/projected/31ec09cf-288d-4ac1-ab4a-9027d53ae433-kube-api-access-zc2x4\") on node \"crc\" DevicePath \"\"" Mar 19 16:52:07 crc kubenswrapper[4918]: I0319 16:52:07.486809 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565652-wcs4m" event={"ID":"31ec09cf-288d-4ac1-ab4a-9027d53ae433","Type":"ContainerDied","Data":"4d8e40d021538fd749ffa988077af19335ff82ecd92e5fce92a8d8f0d1a33706"} Mar 19 16:52:07 crc kubenswrapper[4918]: I0319 16:52:07.486882 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d8e40d021538fd749ffa988077af19335ff82ecd92e5fce92a8d8f0d1a33706" Mar 19 16:52:07 crc kubenswrapper[4918]: I0319 16:52:07.486976 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565652-wcs4m" Mar 19 16:52:07 crc kubenswrapper[4918]: I0319 16:52:07.515731 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565646-fxttn"] Mar 19 16:52:07 crc kubenswrapper[4918]: I0319 16:52:07.520839 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565646-fxttn"] Mar 19 16:52:08 crc kubenswrapper[4918]: I0319 16:52:08.595464 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6426340-a102-45d1-b1d8-0b347430c764" path="/var/lib/kubelet/pods/a6426340-a102-45d1-b1d8-0b347430c764/volumes" Mar 19 16:52:09 crc kubenswrapper[4918]: I0319 16:52:09.124857 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gtxsp"] Mar 19 16:52:09 crc kubenswrapper[4918]: E0319 16:52:09.125617 4918 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="31ec09cf-288d-4ac1-ab4a-9027d53ae433" containerName="oc" Mar 19 16:52:09 crc kubenswrapper[4918]: I0319 16:52:09.125659 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="31ec09cf-288d-4ac1-ab4a-9027d53ae433" containerName="oc" Mar 19 16:52:09 crc kubenswrapper[4918]: I0319 16:52:09.125978 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="31ec09cf-288d-4ac1-ab4a-9027d53ae433" containerName="oc" Mar 19 16:52:09 crc kubenswrapper[4918]: I0319 16:52:09.127589 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gtxsp" Mar 19 16:52:09 crc kubenswrapper[4918]: I0319 16:52:09.144613 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gtxsp"] Mar 19 16:52:09 crc kubenswrapper[4918]: I0319 16:52:09.207035 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e-catalog-content\") pod \"redhat-operators-gtxsp\" (UID: \"c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e\") " pod="openshift-marketplace/redhat-operators-gtxsp" Mar 19 16:52:09 crc kubenswrapper[4918]: I0319 16:52:09.207210 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cts7x\" (UniqueName: \"kubernetes.io/projected/c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e-kube-api-access-cts7x\") pod \"redhat-operators-gtxsp\" (UID: \"c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e\") " pod="openshift-marketplace/redhat-operators-gtxsp" Mar 19 16:52:09 crc kubenswrapper[4918]: I0319 16:52:09.207266 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e-utilities\") pod \"redhat-operators-gtxsp\" (UID: \"c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e\") " 
pod="openshift-marketplace/redhat-operators-gtxsp" Mar 19 16:52:09 crc kubenswrapper[4918]: I0319 16:52:09.308850 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e-catalog-content\") pod \"redhat-operators-gtxsp\" (UID: \"c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e\") " pod="openshift-marketplace/redhat-operators-gtxsp" Mar 19 16:52:09 crc kubenswrapper[4918]: I0319 16:52:09.309033 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cts7x\" (UniqueName: \"kubernetes.io/projected/c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e-kube-api-access-cts7x\") pod \"redhat-operators-gtxsp\" (UID: \"c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e\") " pod="openshift-marketplace/redhat-operators-gtxsp" Mar 19 16:52:09 crc kubenswrapper[4918]: I0319 16:52:09.309144 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e-utilities\") pod \"redhat-operators-gtxsp\" (UID: \"c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e\") " pod="openshift-marketplace/redhat-operators-gtxsp" Mar 19 16:52:09 crc kubenswrapper[4918]: I0319 16:52:09.309315 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e-catalog-content\") pod \"redhat-operators-gtxsp\" (UID: \"c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e\") " pod="openshift-marketplace/redhat-operators-gtxsp" Mar 19 16:52:09 crc kubenswrapper[4918]: I0319 16:52:09.309923 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e-utilities\") pod \"redhat-operators-gtxsp\" (UID: \"c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e\") " pod="openshift-marketplace/redhat-operators-gtxsp" Mar 19 16:52:09 crc 
kubenswrapper[4918]: I0319 16:52:09.333469 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cts7x\" (UniqueName: \"kubernetes.io/projected/c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e-kube-api-access-cts7x\") pod \"redhat-operators-gtxsp\" (UID: \"c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e\") " pod="openshift-marketplace/redhat-operators-gtxsp" Mar 19 16:52:09 crc kubenswrapper[4918]: I0319 16:52:09.460004 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gtxsp" Mar 19 16:52:09 crc kubenswrapper[4918]: I0319 16:52:09.676478 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gtxsp"] Mar 19 16:52:09 crc kubenswrapper[4918]: W0319 16:52:09.681110 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8c8a9ac_a39f_42c8_8d3b_57a6f98dbd6e.slice/crio-4805ea568f1a4efddd99b71827599fe001ee844239da31251f3bffba85d82baf WatchSource:0}: Error finding container 4805ea568f1a4efddd99b71827599fe001ee844239da31251f3bffba85d82baf: Status 404 returned error can't find the container with id 4805ea568f1a4efddd99b71827599fe001ee844239da31251f3bffba85d82baf Mar 19 16:52:10 crc kubenswrapper[4918]: I0319 16:52:10.510152 4918 generic.go:334] "Generic (PLEG): container finished" podID="c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e" containerID="27ce97592e296af9484336e2b43e7b3d068a5b7897328ec289a6ee3e5c9c1440" exitCode=0 Mar 19 16:52:10 crc kubenswrapper[4918]: I0319 16:52:10.510208 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gtxsp" event={"ID":"c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e","Type":"ContainerDied","Data":"27ce97592e296af9484336e2b43e7b3d068a5b7897328ec289a6ee3e5c9c1440"} Mar 19 16:52:10 crc kubenswrapper[4918]: I0319 16:52:10.510259 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-gtxsp" event={"ID":"c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e","Type":"ContainerStarted","Data":"4805ea568f1a4efddd99b71827599fe001ee844239da31251f3bffba85d82baf"} Mar 19 16:52:12 crc kubenswrapper[4918]: I0319 16:52:12.299374 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7t76p"] Mar 19 16:52:12 crc kubenswrapper[4918]: I0319 16:52:12.302114 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7t76p" Mar 19 16:52:12 crc kubenswrapper[4918]: I0319 16:52:12.314294 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7t76p"] Mar 19 16:52:12 crc kubenswrapper[4918]: I0319 16:52:12.943056 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca361601-de39-4563-9bcc-48aad3738c32-catalog-content\") pod \"certified-operators-7t76p\" (UID: \"ca361601-de39-4563-9bcc-48aad3738c32\") " pod="openshift-marketplace/certified-operators-7t76p" Mar 19 16:52:12 crc kubenswrapper[4918]: I0319 16:52:12.943113 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bgr5\" (UniqueName: \"kubernetes.io/projected/ca361601-de39-4563-9bcc-48aad3738c32-kube-api-access-2bgr5\") pod \"certified-operators-7t76p\" (UID: \"ca361601-de39-4563-9bcc-48aad3738c32\") " pod="openshift-marketplace/certified-operators-7t76p" Mar 19 16:52:12 crc kubenswrapper[4918]: I0319 16:52:12.943183 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca361601-de39-4563-9bcc-48aad3738c32-utilities\") pod \"certified-operators-7t76p\" (UID: \"ca361601-de39-4563-9bcc-48aad3738c32\") " pod="openshift-marketplace/certified-operators-7t76p" Mar 19 
16:52:13 crc kubenswrapper[4918]: I0319 16:52:13.050171 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bgr5\" (UniqueName: \"kubernetes.io/projected/ca361601-de39-4563-9bcc-48aad3738c32-kube-api-access-2bgr5\") pod \"certified-operators-7t76p\" (UID: \"ca361601-de39-4563-9bcc-48aad3738c32\") " pod="openshift-marketplace/certified-operators-7t76p" Mar 19 16:52:13 crc kubenswrapper[4918]: I0319 16:52:13.050310 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca361601-de39-4563-9bcc-48aad3738c32-utilities\") pod \"certified-operators-7t76p\" (UID: \"ca361601-de39-4563-9bcc-48aad3738c32\") " pod="openshift-marketplace/certified-operators-7t76p" Mar 19 16:52:13 crc kubenswrapper[4918]: I0319 16:52:13.050368 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca361601-de39-4563-9bcc-48aad3738c32-catalog-content\") pod \"certified-operators-7t76p\" (UID: \"ca361601-de39-4563-9bcc-48aad3738c32\") " pod="openshift-marketplace/certified-operators-7t76p" Mar 19 16:52:13 crc kubenswrapper[4918]: I0319 16:52:13.050886 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca361601-de39-4563-9bcc-48aad3738c32-catalog-content\") pod \"certified-operators-7t76p\" (UID: \"ca361601-de39-4563-9bcc-48aad3738c32\") " pod="openshift-marketplace/certified-operators-7t76p" Mar 19 16:52:13 crc kubenswrapper[4918]: I0319 16:52:13.051810 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca361601-de39-4563-9bcc-48aad3738c32-utilities\") pod \"certified-operators-7t76p\" (UID: \"ca361601-de39-4563-9bcc-48aad3738c32\") " pod="openshift-marketplace/certified-operators-7t76p" Mar 19 16:52:13 crc kubenswrapper[4918]: I0319 
16:52:13.088497 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bgr5\" (UniqueName: \"kubernetes.io/projected/ca361601-de39-4563-9bcc-48aad3738c32-kube-api-access-2bgr5\") pod \"certified-operators-7t76p\" (UID: \"ca361601-de39-4563-9bcc-48aad3738c32\") " pod="openshift-marketplace/certified-operators-7t76p" Mar 19 16:52:13 crc kubenswrapper[4918]: I0319 16:52:13.230036 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-6frc2" Mar 19 16:52:13 crc kubenswrapper[4918]: I0319 16:52:13.288471 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7t76p" Mar 19 16:52:13 crc kubenswrapper[4918]: I0319 16:52:13.519772 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7t76p"] Mar 19 16:52:13 crc kubenswrapper[4918]: W0319 16:52:13.528973 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca361601_de39_4563_9bcc_48aad3738c32.slice/crio-6139ca255859f05a8cf237e08c195f518740b539607dc6b1286ba51e4f5610bd WatchSource:0}: Error finding container 6139ca255859f05a8cf237e08c195f518740b539607dc6b1286ba51e4f5610bd: Status 404 returned error can't find the container with id 6139ca255859f05a8cf237e08c195f518740b539607dc6b1286ba51e4f5610bd Mar 19 16:52:13 crc kubenswrapper[4918]: I0319 16:52:13.986212 4918 generic.go:334] "Generic (PLEG): container finished" podID="ca361601-de39-4563-9bcc-48aad3738c32" containerID="5d68f839765958470f6b02c764f621bd5a1a4d9988e0c703e388546e4d6ba832" exitCode=0 Mar 19 16:52:13 crc kubenswrapper[4918]: I0319 16:52:13.986322 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7t76p" 
event={"ID":"ca361601-de39-4563-9bcc-48aad3738c32","Type":"ContainerDied","Data":"5d68f839765958470f6b02c764f621bd5a1a4d9988e0c703e388546e4d6ba832"} Mar 19 16:52:13 crc kubenswrapper[4918]: I0319 16:52:13.986495 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7t76p" event={"ID":"ca361601-de39-4563-9bcc-48aad3738c32","Type":"ContainerStarted","Data":"6139ca255859f05a8cf237e08c195f518740b539607dc6b1286ba51e4f5610bd"} Mar 19 16:52:13 crc kubenswrapper[4918]: I0319 16:52:13.988406 4918 generic.go:334] "Generic (PLEG): container finished" podID="c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e" containerID="ce85fc3848e7989140765edb17bf543249a124e7f6363825f1b97537215332d9" exitCode=0 Mar 19 16:52:13 crc kubenswrapper[4918]: I0319 16:52:13.988441 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gtxsp" event={"ID":"c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e","Type":"ContainerDied","Data":"ce85fc3848e7989140765edb17bf543249a124e7f6363825f1b97537215332d9"} Mar 19 16:52:14 crc kubenswrapper[4918]: I0319 16:52:14.995643 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gtxsp" event={"ID":"c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e","Type":"ContainerStarted","Data":"5d0124608642287b54131dd30122bba55c9a073407f17072ceb1186732c4811d"} Mar 19 16:52:14 crc kubenswrapper[4918]: I0319 16:52:14.997221 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7t76p" event={"ID":"ca361601-de39-4563-9bcc-48aad3738c32","Type":"ContainerStarted","Data":"3901ea7368056c999961420ab66cbad1f6c9dced7749c909410f0c36d3f769e8"} Mar 19 16:52:15 crc kubenswrapper[4918]: I0319 16:52:15.039071 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gtxsp" podStartSLOduration=2.181585686 podStartE2EDuration="6.039055128s" podCreationTimestamp="2026-03-19 16:52:09 +0000 UTC" 
firstStartedPulling="2026-03-19 16:52:10.511722026 +0000 UTC m=+742.633921274" lastFinishedPulling="2026-03-19 16:52:14.369191458 +0000 UTC m=+746.491390716" observedRunningTime="2026-03-19 16:52:15.027740719 +0000 UTC m=+747.149939967" watchObservedRunningTime="2026-03-19 16:52:15.039055128 +0000 UTC m=+747.161254376" Mar 19 16:52:16 crc kubenswrapper[4918]: I0319 16:52:16.007176 4918 generic.go:334] "Generic (PLEG): container finished" podID="ca361601-de39-4563-9bcc-48aad3738c32" containerID="3901ea7368056c999961420ab66cbad1f6c9dced7749c909410f0c36d3f769e8" exitCode=0 Mar 19 16:52:16 crc kubenswrapper[4918]: I0319 16:52:16.007236 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7t76p" event={"ID":"ca361601-de39-4563-9bcc-48aad3738c32","Type":"ContainerDied","Data":"3901ea7368056c999961420ab66cbad1f6c9dced7749c909410f0c36d3f769e8"} Mar 19 16:52:17 crc kubenswrapper[4918]: I0319 16:52:17.014846 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7t76p" event={"ID":"ca361601-de39-4563-9bcc-48aad3738c32","Type":"ContainerStarted","Data":"a18d7063accd9029f3bc2adb11087de6e8e872dd48b44dc98f4d1ed2dcb29db6"} Mar 19 16:52:17 crc kubenswrapper[4918]: I0319 16:52:17.041777 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7t76p" podStartSLOduration=2.526950887 podStartE2EDuration="5.041760571s" podCreationTimestamp="2026-03-19 16:52:12 +0000 UTC" firstStartedPulling="2026-03-19 16:52:13.987903082 +0000 UTC m=+746.110102340" lastFinishedPulling="2026-03-19 16:52:16.502712776 +0000 UTC m=+748.624912024" observedRunningTime="2026-03-19 16:52:17.039843649 +0000 UTC m=+749.162042897" watchObservedRunningTime="2026-03-19 16:52:17.041760571 +0000 UTC m=+749.163959809" Mar 19 16:52:19 crc kubenswrapper[4918]: I0319 16:52:19.461712 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-gtxsp" Mar 19 16:52:19 crc kubenswrapper[4918]: I0319 16:52:19.461806 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gtxsp" Mar 19 16:52:20 crc kubenswrapper[4918]: I0319 16:52:20.540059 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gtxsp" podUID="c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e" containerName="registry-server" probeResult="failure" output=< Mar 19 16:52:20 crc kubenswrapper[4918]: timeout: failed to connect service ":50051" within 1s Mar 19 16:52:20 crc kubenswrapper[4918]: > Mar 19 16:52:23 crc kubenswrapper[4918]: I0319 16:52:23.288727 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7t76p" Mar 19 16:52:23 crc kubenswrapper[4918]: I0319 16:52:23.289231 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7t76p" Mar 19 16:52:23 crc kubenswrapper[4918]: I0319 16:52:23.345781 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7t76p" Mar 19 16:52:24 crc kubenswrapper[4918]: I0319 16:52:24.113156 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7t76p" Mar 19 16:52:24 crc kubenswrapper[4918]: I0319 16:52:24.157235 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7t76p"] Mar 19 16:52:26 crc kubenswrapper[4918]: I0319 16:52:26.082422 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7t76p" podUID="ca361601-de39-4563-9bcc-48aad3738c32" containerName="registry-server" containerID="cri-o://a18d7063accd9029f3bc2adb11087de6e8e872dd48b44dc98f4d1ed2dcb29db6" gracePeriod=2 Mar 19 16:52:26 crc kubenswrapper[4918]: 
I0319 16:52:26.500228 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7t76p" Mar 19 16:52:26 crc kubenswrapper[4918]: I0319 16:52:26.639566 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca361601-de39-4563-9bcc-48aad3738c32-catalog-content\") pod \"ca361601-de39-4563-9bcc-48aad3738c32\" (UID: \"ca361601-de39-4563-9bcc-48aad3738c32\") " Mar 19 16:52:26 crc kubenswrapper[4918]: I0319 16:52:26.640325 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bgr5\" (UniqueName: \"kubernetes.io/projected/ca361601-de39-4563-9bcc-48aad3738c32-kube-api-access-2bgr5\") pod \"ca361601-de39-4563-9bcc-48aad3738c32\" (UID: \"ca361601-de39-4563-9bcc-48aad3738c32\") " Mar 19 16:52:26 crc kubenswrapper[4918]: I0319 16:52:26.640667 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca361601-de39-4563-9bcc-48aad3738c32-utilities\") pod \"ca361601-de39-4563-9bcc-48aad3738c32\" (UID: \"ca361601-de39-4563-9bcc-48aad3738c32\") " Mar 19 16:52:26 crc kubenswrapper[4918]: I0319 16:52:26.641954 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca361601-de39-4563-9bcc-48aad3738c32-utilities" (OuterVolumeSpecName: "utilities") pod "ca361601-de39-4563-9bcc-48aad3738c32" (UID: "ca361601-de39-4563-9bcc-48aad3738c32"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:52:26 crc kubenswrapper[4918]: I0319 16:52:26.650133 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca361601-de39-4563-9bcc-48aad3738c32-kube-api-access-2bgr5" (OuterVolumeSpecName: "kube-api-access-2bgr5") pod "ca361601-de39-4563-9bcc-48aad3738c32" (UID: "ca361601-de39-4563-9bcc-48aad3738c32"). InnerVolumeSpecName "kube-api-access-2bgr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:52:26 crc kubenswrapper[4918]: I0319 16:52:26.698945 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca361601-de39-4563-9bcc-48aad3738c32-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca361601-de39-4563-9bcc-48aad3738c32" (UID: "ca361601-de39-4563-9bcc-48aad3738c32"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:52:26 crc kubenswrapper[4918]: I0319 16:52:26.742210 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca361601-de39-4563-9bcc-48aad3738c32-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 16:52:26 crc kubenswrapper[4918]: I0319 16:52:26.742239 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca361601-de39-4563-9bcc-48aad3738c32-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 16:52:26 crc kubenswrapper[4918]: I0319 16:52:26.742253 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bgr5\" (UniqueName: \"kubernetes.io/projected/ca361601-de39-4563-9bcc-48aad3738c32-kube-api-access-2bgr5\") on node \"crc\" DevicePath \"\"" Mar 19 16:52:27 crc kubenswrapper[4918]: I0319 16:52:27.092461 4918 generic.go:334] "Generic (PLEG): container finished" podID="ca361601-de39-4563-9bcc-48aad3738c32" 
containerID="a18d7063accd9029f3bc2adb11087de6e8e872dd48b44dc98f4d1ed2dcb29db6" exitCode=0 Mar 19 16:52:27 crc kubenswrapper[4918]: I0319 16:52:27.092559 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7t76p" event={"ID":"ca361601-de39-4563-9bcc-48aad3738c32","Type":"ContainerDied","Data":"a18d7063accd9029f3bc2adb11087de6e8e872dd48b44dc98f4d1ed2dcb29db6"} Mar 19 16:52:27 crc kubenswrapper[4918]: I0319 16:52:27.092584 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7t76p" Mar 19 16:52:27 crc kubenswrapper[4918]: I0319 16:52:27.092607 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7t76p" event={"ID":"ca361601-de39-4563-9bcc-48aad3738c32","Type":"ContainerDied","Data":"6139ca255859f05a8cf237e08c195f518740b539607dc6b1286ba51e4f5610bd"} Mar 19 16:52:27 crc kubenswrapper[4918]: I0319 16:52:27.092642 4918 scope.go:117] "RemoveContainer" containerID="a18d7063accd9029f3bc2adb11087de6e8e872dd48b44dc98f4d1ed2dcb29db6" Mar 19 16:52:27 crc kubenswrapper[4918]: I0319 16:52:27.126381 4918 scope.go:117] "RemoveContainer" containerID="3901ea7368056c999961420ab66cbad1f6c9dced7749c909410f0c36d3f769e8" Mar 19 16:52:27 crc kubenswrapper[4918]: I0319 16:52:27.134008 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7t76p"] Mar 19 16:52:27 crc kubenswrapper[4918]: I0319 16:52:27.143066 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7t76p"] Mar 19 16:52:27 crc kubenswrapper[4918]: I0319 16:52:27.161610 4918 scope.go:117] "RemoveContainer" containerID="5d68f839765958470f6b02c764f621bd5a1a4d9988e0c703e388546e4d6ba832" Mar 19 16:52:27 crc kubenswrapper[4918]: I0319 16:52:27.185424 4918 scope.go:117] "RemoveContainer" containerID="a18d7063accd9029f3bc2adb11087de6e8e872dd48b44dc98f4d1ed2dcb29db6" Mar 19 
16:52:27 crc kubenswrapper[4918]: E0319 16:52:27.185861 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a18d7063accd9029f3bc2adb11087de6e8e872dd48b44dc98f4d1ed2dcb29db6\": container with ID starting with a18d7063accd9029f3bc2adb11087de6e8e872dd48b44dc98f4d1ed2dcb29db6 not found: ID does not exist" containerID="a18d7063accd9029f3bc2adb11087de6e8e872dd48b44dc98f4d1ed2dcb29db6" Mar 19 16:52:27 crc kubenswrapper[4918]: I0319 16:52:27.185915 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a18d7063accd9029f3bc2adb11087de6e8e872dd48b44dc98f4d1ed2dcb29db6"} err="failed to get container status \"a18d7063accd9029f3bc2adb11087de6e8e872dd48b44dc98f4d1ed2dcb29db6\": rpc error: code = NotFound desc = could not find container \"a18d7063accd9029f3bc2adb11087de6e8e872dd48b44dc98f4d1ed2dcb29db6\": container with ID starting with a18d7063accd9029f3bc2adb11087de6e8e872dd48b44dc98f4d1ed2dcb29db6 not found: ID does not exist" Mar 19 16:52:27 crc kubenswrapper[4918]: I0319 16:52:27.186118 4918 scope.go:117] "RemoveContainer" containerID="3901ea7368056c999961420ab66cbad1f6c9dced7749c909410f0c36d3f769e8" Mar 19 16:52:27 crc kubenswrapper[4918]: E0319 16:52:27.186419 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3901ea7368056c999961420ab66cbad1f6c9dced7749c909410f0c36d3f769e8\": container with ID starting with 3901ea7368056c999961420ab66cbad1f6c9dced7749c909410f0c36d3f769e8 not found: ID does not exist" containerID="3901ea7368056c999961420ab66cbad1f6c9dced7749c909410f0c36d3f769e8" Mar 19 16:52:27 crc kubenswrapper[4918]: I0319 16:52:27.186473 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3901ea7368056c999961420ab66cbad1f6c9dced7749c909410f0c36d3f769e8"} err="failed to get container status 
\"3901ea7368056c999961420ab66cbad1f6c9dced7749c909410f0c36d3f769e8\": rpc error: code = NotFound desc = could not find container \"3901ea7368056c999961420ab66cbad1f6c9dced7749c909410f0c36d3f769e8\": container with ID starting with 3901ea7368056c999961420ab66cbad1f6c9dced7749c909410f0c36d3f769e8 not found: ID does not exist" Mar 19 16:52:27 crc kubenswrapper[4918]: I0319 16:52:27.186515 4918 scope.go:117] "RemoveContainer" containerID="5d68f839765958470f6b02c764f621bd5a1a4d9988e0c703e388546e4d6ba832" Mar 19 16:52:27 crc kubenswrapper[4918]: E0319 16:52:27.186970 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d68f839765958470f6b02c764f621bd5a1a4d9988e0c703e388546e4d6ba832\": container with ID starting with 5d68f839765958470f6b02c764f621bd5a1a4d9988e0c703e388546e4d6ba832 not found: ID does not exist" containerID="5d68f839765958470f6b02c764f621bd5a1a4d9988e0c703e388546e4d6ba832" Mar 19 16:52:27 crc kubenswrapper[4918]: I0319 16:52:27.186999 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d68f839765958470f6b02c764f621bd5a1a4d9988e0c703e388546e4d6ba832"} err="failed to get container status \"5d68f839765958470f6b02c764f621bd5a1a4d9988e0c703e388546e4d6ba832\": rpc error: code = NotFound desc = could not find container \"5d68f839765958470f6b02c764f621bd5a1a4d9988e0c703e388546e4d6ba832\": container with ID starting with 5d68f839765958470f6b02c764f621bd5a1a4d9988e0c703e388546e4d6ba832 not found: ID does not exist" Mar 19 16:52:28 crc kubenswrapper[4918]: I0319 16:52:28.602772 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca361601-de39-4563-9bcc-48aad3738c32" path="/var/lib/kubelet/pods/ca361601-de39-4563-9bcc-48aad3738c32/volumes" Mar 19 16:52:28 crc kubenswrapper[4918]: I0319 16:52:28.996626 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fdpcq"] Mar 19 16:52:28 
crc kubenswrapper[4918]: E0319 16:52:28.996903 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca361601-de39-4563-9bcc-48aad3738c32" containerName="registry-server" Mar 19 16:52:28 crc kubenswrapper[4918]: I0319 16:52:28.996920 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca361601-de39-4563-9bcc-48aad3738c32" containerName="registry-server" Mar 19 16:52:28 crc kubenswrapper[4918]: E0319 16:52:28.996934 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca361601-de39-4563-9bcc-48aad3738c32" containerName="extract-utilities" Mar 19 16:52:28 crc kubenswrapper[4918]: I0319 16:52:28.996942 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca361601-de39-4563-9bcc-48aad3738c32" containerName="extract-utilities" Mar 19 16:52:28 crc kubenswrapper[4918]: E0319 16:52:28.996966 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca361601-de39-4563-9bcc-48aad3738c32" containerName="extract-content" Mar 19 16:52:28 crc kubenswrapper[4918]: I0319 16:52:28.996974 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca361601-de39-4563-9bcc-48aad3738c32" containerName="extract-content" Mar 19 16:52:28 crc kubenswrapper[4918]: I0319 16:52:28.997087 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca361601-de39-4563-9bcc-48aad3738c32" containerName="registry-server" Mar 19 16:52:28 crc kubenswrapper[4918]: I0319 16:52:28.998044 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fdpcq" Mar 19 16:52:29 crc kubenswrapper[4918]: I0319 16:52:29.020821 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fdpcq"] Mar 19 16:52:29 crc kubenswrapper[4918]: I0319 16:52:29.177755 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ec66dca-e06d-4ebc-9b41-64cfa3b2c510-utilities\") pod \"community-operators-fdpcq\" (UID: \"6ec66dca-e06d-4ebc-9b41-64cfa3b2c510\") " pod="openshift-marketplace/community-operators-fdpcq" Mar 19 16:52:29 crc kubenswrapper[4918]: I0319 16:52:29.177807 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ec66dca-e06d-4ebc-9b41-64cfa3b2c510-catalog-content\") pod \"community-operators-fdpcq\" (UID: \"6ec66dca-e06d-4ebc-9b41-64cfa3b2c510\") " pod="openshift-marketplace/community-operators-fdpcq" Mar 19 16:52:29 crc kubenswrapper[4918]: I0319 16:52:29.178026 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2q52\" (UniqueName: \"kubernetes.io/projected/6ec66dca-e06d-4ebc-9b41-64cfa3b2c510-kube-api-access-w2q52\") pod \"community-operators-fdpcq\" (UID: \"6ec66dca-e06d-4ebc-9b41-64cfa3b2c510\") " pod="openshift-marketplace/community-operators-fdpcq" Mar 19 16:52:29 crc kubenswrapper[4918]: I0319 16:52:29.279820 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2q52\" (UniqueName: \"kubernetes.io/projected/6ec66dca-e06d-4ebc-9b41-64cfa3b2c510-kube-api-access-w2q52\") pod \"community-operators-fdpcq\" (UID: \"6ec66dca-e06d-4ebc-9b41-64cfa3b2c510\") " pod="openshift-marketplace/community-operators-fdpcq" Mar 19 16:52:29 crc kubenswrapper[4918]: I0319 16:52:29.279909 4918 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ec66dca-e06d-4ebc-9b41-64cfa3b2c510-utilities\") pod \"community-operators-fdpcq\" (UID: \"6ec66dca-e06d-4ebc-9b41-64cfa3b2c510\") " pod="openshift-marketplace/community-operators-fdpcq" Mar 19 16:52:29 crc kubenswrapper[4918]: I0319 16:52:29.279924 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ec66dca-e06d-4ebc-9b41-64cfa3b2c510-catalog-content\") pod \"community-operators-fdpcq\" (UID: \"6ec66dca-e06d-4ebc-9b41-64cfa3b2c510\") " pod="openshift-marketplace/community-operators-fdpcq" Mar 19 16:52:29 crc kubenswrapper[4918]: I0319 16:52:29.280296 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ec66dca-e06d-4ebc-9b41-64cfa3b2c510-catalog-content\") pod \"community-operators-fdpcq\" (UID: \"6ec66dca-e06d-4ebc-9b41-64cfa3b2c510\") " pod="openshift-marketplace/community-operators-fdpcq" Mar 19 16:52:29 crc kubenswrapper[4918]: I0319 16:52:29.280404 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ec66dca-e06d-4ebc-9b41-64cfa3b2c510-utilities\") pod \"community-operators-fdpcq\" (UID: \"6ec66dca-e06d-4ebc-9b41-64cfa3b2c510\") " pod="openshift-marketplace/community-operators-fdpcq" Mar 19 16:52:29 crc kubenswrapper[4918]: I0319 16:52:29.302571 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2q52\" (UniqueName: \"kubernetes.io/projected/6ec66dca-e06d-4ebc-9b41-64cfa3b2c510-kube-api-access-w2q52\") pod \"community-operators-fdpcq\" (UID: \"6ec66dca-e06d-4ebc-9b41-64cfa3b2c510\") " pod="openshift-marketplace/community-operators-fdpcq" Mar 19 16:52:29 crc kubenswrapper[4918]: I0319 16:52:29.332076 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fdpcq" Mar 19 16:52:29 crc kubenswrapper[4918]: I0319 16:52:29.522580 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gtxsp" Mar 19 16:52:29 crc kubenswrapper[4918]: I0319 16:52:29.577948 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gtxsp" Mar 19 16:52:29 crc kubenswrapper[4918]: I0319 16:52:29.871895 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fdpcq"] Mar 19 16:52:30 crc kubenswrapper[4918]: I0319 16:52:30.117144 4918 generic.go:334] "Generic (PLEG): container finished" podID="6ec66dca-e06d-4ebc-9b41-64cfa3b2c510" containerID="49358f3e361dcd1fa74fdb4ee5235477d5907700684cb4116445cbb8a2afb497" exitCode=0 Mar 19 16:52:30 crc kubenswrapper[4918]: I0319 16:52:30.117226 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdpcq" event={"ID":"6ec66dca-e06d-4ebc-9b41-64cfa3b2c510","Type":"ContainerDied","Data":"49358f3e361dcd1fa74fdb4ee5235477d5907700684cb4116445cbb8a2afb497"} Mar 19 16:52:30 crc kubenswrapper[4918]: I0319 16:52:30.117299 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdpcq" event={"ID":"6ec66dca-e06d-4ebc-9b41-64cfa3b2c510","Type":"ContainerStarted","Data":"d6bca133288a55d2625d5b846135b0e498e32c6abf21729f158631977c3a9eba"} Mar 19 16:52:31 crc kubenswrapper[4918]: I0319 16:52:31.125425 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdpcq" event={"ID":"6ec66dca-e06d-4ebc-9b41-64cfa3b2c510","Type":"ContainerStarted","Data":"66a562ce1ab6041381adee464bb5bcea6df83631220a6667a01b4c9961783b67"} Mar 19 16:52:31 crc kubenswrapper[4918]: I0319 16:52:31.792788 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-gtxsp"] Mar 19 16:52:31 crc kubenswrapper[4918]: I0319 16:52:31.793058 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gtxsp" podUID="c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e" containerName="registry-server" containerID="cri-o://5d0124608642287b54131dd30122bba55c9a073407f17072ceb1186732c4811d" gracePeriod=2 Mar 19 16:52:32 crc kubenswrapper[4918]: I0319 16:52:32.141141 4918 generic.go:334] "Generic (PLEG): container finished" podID="c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e" containerID="5d0124608642287b54131dd30122bba55c9a073407f17072ceb1186732c4811d" exitCode=0 Mar 19 16:52:32 crc kubenswrapper[4918]: I0319 16:52:32.141304 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gtxsp" event={"ID":"c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e","Type":"ContainerDied","Data":"5d0124608642287b54131dd30122bba55c9a073407f17072ceb1186732c4811d"} Mar 19 16:52:32 crc kubenswrapper[4918]: I0319 16:52:32.145802 4918 generic.go:334] "Generic (PLEG): container finished" podID="6ec66dca-e06d-4ebc-9b41-64cfa3b2c510" containerID="66a562ce1ab6041381adee464bb5bcea6df83631220a6667a01b4c9961783b67" exitCode=0 Mar 19 16:52:32 crc kubenswrapper[4918]: I0319 16:52:32.145878 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdpcq" event={"ID":"6ec66dca-e06d-4ebc-9b41-64cfa3b2c510","Type":"ContainerDied","Data":"66a562ce1ab6041381adee464bb5bcea6df83631220a6667a01b4c9961783b67"} Mar 19 16:52:32 crc kubenswrapper[4918]: I0319 16:52:32.306980 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gtxsp" Mar 19 16:52:32 crc kubenswrapper[4918]: I0319 16:52:32.325858 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e-utilities\") pod \"c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e\" (UID: \"c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e\") " Mar 19 16:52:32 crc kubenswrapper[4918]: I0319 16:52:32.325972 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cts7x\" (UniqueName: \"kubernetes.io/projected/c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e-kube-api-access-cts7x\") pod \"c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e\" (UID: \"c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e\") " Mar 19 16:52:32 crc kubenswrapper[4918]: I0319 16:52:32.326021 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e-catalog-content\") pod \"c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e\" (UID: \"c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e\") " Mar 19 16:52:32 crc kubenswrapper[4918]: I0319 16:52:32.328923 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e-utilities" (OuterVolumeSpecName: "utilities") pod "c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e" (UID: "c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:52:32 crc kubenswrapper[4918]: I0319 16:52:32.353695 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e-kube-api-access-cts7x" (OuterVolumeSpecName: "kube-api-access-cts7x") pod "c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e" (UID: "c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e"). InnerVolumeSpecName "kube-api-access-cts7x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:52:32 crc kubenswrapper[4918]: I0319 16:52:32.427335 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 16:52:32 crc kubenswrapper[4918]: I0319 16:52:32.427390 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cts7x\" (UniqueName: \"kubernetes.io/projected/c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e-kube-api-access-cts7x\") on node \"crc\" DevicePath \"\"" Mar 19 16:52:32 crc kubenswrapper[4918]: I0319 16:52:32.467974 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e" (UID: "c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:52:32 crc kubenswrapper[4918]: I0319 16:52:32.528445 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 16:52:33 crc kubenswrapper[4918]: I0319 16:52:33.163874 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gtxsp" event={"ID":"c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e","Type":"ContainerDied","Data":"4805ea568f1a4efddd99b71827599fe001ee844239da31251f3bffba85d82baf"} Mar 19 16:52:33 crc kubenswrapper[4918]: I0319 16:52:33.163938 4918 scope.go:117] "RemoveContainer" containerID="5d0124608642287b54131dd30122bba55c9a073407f17072ceb1186732c4811d" Mar 19 16:52:33 crc kubenswrapper[4918]: I0319 16:52:33.164066 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gtxsp" Mar 19 16:52:33 crc kubenswrapper[4918]: I0319 16:52:33.169127 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdpcq" event={"ID":"6ec66dca-e06d-4ebc-9b41-64cfa3b2c510","Type":"ContainerStarted","Data":"690106b36f0a24a4f1d313c3758b659f836ed030d37d26aeac882ce678dd1fcd"} Mar 19 16:52:33 crc kubenswrapper[4918]: I0319 16:52:33.197440 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fdpcq" podStartSLOduration=2.752380682 podStartE2EDuration="5.197411394s" podCreationTimestamp="2026-03-19 16:52:28 +0000 UTC" firstStartedPulling="2026-03-19 16:52:30.119420159 +0000 UTC m=+762.241619417" lastFinishedPulling="2026-03-19 16:52:32.564450851 +0000 UTC m=+764.686650129" observedRunningTime="2026-03-19 16:52:33.193200839 +0000 UTC m=+765.315400097" watchObservedRunningTime="2026-03-19 16:52:33.197411394 +0000 UTC m=+765.319610682" Mar 19 16:52:33 crc kubenswrapper[4918]: I0319 16:52:33.199735 4918 scope.go:117] "RemoveContainer" containerID="ce85fc3848e7989140765edb17bf543249a124e7f6363825f1b97537215332d9" Mar 19 16:52:33 crc kubenswrapper[4918]: I0319 16:52:33.219480 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gtxsp"] Mar 19 16:52:33 crc kubenswrapper[4918]: I0319 16:52:33.226971 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gtxsp"] Mar 19 16:52:33 crc kubenswrapper[4918]: I0319 16:52:33.239090 4918 scope.go:117] "RemoveContainer" containerID="27ce97592e296af9484336e2b43e7b3d068a5b7897328ec289a6ee3e5c9c1440" Mar 19 16:52:34 crc kubenswrapper[4918]: I0319 16:52:34.200672 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vdlb4"] Mar 19 16:52:34 crc kubenswrapper[4918]: E0319 16:52:34.201108 4918 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e" containerName="extract-content" Mar 19 16:52:34 crc kubenswrapper[4918]: I0319 16:52:34.201131 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e" containerName="extract-content" Mar 19 16:52:34 crc kubenswrapper[4918]: E0319 16:52:34.201151 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e" containerName="registry-server" Mar 19 16:52:34 crc kubenswrapper[4918]: I0319 16:52:34.201160 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e" containerName="registry-server" Mar 19 16:52:34 crc kubenswrapper[4918]: E0319 16:52:34.201180 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e" containerName="extract-utilities" Mar 19 16:52:34 crc kubenswrapper[4918]: I0319 16:52:34.201189 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e" containerName="extract-utilities" Mar 19 16:52:34 crc kubenswrapper[4918]: I0319 16:52:34.201340 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e" containerName="registry-server" Mar 19 16:52:34 crc kubenswrapper[4918]: I0319 16:52:34.208256 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdlb4" Mar 19 16:52:34 crc kubenswrapper[4918]: I0319 16:52:34.209870 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdlb4"] Mar 19 16:52:34 crc kubenswrapper[4918]: I0319 16:52:34.352580 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9b003c0-f449-49ae-8ff2-882a0a350671-utilities\") pod \"redhat-marketplace-vdlb4\" (UID: \"e9b003c0-f449-49ae-8ff2-882a0a350671\") " pod="openshift-marketplace/redhat-marketplace-vdlb4" Mar 19 16:52:34 crc kubenswrapper[4918]: I0319 16:52:34.352893 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9b003c0-f449-49ae-8ff2-882a0a350671-catalog-content\") pod \"redhat-marketplace-vdlb4\" (UID: \"e9b003c0-f449-49ae-8ff2-882a0a350671\") " pod="openshift-marketplace/redhat-marketplace-vdlb4" Mar 19 16:52:34 crc kubenswrapper[4918]: I0319 16:52:34.353054 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc7fx\" (UniqueName: \"kubernetes.io/projected/e9b003c0-f449-49ae-8ff2-882a0a350671-kube-api-access-xc7fx\") pod \"redhat-marketplace-vdlb4\" (UID: \"e9b003c0-f449-49ae-8ff2-882a0a350671\") " pod="openshift-marketplace/redhat-marketplace-vdlb4" Mar 19 16:52:34 crc kubenswrapper[4918]: I0319 16:52:34.454554 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc7fx\" (UniqueName: \"kubernetes.io/projected/e9b003c0-f449-49ae-8ff2-882a0a350671-kube-api-access-xc7fx\") pod \"redhat-marketplace-vdlb4\" (UID: \"e9b003c0-f449-49ae-8ff2-882a0a350671\") " pod="openshift-marketplace/redhat-marketplace-vdlb4" Mar 19 16:52:34 crc kubenswrapper[4918]: I0319 16:52:34.454688 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9b003c0-f449-49ae-8ff2-882a0a350671-utilities\") pod \"redhat-marketplace-vdlb4\" (UID: \"e9b003c0-f449-49ae-8ff2-882a0a350671\") " pod="openshift-marketplace/redhat-marketplace-vdlb4" Mar 19 16:52:34 crc kubenswrapper[4918]: I0319 16:52:34.454733 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9b003c0-f449-49ae-8ff2-882a0a350671-catalog-content\") pod \"redhat-marketplace-vdlb4\" (UID: \"e9b003c0-f449-49ae-8ff2-882a0a350671\") " pod="openshift-marketplace/redhat-marketplace-vdlb4" Mar 19 16:52:34 crc kubenswrapper[4918]: I0319 16:52:34.455421 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9b003c0-f449-49ae-8ff2-882a0a350671-utilities\") pod \"redhat-marketplace-vdlb4\" (UID: \"e9b003c0-f449-49ae-8ff2-882a0a350671\") " pod="openshift-marketplace/redhat-marketplace-vdlb4" Mar 19 16:52:34 crc kubenswrapper[4918]: I0319 16:52:34.455486 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9b003c0-f449-49ae-8ff2-882a0a350671-catalog-content\") pod \"redhat-marketplace-vdlb4\" (UID: \"e9b003c0-f449-49ae-8ff2-882a0a350671\") " pod="openshift-marketplace/redhat-marketplace-vdlb4" Mar 19 16:52:34 crc kubenswrapper[4918]: I0319 16:52:34.482125 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc7fx\" (UniqueName: \"kubernetes.io/projected/e9b003c0-f449-49ae-8ff2-882a0a350671-kube-api-access-xc7fx\") pod \"redhat-marketplace-vdlb4\" (UID: \"e9b003c0-f449-49ae-8ff2-882a0a350671\") " pod="openshift-marketplace/redhat-marketplace-vdlb4" Mar 19 16:52:34 crc kubenswrapper[4918]: I0319 16:52:34.529405 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdlb4" Mar 19 16:52:34 crc kubenswrapper[4918]: I0319 16:52:34.599914 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e" path="/var/lib/kubelet/pods/c8c8a9ac-a39f-42c8-8d3b-57a6f98dbd6e/volumes" Mar 19 16:52:34 crc kubenswrapper[4918]: I0319 16:52:34.807809 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdlb4"] Mar 19 16:52:35 crc kubenswrapper[4918]: I0319 16:52:35.184619 4918 generic.go:334] "Generic (PLEG): container finished" podID="e9b003c0-f449-49ae-8ff2-882a0a350671" containerID="b8a673761a6b0c6a98d8e2ad1a4ed95b999baa6b2cd75a5e5e6fb6a8eb04e0f1" exitCode=0 Mar 19 16:52:35 crc kubenswrapper[4918]: I0319 16:52:35.184702 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdlb4" event={"ID":"e9b003c0-f449-49ae-8ff2-882a0a350671","Type":"ContainerDied","Data":"b8a673761a6b0c6a98d8e2ad1a4ed95b999baa6b2cd75a5e5e6fb6a8eb04e0f1"} Mar 19 16:52:35 crc kubenswrapper[4918]: I0319 16:52:35.185001 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdlb4" event={"ID":"e9b003c0-f449-49ae-8ff2-882a0a350671","Type":"ContainerStarted","Data":"f4109d194c7824adfa3bef139424fafb49efa821acd6510fb8a1ba3e91bbcdc6"} Mar 19 16:52:37 crc kubenswrapper[4918]: I0319 16:52:37.223610 4918 generic.go:334] "Generic (PLEG): container finished" podID="e9b003c0-f449-49ae-8ff2-882a0a350671" containerID="447c9b02714429c08ad5e316d200e311f24ac87d4bdccd5f8a7692e2406275c9" exitCode=0 Mar 19 16:52:37 crc kubenswrapper[4918]: I0319 16:52:37.223694 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdlb4" event={"ID":"e9b003c0-f449-49ae-8ff2-882a0a350671","Type":"ContainerDied","Data":"447c9b02714429c08ad5e316d200e311f24ac87d4bdccd5f8a7692e2406275c9"} Mar 19 16:52:38 crc 
kubenswrapper[4918]: I0319 16:52:38.236333 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdlb4" event={"ID":"e9b003c0-f449-49ae-8ff2-882a0a350671","Type":"ContainerStarted","Data":"e2420b9d880c1419b3a26ef4a96c14ec327d34596eaa8ccc5a13c87ec4f272c6"} Mar 19 16:52:38 crc kubenswrapper[4918]: I0319 16:52:38.268866 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vdlb4" podStartSLOduration=1.556671992 podStartE2EDuration="4.268843048s" podCreationTimestamp="2026-03-19 16:52:34 +0000 UTC" firstStartedPulling="2026-03-19 16:52:35.186334379 +0000 UTC m=+767.308533627" lastFinishedPulling="2026-03-19 16:52:37.898505425 +0000 UTC m=+770.020704683" observedRunningTime="2026-03-19 16:52:38.262890495 +0000 UTC m=+770.385089753" watchObservedRunningTime="2026-03-19 16:52:38.268843048 +0000 UTC m=+770.391042326" Mar 19 16:52:39 crc kubenswrapper[4918]: I0319 16:52:39.332623 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fdpcq" Mar 19 16:52:39 crc kubenswrapper[4918]: I0319 16:52:39.332879 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fdpcq" Mar 19 16:52:39 crc kubenswrapper[4918]: I0319 16:52:39.394690 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fdpcq" Mar 19 16:52:40 crc kubenswrapper[4918]: I0319 16:52:40.321780 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fdpcq" Mar 19 16:52:41 crc kubenswrapper[4918]: I0319 16:52:41.590040 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fdpcq"] Mar 19 16:52:42 crc kubenswrapper[4918]: I0319 16:52:42.264231 4918 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/community-operators-fdpcq" podUID="6ec66dca-e06d-4ebc-9b41-64cfa3b2c510" containerName="registry-server" containerID="cri-o://690106b36f0a24a4f1d313c3758b659f836ed030d37d26aeac882ce678dd1fcd" gracePeriod=2 Mar 19 16:52:42 crc kubenswrapper[4918]: I0319 16:52:42.622751 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fdpcq" Mar 19 16:52:42 crc kubenswrapper[4918]: I0319 16:52:42.774224 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ec66dca-e06d-4ebc-9b41-64cfa3b2c510-utilities\") pod \"6ec66dca-e06d-4ebc-9b41-64cfa3b2c510\" (UID: \"6ec66dca-e06d-4ebc-9b41-64cfa3b2c510\") " Mar 19 16:52:42 crc kubenswrapper[4918]: I0319 16:52:42.774389 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2q52\" (UniqueName: \"kubernetes.io/projected/6ec66dca-e06d-4ebc-9b41-64cfa3b2c510-kube-api-access-w2q52\") pod \"6ec66dca-e06d-4ebc-9b41-64cfa3b2c510\" (UID: \"6ec66dca-e06d-4ebc-9b41-64cfa3b2c510\") " Mar 19 16:52:42 crc kubenswrapper[4918]: I0319 16:52:42.774716 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ec66dca-e06d-4ebc-9b41-64cfa3b2c510-catalog-content\") pod \"6ec66dca-e06d-4ebc-9b41-64cfa3b2c510\" (UID: \"6ec66dca-e06d-4ebc-9b41-64cfa3b2c510\") " Mar 19 16:52:42 crc kubenswrapper[4918]: I0319 16:52:42.775626 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ec66dca-e06d-4ebc-9b41-64cfa3b2c510-utilities" (OuterVolumeSpecName: "utilities") pod "6ec66dca-e06d-4ebc-9b41-64cfa3b2c510" (UID: "6ec66dca-e06d-4ebc-9b41-64cfa3b2c510"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:52:42 crc kubenswrapper[4918]: I0319 16:52:42.784718 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ec66dca-e06d-4ebc-9b41-64cfa3b2c510-kube-api-access-w2q52" (OuterVolumeSpecName: "kube-api-access-w2q52") pod "6ec66dca-e06d-4ebc-9b41-64cfa3b2c510" (UID: "6ec66dca-e06d-4ebc-9b41-64cfa3b2c510"). InnerVolumeSpecName "kube-api-access-w2q52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:52:42 crc kubenswrapper[4918]: I0319 16:52:42.844973 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ec66dca-e06d-4ebc-9b41-64cfa3b2c510-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ec66dca-e06d-4ebc-9b41-64cfa3b2c510" (UID: "6ec66dca-e06d-4ebc-9b41-64cfa3b2c510"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:52:42 crc kubenswrapper[4918]: I0319 16:52:42.876110 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ec66dca-e06d-4ebc-9b41-64cfa3b2c510-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 16:52:42 crc kubenswrapper[4918]: I0319 16:52:42.876149 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ec66dca-e06d-4ebc-9b41-64cfa3b2c510-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 16:52:42 crc kubenswrapper[4918]: I0319 16:52:42.876162 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2q52\" (UniqueName: \"kubernetes.io/projected/6ec66dca-e06d-4ebc-9b41-64cfa3b2c510-kube-api-access-w2q52\") on node \"crc\" DevicePath \"\"" Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.040075 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7"] Mar 19 16:52:43 
crc kubenswrapper[4918]: E0319 16:52:43.040289 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec66dca-e06d-4ebc-9b41-64cfa3b2c510" containerName="extract-content" Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.040301 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec66dca-e06d-4ebc-9b41-64cfa3b2c510" containerName="extract-content" Mar 19 16:52:43 crc kubenswrapper[4918]: E0319 16:52:43.040312 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec66dca-e06d-4ebc-9b41-64cfa3b2c510" containerName="registry-server" Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.040318 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec66dca-e06d-4ebc-9b41-64cfa3b2c510" containerName="registry-server" Mar 19 16:52:43 crc kubenswrapper[4918]: E0319 16:52:43.040329 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec66dca-e06d-4ebc-9b41-64cfa3b2c510" containerName="extract-utilities" Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.040337 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec66dca-e06d-4ebc-9b41-64cfa3b2c510" containerName="extract-utilities" Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.040456 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ec66dca-e06d-4ebc-9b41-64cfa3b2c510" containerName="registry-server" Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.041291 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7" Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.043477 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.051799 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7"] Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.179701 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stbw7\" (UniqueName: \"kubernetes.io/projected/a4b4c7cb-18de-4527-a4b5-859f45243567-kube-api-access-stbw7\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7\" (UID: \"a4b4c7cb-18de-4527-a4b5-859f45243567\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7" Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.180067 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4b4c7cb-18de-4527-a4b5-859f45243567-bundle\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7\" (UID: \"a4b4c7cb-18de-4527-a4b5-859f45243567\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7" Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.180118 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4b4c7cb-18de-4527-a4b5-859f45243567-util\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7\" (UID: \"a4b4c7cb-18de-4527-a4b5-859f45243567\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7" Mar 19 16:52:43 crc kubenswrapper[4918]: 
I0319 16:52:43.271891 4918 generic.go:334] "Generic (PLEG): container finished" podID="6ec66dca-e06d-4ebc-9b41-64cfa3b2c510" containerID="690106b36f0a24a4f1d313c3758b659f836ed030d37d26aeac882ce678dd1fcd" exitCode=0 Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.271941 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdpcq" event={"ID":"6ec66dca-e06d-4ebc-9b41-64cfa3b2c510","Type":"ContainerDied","Data":"690106b36f0a24a4f1d313c3758b659f836ed030d37d26aeac882ce678dd1fcd"} Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.271978 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdpcq" event={"ID":"6ec66dca-e06d-4ebc-9b41-64cfa3b2c510","Type":"ContainerDied","Data":"d6bca133288a55d2625d5b846135b0e498e32c6abf21729f158631977c3a9eba"} Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.271978 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fdpcq" Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.271995 4918 scope.go:117] "RemoveContainer" containerID="690106b36f0a24a4f1d313c3758b659f836ed030d37d26aeac882ce678dd1fcd" Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.281265 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4b4c7cb-18de-4527-a4b5-859f45243567-bundle\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7\" (UID: \"a4b4c7cb-18de-4527-a4b5-859f45243567\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7" Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.281309 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4b4c7cb-18de-4527-a4b5-859f45243567-util\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7\" 
(UID: \"a4b4c7cb-18de-4527-a4b5-859f45243567\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7" Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.281384 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stbw7\" (UniqueName: \"kubernetes.io/projected/a4b4c7cb-18de-4527-a4b5-859f45243567-kube-api-access-stbw7\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7\" (UID: \"a4b4c7cb-18de-4527-a4b5-859f45243567\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7" Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.281809 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4b4c7cb-18de-4527-a4b5-859f45243567-bundle\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7\" (UID: \"a4b4c7cb-18de-4527-a4b5-859f45243567\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7" Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.281815 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4b4c7cb-18de-4527-a4b5-859f45243567-util\") pod \"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7\" (UID: \"a4b4c7cb-18de-4527-a4b5-859f45243567\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7" Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.290733 4918 scope.go:117] "RemoveContainer" containerID="66a562ce1ab6041381adee464bb5bcea6df83631220a6667a01b4c9961783b67" Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.311346 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stbw7\" (UniqueName: \"kubernetes.io/projected/a4b4c7cb-18de-4527-a4b5-859f45243567-kube-api-access-stbw7\") pod 
\"b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7\" (UID: \"a4b4c7cb-18de-4527-a4b5-859f45243567\") " pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7" Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.312055 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fdpcq"] Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.319868 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fdpcq"] Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.320468 4918 scope.go:117] "RemoveContainer" containerID="49358f3e361dcd1fa74fdb4ee5235477d5907700684cb4116445cbb8a2afb497" Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.332928 4918 scope.go:117] "RemoveContainer" containerID="690106b36f0a24a4f1d313c3758b659f836ed030d37d26aeac882ce678dd1fcd" Mar 19 16:52:43 crc kubenswrapper[4918]: E0319 16:52:43.333314 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"690106b36f0a24a4f1d313c3758b659f836ed030d37d26aeac882ce678dd1fcd\": container with ID starting with 690106b36f0a24a4f1d313c3758b659f836ed030d37d26aeac882ce678dd1fcd not found: ID does not exist" containerID="690106b36f0a24a4f1d313c3758b659f836ed030d37d26aeac882ce678dd1fcd" Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.333345 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"690106b36f0a24a4f1d313c3758b659f836ed030d37d26aeac882ce678dd1fcd"} err="failed to get container status \"690106b36f0a24a4f1d313c3758b659f836ed030d37d26aeac882ce678dd1fcd\": rpc error: code = NotFound desc = could not find container \"690106b36f0a24a4f1d313c3758b659f836ed030d37d26aeac882ce678dd1fcd\": container with ID starting with 690106b36f0a24a4f1d313c3758b659f836ed030d37d26aeac882ce678dd1fcd not found: ID does not exist" Mar 19 16:52:43 crc 
kubenswrapper[4918]: I0319 16:52:43.333364 4918 scope.go:117] "RemoveContainer" containerID="66a562ce1ab6041381adee464bb5bcea6df83631220a6667a01b4c9961783b67" Mar 19 16:52:43 crc kubenswrapper[4918]: E0319 16:52:43.334062 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66a562ce1ab6041381adee464bb5bcea6df83631220a6667a01b4c9961783b67\": container with ID starting with 66a562ce1ab6041381adee464bb5bcea6df83631220a6667a01b4c9961783b67 not found: ID does not exist" containerID="66a562ce1ab6041381adee464bb5bcea6df83631220a6667a01b4c9961783b67" Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.334088 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66a562ce1ab6041381adee464bb5bcea6df83631220a6667a01b4c9961783b67"} err="failed to get container status \"66a562ce1ab6041381adee464bb5bcea6df83631220a6667a01b4c9961783b67\": rpc error: code = NotFound desc = could not find container \"66a562ce1ab6041381adee464bb5bcea6df83631220a6667a01b4c9961783b67\": container with ID starting with 66a562ce1ab6041381adee464bb5bcea6df83631220a6667a01b4c9961783b67 not found: ID does not exist" Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.334104 4918 scope.go:117] "RemoveContainer" containerID="49358f3e361dcd1fa74fdb4ee5235477d5907700684cb4116445cbb8a2afb497" Mar 19 16:52:43 crc kubenswrapper[4918]: E0319 16:52:43.334377 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49358f3e361dcd1fa74fdb4ee5235477d5907700684cb4116445cbb8a2afb497\": container with ID starting with 49358f3e361dcd1fa74fdb4ee5235477d5907700684cb4116445cbb8a2afb497 not found: ID does not exist" containerID="49358f3e361dcd1fa74fdb4ee5235477d5907700684cb4116445cbb8a2afb497" Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.334401 4918 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"49358f3e361dcd1fa74fdb4ee5235477d5907700684cb4116445cbb8a2afb497"} err="failed to get container status \"49358f3e361dcd1fa74fdb4ee5235477d5907700684cb4116445cbb8a2afb497\": rpc error: code = NotFound desc = could not find container \"49358f3e361dcd1fa74fdb4ee5235477d5907700684cb4116445cbb8a2afb497\": container with ID starting with 49358f3e361dcd1fa74fdb4ee5235477d5907700684cb4116445cbb8a2afb497 not found: ID does not exist" Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.357185 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7" Mar 19 16:52:43 crc kubenswrapper[4918]: I0319 16:52:43.755173 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7"] Mar 19 16:52:44 crc kubenswrapper[4918]: I0319 16:52:44.280428 4918 generic.go:334] "Generic (PLEG): container finished" podID="a4b4c7cb-18de-4527-a4b5-859f45243567" containerID="a8f0dcacfd3e025bb5af1d61565c152bce0d946bd7c6ce53470ae5d16d50f972" exitCode=0 Mar 19 16:52:44 crc kubenswrapper[4918]: I0319 16:52:44.280470 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7" event={"ID":"a4b4c7cb-18de-4527-a4b5-859f45243567","Type":"ContainerDied","Data":"a8f0dcacfd3e025bb5af1d61565c152bce0d946bd7c6ce53470ae5d16d50f972"} Mar 19 16:52:44 crc kubenswrapper[4918]: I0319 16:52:44.280509 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7" event={"ID":"a4b4c7cb-18de-4527-a4b5-859f45243567","Type":"ContainerStarted","Data":"cb64e89476dc070ad234a1b51a3d7c4bc4cbd4d95af1737bbbdbafb799adea3c"} Mar 19 16:52:44 crc kubenswrapper[4918]: I0319 16:52:44.530119 4918 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vdlb4" Mar 19 16:52:44 crc kubenswrapper[4918]: I0319 16:52:44.530391 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vdlb4" Mar 19 16:52:44 crc kubenswrapper[4918]: I0319 16:52:44.580930 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vdlb4" Mar 19 16:52:44 crc kubenswrapper[4918]: I0319 16:52:44.596466 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ec66dca-e06d-4ebc-9b41-64cfa3b2c510" path="/var/lib/kubelet/pods/6ec66dca-e06d-4ebc-9b41-64cfa3b2c510/volumes" Mar 19 16:52:45 crc kubenswrapper[4918]: I0319 16:52:45.330345 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vdlb4" Mar 19 16:52:45 crc kubenswrapper[4918]: I0319 16:52:45.350904 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Mar 19 16:52:45 crc kubenswrapper[4918]: I0319 16:52:45.351845 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Mar 19 16:52:45 crc kubenswrapper[4918]: I0319 16:52:45.353872 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Mar 19 16:52:45 crc kubenswrapper[4918]: I0319 16:52:45.354952 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Mar 19 16:52:45 crc kubenswrapper[4918]: I0319 16:52:45.355262 4918 reflector.go:368] Caches populated for *v1.Secret from object-"minio-dev"/"default-dockercfg-6rq8w" Mar 19 16:52:45 crc kubenswrapper[4918]: I0319 16:52:45.360765 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Mar 19 16:52:45 crc kubenswrapper[4918]: I0319 16:52:45.510855 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6f077045-6af8-4dcc-9ecd-f88c66389423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6f077045-6af8-4dcc-9ecd-f88c66389423\") pod \"minio\" (UID: \"0dd10691-e214-46f3-a401-564b36ad0343\") " pod="minio-dev/minio" Mar 19 16:52:45 crc kubenswrapper[4918]: I0319 16:52:45.511289 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj7vq\" (UniqueName: \"kubernetes.io/projected/0dd10691-e214-46f3-a401-564b36ad0343-kube-api-access-cj7vq\") pod \"minio\" (UID: \"0dd10691-e214-46f3-a401-564b36ad0343\") " pod="minio-dev/minio" Mar 19 16:52:45 crc kubenswrapper[4918]: I0319 16:52:45.612731 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6f077045-6af8-4dcc-9ecd-f88c66389423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6f077045-6af8-4dcc-9ecd-f88c66389423\") pod \"minio\" (UID: \"0dd10691-e214-46f3-a401-564b36ad0343\") " pod="minio-dev/minio" Mar 19 16:52:45 crc kubenswrapper[4918]: I0319 16:52:45.612804 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-cj7vq\" (UniqueName: \"kubernetes.io/projected/0dd10691-e214-46f3-a401-564b36ad0343-kube-api-access-cj7vq\") pod \"minio\" (UID: \"0dd10691-e214-46f3-a401-564b36ad0343\") " pod="minio-dev/minio" Mar 19 16:52:45 crc kubenswrapper[4918]: I0319 16:52:45.616424 4918 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 16:52:45 crc kubenswrapper[4918]: I0319 16:52:45.616480 4918 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6f077045-6af8-4dcc-9ecd-f88c66389423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6f077045-6af8-4dcc-9ecd-f88c66389423\") pod \"minio\" (UID: \"0dd10691-e214-46f3-a401-564b36ad0343\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c2f1325de3c8e348afccb5eb63fa6fd0f7d725c77b9f91125b3f1eeae05e3f92/globalmount\"" pod="minio-dev/minio" Mar 19 16:52:45 crc kubenswrapper[4918]: I0319 16:52:45.631506 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj7vq\" (UniqueName: \"kubernetes.io/projected/0dd10691-e214-46f3-a401-564b36ad0343-kube-api-access-cj7vq\") pod \"minio\" (UID: \"0dd10691-e214-46f3-a401-564b36ad0343\") " pod="minio-dev/minio" Mar 19 16:52:45 crc kubenswrapper[4918]: I0319 16:52:45.650918 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6f077045-6af8-4dcc-9ecd-f88c66389423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6f077045-6af8-4dcc-9ecd-f88c66389423\") pod \"minio\" (UID: \"0dd10691-e214-46f3-a401-564b36ad0343\") " pod="minio-dev/minio" Mar 19 16:52:45 crc kubenswrapper[4918]: I0319 16:52:45.671957 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Mar 19 16:52:45 crc kubenswrapper[4918]: I0319 16:52:45.848060 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Mar 19 16:52:45 crc kubenswrapper[4918]: W0319 16:52:45.856583 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0dd10691_e214_46f3_a401_564b36ad0343.slice/crio-83488b0d8bee2973e0847285d4c09c1dc006e03f4bf44b6d173772649634009d WatchSource:0}: Error finding container 83488b0d8bee2973e0847285d4c09c1dc006e03f4bf44b6d173772649634009d: Status 404 returned error can't find the container with id 83488b0d8bee2973e0847285d4c09c1dc006e03f4bf44b6d173772649634009d Mar 19 16:52:46 crc kubenswrapper[4918]: I0319 16:52:46.297298 4918 generic.go:334] "Generic (PLEG): container finished" podID="a4b4c7cb-18de-4527-a4b5-859f45243567" containerID="aa9b9c20c53b29e5cac1fbbe4046df5220e1b906973a375cc41ea2abfbeac4e9" exitCode=0 Mar 19 16:52:46 crc kubenswrapper[4918]: I0319 16:52:46.297405 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7" event={"ID":"a4b4c7cb-18de-4527-a4b5-859f45243567","Type":"ContainerDied","Data":"aa9b9c20c53b29e5cac1fbbe4046df5220e1b906973a375cc41ea2abfbeac4e9"} Mar 19 16:52:46 crc kubenswrapper[4918]: I0319 16:52:46.300572 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"0dd10691-e214-46f3-a401-564b36ad0343","Type":"ContainerStarted","Data":"83488b0d8bee2973e0847285d4c09c1dc006e03f4bf44b6d173772649634009d"} Mar 19 16:52:47 crc kubenswrapper[4918]: I0319 16:52:47.324548 4918 generic.go:334] "Generic (PLEG): container finished" podID="a4b4c7cb-18de-4527-a4b5-859f45243567" containerID="d620c5fb76de8a68cc2351848c33e17be6f47df9421314c027ad4e22bc8f11d6" exitCode=0 Mar 19 16:52:47 crc kubenswrapper[4918]: I0319 16:52:47.324656 4918 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7" event={"ID":"a4b4c7cb-18de-4527-a4b5-859f45243567","Type":"ContainerDied","Data":"d620c5fb76de8a68cc2351848c33e17be6f47df9421314c027ad4e22bc8f11d6"} Mar 19 16:52:49 crc kubenswrapper[4918]: I0319 16:52:49.634153 4918 scope.go:117] "RemoveContainer" containerID="2333a985f057bcdb5f4dc1c789cdd72e1fbd69cd342e6ba9b32b4c0342e5d040" Mar 19 16:52:49 crc kubenswrapper[4918]: I0319 16:52:49.852140 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7" Mar 19 16:52:49 crc kubenswrapper[4918]: I0319 16:52:49.986707 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdlb4"] Mar 19 16:52:49 crc kubenswrapper[4918]: I0319 16:52:49.986916 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vdlb4" podUID="e9b003c0-f449-49ae-8ff2-882a0a350671" containerName="registry-server" containerID="cri-o://e2420b9d880c1419b3a26ef4a96c14ec327d34596eaa8ccc5a13c87ec4f272c6" gracePeriod=2 Mar 19 16:52:50 crc kubenswrapper[4918]: I0319 16:52:50.004453 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stbw7\" (UniqueName: \"kubernetes.io/projected/a4b4c7cb-18de-4527-a4b5-859f45243567-kube-api-access-stbw7\") pod \"a4b4c7cb-18de-4527-a4b5-859f45243567\" (UID: \"a4b4c7cb-18de-4527-a4b5-859f45243567\") " Mar 19 16:52:50 crc kubenswrapper[4918]: I0319 16:52:50.004843 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4b4c7cb-18de-4527-a4b5-859f45243567-util\") pod \"a4b4c7cb-18de-4527-a4b5-859f45243567\" (UID: \"a4b4c7cb-18de-4527-a4b5-859f45243567\") " Mar 19 16:52:50 crc kubenswrapper[4918]: I0319 16:52:50.004930 4918 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4b4c7cb-18de-4527-a4b5-859f45243567-bundle\") pod \"a4b4c7cb-18de-4527-a4b5-859f45243567\" (UID: \"a4b4c7cb-18de-4527-a4b5-859f45243567\") " Mar 19 16:52:50 crc kubenswrapper[4918]: I0319 16:52:50.006298 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4b4c7cb-18de-4527-a4b5-859f45243567-bundle" (OuterVolumeSpecName: "bundle") pod "a4b4c7cb-18de-4527-a4b5-859f45243567" (UID: "a4b4c7cb-18de-4527-a4b5-859f45243567"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:52:50 crc kubenswrapper[4918]: I0319 16:52:50.012769 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4b4c7cb-18de-4527-a4b5-859f45243567-kube-api-access-stbw7" (OuterVolumeSpecName: "kube-api-access-stbw7") pod "a4b4c7cb-18de-4527-a4b5-859f45243567" (UID: "a4b4c7cb-18de-4527-a4b5-859f45243567"). InnerVolumeSpecName "kube-api-access-stbw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:52:50 crc kubenswrapper[4918]: I0319 16:52:50.020398 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4b4c7cb-18de-4527-a4b5-859f45243567-util" (OuterVolumeSpecName: "util") pod "a4b4c7cb-18de-4527-a4b5-859f45243567" (UID: "a4b4c7cb-18de-4527-a4b5-859f45243567"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:52:50 crc kubenswrapper[4918]: I0319 16:52:50.107252 4918 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4b4c7cb-18de-4527-a4b5-859f45243567-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 16:52:50 crc kubenswrapper[4918]: I0319 16:52:50.107289 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stbw7\" (UniqueName: \"kubernetes.io/projected/a4b4c7cb-18de-4527-a4b5-859f45243567-kube-api-access-stbw7\") on node \"crc\" DevicePath \"\"" Mar 19 16:52:50 crc kubenswrapper[4918]: I0319 16:52:50.107300 4918 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4b4c7cb-18de-4527-a4b5-859f45243567-util\") on node \"crc\" DevicePath \"\"" Mar 19 16:52:50 crc kubenswrapper[4918]: I0319 16:52:50.346238 4918 generic.go:334] "Generic (PLEG): container finished" podID="e9b003c0-f449-49ae-8ff2-882a0a350671" containerID="e2420b9d880c1419b3a26ef4a96c14ec327d34596eaa8ccc5a13c87ec4f272c6" exitCode=0 Mar 19 16:52:50 crc kubenswrapper[4918]: I0319 16:52:50.346324 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdlb4" event={"ID":"e9b003c0-f449-49ae-8ff2-882a0a350671","Type":"ContainerDied","Data":"e2420b9d880c1419b3a26ef4a96c14ec327d34596eaa8ccc5a13c87ec4f272c6"} Mar 19 16:52:50 crc kubenswrapper[4918]: I0319 16:52:50.348785 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7" event={"ID":"a4b4c7cb-18de-4527-a4b5-859f45243567","Type":"ContainerDied","Data":"cb64e89476dc070ad234a1b51a3d7c4bc4cbd4d95af1737bbbdbafb799adea3c"} Mar 19 16:52:50 crc kubenswrapper[4918]: I0319 16:52:50.348824 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb64e89476dc070ad234a1b51a3d7c4bc4cbd4d95af1737bbbdbafb799adea3c" Mar 19 
16:52:50 crc kubenswrapper[4918]: I0319 16:52:50.348861 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7" Mar 19 16:52:50 crc kubenswrapper[4918]: I0319 16:52:50.415853 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdlb4" Mar 19 16:52:50 crc kubenswrapper[4918]: I0319 16:52:50.615199 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9b003c0-f449-49ae-8ff2-882a0a350671-catalog-content\") pod \"e9b003c0-f449-49ae-8ff2-882a0a350671\" (UID: \"e9b003c0-f449-49ae-8ff2-882a0a350671\") " Mar 19 16:52:50 crc kubenswrapper[4918]: I0319 16:52:50.615290 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc7fx\" (UniqueName: \"kubernetes.io/projected/e9b003c0-f449-49ae-8ff2-882a0a350671-kube-api-access-xc7fx\") pod \"e9b003c0-f449-49ae-8ff2-882a0a350671\" (UID: \"e9b003c0-f449-49ae-8ff2-882a0a350671\") " Mar 19 16:52:50 crc kubenswrapper[4918]: I0319 16:52:50.615337 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9b003c0-f449-49ae-8ff2-882a0a350671-utilities\") pod \"e9b003c0-f449-49ae-8ff2-882a0a350671\" (UID: \"e9b003c0-f449-49ae-8ff2-882a0a350671\") " Mar 19 16:52:50 crc kubenswrapper[4918]: I0319 16:52:50.616270 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9b003c0-f449-49ae-8ff2-882a0a350671-utilities" (OuterVolumeSpecName: "utilities") pod "e9b003c0-f449-49ae-8ff2-882a0a350671" (UID: "e9b003c0-f449-49ae-8ff2-882a0a350671"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:52:50 crc kubenswrapper[4918]: I0319 16:52:50.623280 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9b003c0-f449-49ae-8ff2-882a0a350671-kube-api-access-xc7fx" (OuterVolumeSpecName: "kube-api-access-xc7fx") pod "e9b003c0-f449-49ae-8ff2-882a0a350671" (UID: "e9b003c0-f449-49ae-8ff2-882a0a350671"). InnerVolumeSpecName "kube-api-access-xc7fx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:52:50 crc kubenswrapper[4918]: I0319 16:52:50.656687 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9b003c0-f449-49ae-8ff2-882a0a350671-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9b003c0-f449-49ae-8ff2-882a0a350671" (UID: "e9b003c0-f449-49ae-8ff2-882a0a350671"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:52:50 crc kubenswrapper[4918]: I0319 16:52:50.716474 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9b003c0-f449-49ae-8ff2-882a0a350671-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 16:52:50 crc kubenswrapper[4918]: I0319 16:52:50.716510 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc7fx\" (UniqueName: \"kubernetes.io/projected/e9b003c0-f449-49ae-8ff2-882a0a350671-kube-api-access-xc7fx\") on node \"crc\" DevicePath \"\"" Mar 19 16:52:50 crc kubenswrapper[4918]: I0319 16:52:50.716546 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9b003c0-f449-49ae-8ff2-882a0a350671-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 16:52:51 crc kubenswrapper[4918]: I0319 16:52:51.360126 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdlb4" Mar 19 16:52:51 crc kubenswrapper[4918]: I0319 16:52:51.360125 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdlb4" event={"ID":"e9b003c0-f449-49ae-8ff2-882a0a350671","Type":"ContainerDied","Data":"f4109d194c7824adfa3bef139424fafb49efa821acd6510fb8a1ba3e91bbcdc6"} Mar 19 16:52:51 crc kubenswrapper[4918]: I0319 16:52:51.360377 4918 scope.go:117] "RemoveContainer" containerID="e2420b9d880c1419b3a26ef4a96c14ec327d34596eaa8ccc5a13c87ec4f272c6" Mar 19 16:52:51 crc kubenswrapper[4918]: I0319 16:52:51.362291 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"0dd10691-e214-46f3-a401-564b36ad0343","Type":"ContainerStarted","Data":"c85fb46be666319e58214649bde49db03a95d58d20caabdceada3f44cf9d64c8"} Mar 19 16:52:51 crc kubenswrapper[4918]: I0319 16:52:51.388847 4918 scope.go:117] "RemoveContainer" containerID="447c9b02714429c08ad5e316d200e311f24ac87d4bdccd5f8a7692e2406275c9" Mar 19 16:52:51 crc kubenswrapper[4918]: I0319 16:52:51.394177 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=3.9953397710000003 podStartE2EDuration="8.3941492s" podCreationTimestamp="2026-03-19 16:52:43 +0000 UTC" firstStartedPulling="2026-03-19 16:52:45.858820315 +0000 UTC m=+777.981019563" lastFinishedPulling="2026-03-19 16:52:50.257629744 +0000 UTC m=+782.379828992" observedRunningTime="2026-03-19 16:52:51.388409623 +0000 UTC m=+783.510608881" watchObservedRunningTime="2026-03-19 16:52:51.3941492 +0000 UTC m=+783.516348488" Mar 19 16:52:51 crc kubenswrapper[4918]: I0319 16:52:51.421587 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdlb4"] Mar 19 16:52:51 crc kubenswrapper[4918]: I0319 16:52:51.427626 4918 scope.go:117] "RemoveContainer" 
containerID="b8a673761a6b0c6a98d8e2ad1a4ed95b999baa6b2cd75a5e5e6fb6a8eb04e0f1" Mar 19 16:52:51 crc kubenswrapper[4918]: I0319 16:52:51.432461 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdlb4"] Mar 19 16:52:52 crc kubenswrapper[4918]: I0319 16:52:52.592213 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9b003c0-f449-49ae-8ff2-882a0a350671" path="/var/lib/kubelet/pods/e9b003c0-f449-49ae-8ff2-882a0a350671/volumes" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.523742 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-5d575bbbdc-jmzdk"] Mar 19 16:52:55 crc kubenswrapper[4918]: E0319 16:52:55.524240 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4b4c7cb-18de-4527-a4b5-859f45243567" containerName="util" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.524253 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4b4c7cb-18de-4527-a4b5-859f45243567" containerName="util" Mar 19 16:52:55 crc kubenswrapper[4918]: E0319 16:52:55.524268 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4b4c7cb-18de-4527-a4b5-859f45243567" containerName="extract" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.524274 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4b4c7cb-18de-4527-a4b5-859f45243567" containerName="extract" Mar 19 16:52:55 crc kubenswrapper[4918]: E0319 16:52:55.524282 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9b003c0-f449-49ae-8ff2-882a0a350671" containerName="extract-content" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.524288 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9b003c0-f449-49ae-8ff2-882a0a350671" containerName="extract-content" Mar 19 16:52:55 crc kubenswrapper[4918]: E0319 16:52:55.524299 4918 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e9b003c0-f449-49ae-8ff2-882a0a350671" containerName="registry-server" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.524305 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9b003c0-f449-49ae-8ff2-882a0a350671" containerName="registry-server" Mar 19 16:52:55 crc kubenswrapper[4918]: E0319 16:52:55.524312 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9b003c0-f449-49ae-8ff2-882a0a350671" containerName="extract-utilities" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.524318 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9b003c0-f449-49ae-8ff2-882a0a350671" containerName="extract-utilities" Mar 19 16:52:55 crc kubenswrapper[4918]: E0319 16:52:55.524326 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4b4c7cb-18de-4527-a4b5-859f45243567" containerName="pull" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.524331 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4b4c7cb-18de-4527-a4b5-859f45243567" containerName="pull" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.524446 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4b4c7cb-18de-4527-a4b5-859f45243567" containerName="extract" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.524458 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9b003c0-f449-49ae-8ff2-882a0a350671" containerName="registry-server" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.525125 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-5d575bbbdc-jmzdk" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.530502 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.530611 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.533150 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-mgb2t" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.535308 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.536042 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.540374 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.558952 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-5d575bbbdc-jmzdk"] Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.679400 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/4c7a4c16-5343-41a6-a08a-e939ac44aa1a-manager-config\") pod \"loki-operator-controller-manager-5d575bbbdc-jmzdk\" (UID: \"4c7a4c16-5343-41a6-a08a-e939ac44aa1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5d575bbbdc-jmzdk" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 
16:52:55.679448 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hn52\" (UniqueName: \"kubernetes.io/projected/4c7a4c16-5343-41a6-a08a-e939ac44aa1a-kube-api-access-8hn52\") pod \"loki-operator-controller-manager-5d575bbbdc-jmzdk\" (UID: \"4c7a4c16-5343-41a6-a08a-e939ac44aa1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5d575bbbdc-jmzdk" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.679473 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c7a4c16-5343-41a6-a08a-e939ac44aa1a-apiservice-cert\") pod \"loki-operator-controller-manager-5d575bbbdc-jmzdk\" (UID: \"4c7a4c16-5343-41a6-a08a-e939ac44aa1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5d575bbbdc-jmzdk" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.679623 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4c7a4c16-5343-41a6-a08a-e939ac44aa1a-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-5d575bbbdc-jmzdk\" (UID: \"4c7a4c16-5343-41a6-a08a-e939ac44aa1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5d575bbbdc-jmzdk" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.679681 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4c7a4c16-5343-41a6-a08a-e939ac44aa1a-webhook-cert\") pod \"loki-operator-controller-manager-5d575bbbdc-jmzdk\" (UID: \"4c7a4c16-5343-41a6-a08a-e939ac44aa1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5d575bbbdc-jmzdk" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.780465 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4c7a4c16-5343-41a6-a08a-e939ac44aa1a-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-5d575bbbdc-jmzdk\" (UID: \"4c7a4c16-5343-41a6-a08a-e939ac44aa1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5d575bbbdc-jmzdk" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.780567 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4c7a4c16-5343-41a6-a08a-e939ac44aa1a-webhook-cert\") pod \"loki-operator-controller-manager-5d575bbbdc-jmzdk\" (UID: \"4c7a4c16-5343-41a6-a08a-e939ac44aa1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5d575bbbdc-jmzdk" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.780598 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/4c7a4c16-5343-41a6-a08a-e939ac44aa1a-manager-config\") pod \"loki-operator-controller-manager-5d575bbbdc-jmzdk\" (UID: \"4c7a4c16-5343-41a6-a08a-e939ac44aa1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5d575bbbdc-jmzdk" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.780617 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hn52\" (UniqueName: \"kubernetes.io/projected/4c7a4c16-5343-41a6-a08a-e939ac44aa1a-kube-api-access-8hn52\") pod \"loki-operator-controller-manager-5d575bbbdc-jmzdk\" (UID: \"4c7a4c16-5343-41a6-a08a-e939ac44aa1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5d575bbbdc-jmzdk" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.780635 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c7a4c16-5343-41a6-a08a-e939ac44aa1a-apiservice-cert\") pod \"loki-operator-controller-manager-5d575bbbdc-jmzdk\" (UID: 
\"4c7a4c16-5343-41a6-a08a-e939ac44aa1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5d575bbbdc-jmzdk" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.782244 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/4c7a4c16-5343-41a6-a08a-e939ac44aa1a-manager-config\") pod \"loki-operator-controller-manager-5d575bbbdc-jmzdk\" (UID: \"4c7a4c16-5343-41a6-a08a-e939ac44aa1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5d575bbbdc-jmzdk" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.788443 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4c7a4c16-5343-41a6-a08a-e939ac44aa1a-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-5d575bbbdc-jmzdk\" (UID: \"4c7a4c16-5343-41a6-a08a-e939ac44aa1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5d575bbbdc-jmzdk" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.788918 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4c7a4c16-5343-41a6-a08a-e939ac44aa1a-webhook-cert\") pod \"loki-operator-controller-manager-5d575bbbdc-jmzdk\" (UID: \"4c7a4c16-5343-41a6-a08a-e939ac44aa1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5d575bbbdc-jmzdk" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.799791 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hn52\" (UniqueName: \"kubernetes.io/projected/4c7a4c16-5343-41a6-a08a-e939ac44aa1a-kube-api-access-8hn52\") pod \"loki-operator-controller-manager-5d575bbbdc-jmzdk\" (UID: \"4c7a4c16-5343-41a6-a08a-e939ac44aa1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5d575bbbdc-jmzdk" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.804092 4918 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c7a4c16-5343-41a6-a08a-e939ac44aa1a-apiservice-cert\") pod \"loki-operator-controller-manager-5d575bbbdc-jmzdk\" (UID: \"4c7a4c16-5343-41a6-a08a-e939ac44aa1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5d575bbbdc-jmzdk" Mar 19 16:52:55 crc kubenswrapper[4918]: I0319 16:52:55.842770 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-5d575bbbdc-jmzdk" Mar 19 16:52:56 crc kubenswrapper[4918]: I0319 16:52:56.081217 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-5d575bbbdc-jmzdk"] Mar 19 16:52:56 crc kubenswrapper[4918]: W0319 16:52:56.088697 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c7a4c16_5343_41a6_a08a_e939ac44aa1a.slice/crio-e79058a0e51f9b8fb9718a7e6ff6ea12598137988ddd63cc608cce5ace0b0544 WatchSource:0}: Error finding container e79058a0e51f9b8fb9718a7e6ff6ea12598137988ddd63cc608cce5ace0b0544: Status 404 returned error can't find the container with id e79058a0e51f9b8fb9718a7e6ff6ea12598137988ddd63cc608cce5ace0b0544 Mar 19 16:52:56 crc kubenswrapper[4918]: I0319 16:52:56.393800 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5d575bbbdc-jmzdk" event={"ID":"4c7a4c16-5343-41a6-a08a-e939ac44aa1a","Type":"ContainerStarted","Data":"e79058a0e51f9b8fb9718a7e6ff6ea12598137988ddd63cc608cce5ace0b0544"} Mar 19 16:52:58 crc kubenswrapper[4918]: I0319 16:52:58.211831 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Mar 19 16:52:58 crc kubenswrapper[4918]: I0319 16:52:58.212109 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 16:53:00 crc kubenswrapper[4918]: I0319 16:53:00.860295 4918 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 16:53:01 crc kubenswrapper[4918]: I0319 16:53:01.439933 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5d575bbbdc-jmzdk" event={"ID":"4c7a4c16-5343-41a6-a08a-e939ac44aa1a","Type":"ContainerStarted","Data":"816fb39be3974a0069d779c7232c68917dc01328a34d037eb65eae932baf1a82"} Mar 19 16:53:08 crc kubenswrapper[4918]: I0319 16:53:08.485148 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5d575bbbdc-jmzdk" event={"ID":"4c7a4c16-5343-41a6-a08a-e939ac44aa1a","Type":"ContainerStarted","Data":"aa34172f82769ee38dc9bfdf920a5255042c7d1bbacfb09370018f5db9156f5d"} Mar 19 16:53:08 crc kubenswrapper[4918]: I0319 16:53:08.486063 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-5d575bbbdc-jmzdk" Mar 19 16:53:08 crc kubenswrapper[4918]: I0319 16:53:08.490148 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-5d575bbbdc-jmzdk" Mar 19 16:53:08 crc kubenswrapper[4918]: I0319 16:53:08.539557 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-5d575bbbdc-jmzdk" podStartSLOduration=1.90046419 
podStartE2EDuration="13.539498276s" podCreationTimestamp="2026-03-19 16:52:55 +0000 UTC" firstStartedPulling="2026-03-19 16:52:56.090970124 +0000 UTC m=+788.213169372" lastFinishedPulling="2026-03-19 16:53:07.73000421 +0000 UTC m=+799.852203458" observedRunningTime="2026-03-19 16:53:08.522680317 +0000 UTC m=+800.644879655" watchObservedRunningTime="2026-03-19 16:53:08.539498276 +0000 UTC m=+800.661697564" Mar 19 16:53:28 crc kubenswrapper[4918]: I0319 16:53:28.212604 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 16:53:28 crc kubenswrapper[4918]: I0319 16:53:28.213105 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 16:53:40 crc kubenswrapper[4918]: I0319 16:53:40.705516 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn"] Mar 19 16:53:40 crc kubenswrapper[4918]: I0319 16:53:40.707088 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn" Mar 19 16:53:40 crc kubenswrapper[4918]: I0319 16:53:40.708830 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 19 16:53:40 crc kubenswrapper[4918]: I0319 16:53:40.722432 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn"] Mar 19 16:53:40 crc kubenswrapper[4918]: I0319 16:53:40.823827 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a33b8fae-e1c3-428a-b08e-1afb2e142412-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn\" (UID: \"a33b8fae-e1c3-428a-b08e-1afb2e142412\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn" Mar 19 16:53:40 crc kubenswrapper[4918]: I0319 16:53:40.823921 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a33b8fae-e1c3-428a-b08e-1afb2e142412-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn\" (UID: \"a33b8fae-e1c3-428a-b08e-1afb2e142412\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn" Mar 19 16:53:40 crc kubenswrapper[4918]: I0319 16:53:40.823957 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6z28\" (UniqueName: \"kubernetes.io/projected/a33b8fae-e1c3-428a-b08e-1afb2e142412-kube-api-access-l6z28\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn\" (UID: \"a33b8fae-e1c3-428a-b08e-1afb2e142412\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn" Mar 19 16:53:40 crc kubenswrapper[4918]: 
I0319 16:53:40.925124 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a33b8fae-e1c3-428a-b08e-1afb2e142412-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn\" (UID: \"a33b8fae-e1c3-428a-b08e-1afb2e142412\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn" Mar 19 16:53:40 crc kubenswrapper[4918]: I0319 16:53:40.925643 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a33b8fae-e1c3-428a-b08e-1afb2e142412-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn\" (UID: \"a33b8fae-e1c3-428a-b08e-1afb2e142412\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn" Mar 19 16:53:40 crc kubenswrapper[4918]: I0319 16:53:40.925594 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a33b8fae-e1c3-428a-b08e-1afb2e142412-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn\" (UID: \"a33b8fae-e1c3-428a-b08e-1afb2e142412\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn" Mar 19 16:53:40 crc kubenswrapper[4918]: I0319 16:53:40.925718 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6z28\" (UniqueName: \"kubernetes.io/projected/a33b8fae-e1c3-428a-b08e-1afb2e142412-kube-api-access-l6z28\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn\" (UID: \"a33b8fae-e1c3-428a-b08e-1afb2e142412\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn" Mar 19 16:53:40 crc kubenswrapper[4918]: I0319 16:53:40.926256 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/a33b8fae-e1c3-428a-b08e-1afb2e142412-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn\" (UID: \"a33b8fae-e1c3-428a-b08e-1afb2e142412\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn" Mar 19 16:53:40 crc kubenswrapper[4918]: I0319 16:53:40.968330 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6z28\" (UniqueName: \"kubernetes.io/projected/a33b8fae-e1c3-428a-b08e-1afb2e142412-kube-api-access-l6z28\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn\" (UID: \"a33b8fae-e1c3-428a-b08e-1afb2e142412\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn" Mar 19 16:53:41 crc kubenswrapper[4918]: I0319 16:53:41.028578 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn" Mar 19 16:53:41 crc kubenswrapper[4918]: I0319 16:53:41.291285 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn"] Mar 19 16:53:41 crc kubenswrapper[4918]: I0319 16:53:41.766476 4918 generic.go:334] "Generic (PLEG): container finished" podID="a33b8fae-e1c3-428a-b08e-1afb2e142412" containerID="5b4b17865b6da34210643f637f870e9da97fc8564e44eebcce23e600078ed06c" exitCode=0 Mar 19 16:53:41 crc kubenswrapper[4918]: I0319 16:53:41.766535 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn" event={"ID":"a33b8fae-e1c3-428a-b08e-1afb2e142412","Type":"ContainerDied","Data":"5b4b17865b6da34210643f637f870e9da97fc8564e44eebcce23e600078ed06c"} Mar 19 16:53:41 crc kubenswrapper[4918]: I0319 16:53:41.766566 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn" event={"ID":"a33b8fae-e1c3-428a-b08e-1afb2e142412","Type":"ContainerStarted","Data":"10d909ce6f76e09d6297c854b6af48f179cf94c539e72c23874167925b5b7d50"} Mar 19 16:53:43 crc kubenswrapper[4918]: I0319 16:53:43.786133 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn" event={"ID":"a33b8fae-e1c3-428a-b08e-1afb2e142412","Type":"ContainerStarted","Data":"a82bb5ab1c57e2cd964c9d84b938aac72ea380ecb02f47e3ecdf7ee41321c1af"} Mar 19 16:53:44 crc kubenswrapper[4918]: I0319 16:53:44.793840 4918 generic.go:334] "Generic (PLEG): container finished" podID="a33b8fae-e1c3-428a-b08e-1afb2e142412" containerID="a82bb5ab1c57e2cd964c9d84b938aac72ea380ecb02f47e3ecdf7ee41321c1af" exitCode=0 Mar 19 16:53:44 crc kubenswrapper[4918]: I0319 16:53:44.793914 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn" event={"ID":"a33b8fae-e1c3-428a-b08e-1afb2e142412","Type":"ContainerDied","Data":"a82bb5ab1c57e2cd964c9d84b938aac72ea380ecb02f47e3ecdf7ee41321c1af"} Mar 19 16:53:45 crc kubenswrapper[4918]: I0319 16:53:45.806182 4918 generic.go:334] "Generic (PLEG): container finished" podID="a33b8fae-e1c3-428a-b08e-1afb2e142412" containerID="6704618823c065fec3a736fa1a56b23cb1cedb31f01ec7a2aa3944e9149add32" exitCode=0 Mar 19 16:53:45 crc kubenswrapper[4918]: I0319 16:53:45.806246 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn" event={"ID":"a33b8fae-e1c3-428a-b08e-1afb2e142412","Type":"ContainerDied","Data":"6704618823c065fec3a736fa1a56b23cb1cedb31f01ec7a2aa3944e9149add32"} Mar 19 16:53:47 crc kubenswrapper[4918]: I0319 16:53:47.079310 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn" Mar 19 16:53:47 crc kubenswrapper[4918]: I0319 16:53:47.208485 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6z28\" (UniqueName: \"kubernetes.io/projected/a33b8fae-e1c3-428a-b08e-1afb2e142412-kube-api-access-l6z28\") pod \"a33b8fae-e1c3-428a-b08e-1afb2e142412\" (UID: \"a33b8fae-e1c3-428a-b08e-1afb2e142412\") " Mar 19 16:53:47 crc kubenswrapper[4918]: I0319 16:53:47.208689 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a33b8fae-e1c3-428a-b08e-1afb2e142412-util\") pod \"a33b8fae-e1c3-428a-b08e-1afb2e142412\" (UID: \"a33b8fae-e1c3-428a-b08e-1afb2e142412\") " Mar 19 16:53:47 crc kubenswrapper[4918]: I0319 16:53:47.208852 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a33b8fae-e1c3-428a-b08e-1afb2e142412-bundle\") pod \"a33b8fae-e1c3-428a-b08e-1afb2e142412\" (UID: \"a33b8fae-e1c3-428a-b08e-1afb2e142412\") " Mar 19 16:53:47 crc kubenswrapper[4918]: I0319 16:53:47.209437 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a33b8fae-e1c3-428a-b08e-1afb2e142412-bundle" (OuterVolumeSpecName: "bundle") pod "a33b8fae-e1c3-428a-b08e-1afb2e142412" (UID: "a33b8fae-e1c3-428a-b08e-1afb2e142412"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:53:47 crc kubenswrapper[4918]: I0319 16:53:47.213755 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a33b8fae-e1c3-428a-b08e-1afb2e142412-kube-api-access-l6z28" (OuterVolumeSpecName: "kube-api-access-l6z28") pod "a33b8fae-e1c3-428a-b08e-1afb2e142412" (UID: "a33b8fae-e1c3-428a-b08e-1afb2e142412"). InnerVolumeSpecName "kube-api-access-l6z28". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:53:47 crc kubenswrapper[4918]: I0319 16:53:47.311192 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6z28\" (UniqueName: \"kubernetes.io/projected/a33b8fae-e1c3-428a-b08e-1afb2e142412-kube-api-access-l6z28\") on node \"crc\" DevicePath \"\"" Mar 19 16:53:47 crc kubenswrapper[4918]: I0319 16:53:47.311245 4918 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a33b8fae-e1c3-428a-b08e-1afb2e142412-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 16:53:47 crc kubenswrapper[4918]: I0319 16:53:47.331064 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a33b8fae-e1c3-428a-b08e-1afb2e142412-util" (OuterVolumeSpecName: "util") pod "a33b8fae-e1c3-428a-b08e-1afb2e142412" (UID: "a33b8fae-e1c3-428a-b08e-1afb2e142412"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:53:47 crc kubenswrapper[4918]: I0319 16:53:47.412087 4918 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a33b8fae-e1c3-428a-b08e-1afb2e142412-util\") on node \"crc\" DevicePath \"\"" Mar 19 16:53:47 crc kubenswrapper[4918]: I0319 16:53:47.823695 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn" event={"ID":"a33b8fae-e1c3-428a-b08e-1afb2e142412","Type":"ContainerDied","Data":"10d909ce6f76e09d6297c854b6af48f179cf94c539e72c23874167925b5b7d50"} Mar 19 16:53:47 crc kubenswrapper[4918]: I0319 16:53:47.823732 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10d909ce6f76e09d6297c854b6af48f179cf94c539e72c23874167925b5b7d50" Mar 19 16:53:47 crc kubenswrapper[4918]: I0319 16:53:47.823883 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn" Mar 19 16:53:52 crc kubenswrapper[4918]: I0319 16:53:52.776225 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-ngkkz"] Mar 19 16:53:52 crc kubenswrapper[4918]: E0319 16:53:52.777748 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a33b8fae-e1c3-428a-b08e-1afb2e142412" containerName="util" Mar 19 16:53:52 crc kubenswrapper[4918]: I0319 16:53:52.777935 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="a33b8fae-e1c3-428a-b08e-1afb2e142412" containerName="util" Mar 19 16:53:52 crc kubenswrapper[4918]: E0319 16:53:52.777975 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a33b8fae-e1c3-428a-b08e-1afb2e142412" containerName="pull" Mar 19 16:53:52 crc kubenswrapper[4918]: I0319 16:53:52.777994 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="a33b8fae-e1c3-428a-b08e-1afb2e142412" containerName="pull" Mar 19 16:53:52 crc kubenswrapper[4918]: E0319 16:53:52.778020 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a33b8fae-e1c3-428a-b08e-1afb2e142412" containerName="extract" Mar 19 16:53:52 crc kubenswrapper[4918]: I0319 16:53:52.778041 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="a33b8fae-e1c3-428a-b08e-1afb2e142412" containerName="extract" Mar 19 16:53:52 crc kubenswrapper[4918]: I0319 16:53:52.778292 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="a33b8fae-e1c3-428a-b08e-1afb2e142412" containerName="extract" Mar 19 16:53:52 crc kubenswrapper[4918]: I0319 16:53:52.779655 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-ngkkz" Mar 19 16:53:52 crc kubenswrapper[4918]: I0319 16:53:52.784908 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 19 16:53:52 crc kubenswrapper[4918]: I0319 16:53:52.784910 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 19 16:53:52 crc kubenswrapper[4918]: I0319 16:53:52.784926 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-mph65" Mar 19 16:53:52 crc kubenswrapper[4918]: I0319 16:53:52.788740 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-ngkkz"] Mar 19 16:53:52 crc kubenswrapper[4918]: I0319 16:53:52.881664 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krk4f\" (UniqueName: \"kubernetes.io/projected/12300214-878a-4066-9155-654f4a1c6e88-kube-api-access-krk4f\") pod \"nmstate-operator-796d4cfff4-ngkkz\" (UID: \"12300214-878a-4066-9155-654f4a1c6e88\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-ngkkz" Mar 19 16:53:52 crc kubenswrapper[4918]: I0319 16:53:52.982993 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krk4f\" (UniqueName: \"kubernetes.io/projected/12300214-878a-4066-9155-654f4a1c6e88-kube-api-access-krk4f\") pod \"nmstate-operator-796d4cfff4-ngkkz\" (UID: \"12300214-878a-4066-9155-654f4a1c6e88\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-ngkkz" Mar 19 16:53:53 crc kubenswrapper[4918]: I0319 16:53:53.003954 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krk4f\" (UniqueName: \"kubernetes.io/projected/12300214-878a-4066-9155-654f4a1c6e88-kube-api-access-krk4f\") pod \"nmstate-operator-796d4cfff4-ngkkz\" (UID: 
\"12300214-878a-4066-9155-654f4a1c6e88\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-ngkkz" Mar 19 16:53:53 crc kubenswrapper[4918]: I0319 16:53:53.108617 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-ngkkz" Mar 19 16:53:53 crc kubenswrapper[4918]: I0319 16:53:53.519214 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-ngkkz"] Mar 19 16:53:53 crc kubenswrapper[4918]: W0319 16:53:53.531499 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12300214_878a_4066_9155_654f4a1c6e88.slice/crio-0b7c199c37a431717f72671ef890eacee45929333d0815651aeac874ca6a02b6 WatchSource:0}: Error finding container 0b7c199c37a431717f72671ef890eacee45929333d0815651aeac874ca6a02b6: Status 404 returned error can't find the container with id 0b7c199c37a431717f72671ef890eacee45929333d0815651aeac874ca6a02b6 Mar 19 16:53:53 crc kubenswrapper[4918]: I0319 16:53:53.861583 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-ngkkz" event={"ID":"12300214-878a-4066-9155-654f4a1c6e88","Type":"ContainerStarted","Data":"0b7c199c37a431717f72671ef890eacee45929333d0815651aeac874ca6a02b6"} Mar 19 16:53:56 crc kubenswrapper[4918]: I0319 16:53:56.879840 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-ngkkz" event={"ID":"12300214-878a-4066-9155-654f4a1c6e88","Type":"ContainerStarted","Data":"f8722fc061b5ba49ed16ef5f889bae31148c404c63b087eae6ccd94103562c67"} Mar 19 16:53:56 crc kubenswrapper[4918]: I0319 16:53:56.897896 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-ngkkz" podStartSLOduration=2.539580204 podStartE2EDuration="4.89788039s" podCreationTimestamp="2026-03-19 16:53:52 +0000 UTC" 
firstStartedPulling="2026-03-19 16:53:53.534837267 +0000 UTC m=+845.657036555" lastFinishedPulling="2026-03-19 16:53:55.893137463 +0000 UTC m=+848.015336741" observedRunningTime="2026-03-19 16:53:56.893938863 +0000 UTC m=+849.016138111" watchObservedRunningTime="2026-03-19 16:53:56.89788039 +0000 UTC m=+849.020079638" Mar 19 16:53:58 crc kubenswrapper[4918]: I0319 16:53:58.212277 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 16:53:58 crc kubenswrapper[4918]: I0319 16:53:58.212681 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 16:53:58 crc kubenswrapper[4918]: I0319 16:53:58.212753 4918 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 16:53:58 crc kubenswrapper[4918]: I0319 16:53:58.213629 4918 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"565e47777606eec4fea1871e422ec1703bb3c3550c00f538a28da566b1063407"} pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 16:53:58 crc kubenswrapper[4918]: I0319 16:53:58.213709 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" 
containerID="cri-o://565e47777606eec4fea1871e422ec1703bb3c3550c00f538a28da566b1063407" gracePeriod=600 Mar 19 16:53:58 crc kubenswrapper[4918]: I0319 16:53:58.911295 4918 generic.go:334] "Generic (PLEG): container finished" podID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerID="565e47777606eec4fea1871e422ec1703bb3c3550c00f538a28da566b1063407" exitCode=0 Mar 19 16:53:58 crc kubenswrapper[4918]: I0319 16:53:58.911609 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerDied","Data":"565e47777606eec4fea1871e422ec1703bb3c3550c00f538a28da566b1063407"} Mar 19 16:53:58 crc kubenswrapper[4918]: I0319 16:53:58.911634 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerStarted","Data":"ef900c9cacbbbaa6c19a9d710e828147883364fff3c1249c3116a090d326556c"} Mar 19 16:53:58 crc kubenswrapper[4918]: I0319 16:53:58.911649 4918 scope.go:117] "RemoveContainer" containerID="c18ec4217b3a3ee71c0f992036fec96e3dee59c09db0e6a1b2184c49948e8e26" Mar 19 16:54:00 crc kubenswrapper[4918]: I0319 16:54:00.156264 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565654-k4nkc"] Mar 19 16:54:00 crc kubenswrapper[4918]: I0319 16:54:00.158840 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565654-k4nkc" Mar 19 16:54:00 crc kubenswrapper[4918]: I0319 16:54:00.163350 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 16:54:00 crc kubenswrapper[4918]: I0319 16:54:00.163626 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 16:54:00 crc kubenswrapper[4918]: I0319 16:54:00.163668 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 16:54:00 crc kubenswrapper[4918]: I0319 16:54:00.171464 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565654-k4nkc"] Mar 19 16:54:00 crc kubenswrapper[4918]: I0319 16:54:00.288749 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55tmr\" (UniqueName: \"kubernetes.io/projected/7d407a1d-5f74-4628-bf4e-47e9fad34bb5-kube-api-access-55tmr\") pod \"auto-csr-approver-29565654-k4nkc\" (UID: \"7d407a1d-5f74-4628-bf4e-47e9fad34bb5\") " pod="openshift-infra/auto-csr-approver-29565654-k4nkc" Mar 19 16:54:00 crc kubenswrapper[4918]: I0319 16:54:00.389566 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55tmr\" (UniqueName: \"kubernetes.io/projected/7d407a1d-5f74-4628-bf4e-47e9fad34bb5-kube-api-access-55tmr\") pod \"auto-csr-approver-29565654-k4nkc\" (UID: \"7d407a1d-5f74-4628-bf4e-47e9fad34bb5\") " pod="openshift-infra/auto-csr-approver-29565654-k4nkc" Mar 19 16:54:00 crc kubenswrapper[4918]: I0319 16:54:00.406767 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55tmr\" (UniqueName: \"kubernetes.io/projected/7d407a1d-5f74-4628-bf4e-47e9fad34bb5-kube-api-access-55tmr\") pod \"auto-csr-approver-29565654-k4nkc\" (UID: \"7d407a1d-5f74-4628-bf4e-47e9fad34bb5\") " 
pod="openshift-infra/auto-csr-approver-29565654-k4nkc" Mar 19 16:54:00 crc kubenswrapper[4918]: I0319 16:54:00.509772 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565654-k4nkc" Mar 19 16:54:00 crc kubenswrapper[4918]: I0319 16:54:00.879193 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565654-k4nkc"] Mar 19 16:54:00 crc kubenswrapper[4918]: W0319 16:54:00.885793 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d407a1d_5f74_4628_bf4e_47e9fad34bb5.slice/crio-d79bf83ad3d10909a84a835bc84c4ef2995da8634fae0bb01411cdcebc79cfa4 WatchSource:0}: Error finding container d79bf83ad3d10909a84a835bc84c4ef2995da8634fae0bb01411cdcebc79cfa4: Status 404 returned error can't find the container with id d79bf83ad3d10909a84a835bc84c4ef2995da8634fae0bb01411cdcebc79cfa4 Mar 19 16:54:00 crc kubenswrapper[4918]: I0319 16:54:00.935056 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565654-k4nkc" event={"ID":"7d407a1d-5f74-4628-bf4e-47e9fad34bb5","Type":"ContainerStarted","Data":"d79bf83ad3d10909a84a835bc84c4ef2995da8634fae0bb01411cdcebc79cfa4"} Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.679349 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-6rwfd"] Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.680809 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-6rwfd" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.683148 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-7gdpr" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.686382 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-g7jgg"] Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.687962 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-g7jgg" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.694662 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.713692 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-g7jgg"] Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.728591 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-c8b48"] Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.729578 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-c8b48" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.748690 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-6rwfd"] Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.822753 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3eaea7a7-ad66-4cd2-9678-df63c825a501-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-g7jgg\" (UID: \"3eaea7a7-ad66-4cd2-9678-df63c825a501\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-g7jgg" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.822841 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f68d263f-a5cd-4f12-ba53-0179e79cff40-ovs-socket\") pod \"nmstate-handler-c8b48\" (UID: \"f68d263f-a5cd-4f12-ba53-0179e79cff40\") " pod="openshift-nmstate/nmstate-handler-c8b48" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.822863 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7ps5\" (UniqueName: \"kubernetes.io/projected/8394065e-bb9c-483d-b1bb-5f10bd07d0c4-kube-api-access-n7ps5\") pod \"nmstate-metrics-9b8c8685d-6rwfd\" (UID: \"8394065e-bb9c-483d-b1bb-5f10bd07d0c4\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-6rwfd" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.822881 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj4t5\" (UniqueName: \"kubernetes.io/projected/3eaea7a7-ad66-4cd2-9678-df63c825a501-kube-api-access-kj4t5\") pod \"nmstate-webhook-5f558f5558-g7jgg\" (UID: \"3eaea7a7-ad66-4cd2-9678-df63c825a501\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-g7jgg" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.822901 4918 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f68d263f-a5cd-4f12-ba53-0179e79cff40-dbus-socket\") pod \"nmstate-handler-c8b48\" (UID: \"f68d263f-a5cd-4f12-ba53-0179e79cff40\") " pod="openshift-nmstate/nmstate-handler-c8b48" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.822916 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f68d263f-a5cd-4f12-ba53-0179e79cff40-nmstate-lock\") pod \"nmstate-handler-c8b48\" (UID: \"f68d263f-a5cd-4f12-ba53-0179e79cff40\") " pod="openshift-nmstate/nmstate-handler-c8b48" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.822930 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjdvv\" (UniqueName: \"kubernetes.io/projected/f68d263f-a5cd-4f12-ba53-0179e79cff40-kube-api-access-vjdvv\") pod \"nmstate-handler-c8b48\" (UID: \"f68d263f-a5cd-4f12-ba53-0179e79cff40\") " pod="openshift-nmstate/nmstate-handler-c8b48" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.848465 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-jd2lp"] Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.849349 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-jd2lp" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.854463 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.859555 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.859664 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-4crfn" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.860084 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-jd2lp"] Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.924031 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6982f526-6a77-4d0c-91e5-cf2714c78706-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-jd2lp\" (UID: \"6982f526-6a77-4d0c-91e5-cf2714c78706\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-jd2lp" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.924099 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7ps5\" (UniqueName: \"kubernetes.io/projected/8394065e-bb9c-483d-b1bb-5f10bd07d0c4-kube-api-access-n7ps5\") pod \"nmstate-metrics-9b8c8685d-6rwfd\" (UID: \"8394065e-bb9c-483d-b1bb-5f10bd07d0c4\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-6rwfd" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.924128 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f68d263f-a5cd-4f12-ba53-0179e79cff40-ovs-socket\") pod \"nmstate-handler-c8b48\" (UID: \"f68d263f-a5cd-4f12-ba53-0179e79cff40\") " 
pod="openshift-nmstate/nmstate-handler-c8b48" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.924151 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj4t5\" (UniqueName: \"kubernetes.io/projected/3eaea7a7-ad66-4cd2-9678-df63c825a501-kube-api-access-kj4t5\") pod \"nmstate-webhook-5f558f5558-g7jgg\" (UID: \"3eaea7a7-ad66-4cd2-9678-df63c825a501\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-g7jgg" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.924206 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f68d263f-a5cd-4f12-ba53-0179e79cff40-dbus-socket\") pod \"nmstate-handler-c8b48\" (UID: \"f68d263f-a5cd-4f12-ba53-0179e79cff40\") " pod="openshift-nmstate/nmstate-handler-c8b48" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.924229 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f68d263f-a5cd-4f12-ba53-0179e79cff40-nmstate-lock\") pod \"nmstate-handler-c8b48\" (UID: \"f68d263f-a5cd-4f12-ba53-0179e79cff40\") " pod="openshift-nmstate/nmstate-handler-c8b48" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.924247 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjdvv\" (UniqueName: \"kubernetes.io/projected/f68d263f-a5cd-4f12-ba53-0179e79cff40-kube-api-access-vjdvv\") pod \"nmstate-handler-c8b48\" (UID: \"f68d263f-a5cd-4f12-ba53-0179e79cff40\") " pod="openshift-nmstate/nmstate-handler-c8b48" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.924261 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f68d263f-a5cd-4f12-ba53-0179e79cff40-ovs-socket\") pod \"nmstate-handler-c8b48\" (UID: \"f68d263f-a5cd-4f12-ba53-0179e79cff40\") " pod="openshift-nmstate/nmstate-handler-c8b48" Mar 19 
16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.924269 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6982f526-6a77-4d0c-91e5-cf2714c78706-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-jd2lp\" (UID: \"6982f526-6a77-4d0c-91e5-cf2714c78706\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-jd2lp" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.924311 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f68d263f-a5cd-4f12-ba53-0179e79cff40-nmstate-lock\") pod \"nmstate-handler-c8b48\" (UID: \"f68d263f-a5cd-4f12-ba53-0179e79cff40\") " pod="openshift-nmstate/nmstate-handler-c8b48" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.924313 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4pwj\" (UniqueName: \"kubernetes.io/projected/6982f526-6a77-4d0c-91e5-cf2714c78706-kube-api-access-w4pwj\") pod \"nmstate-console-plugin-86f58fcf4-jd2lp\" (UID: \"6982f526-6a77-4d0c-91e5-cf2714c78706\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-jd2lp" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.924480 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3eaea7a7-ad66-4cd2-9678-df63c825a501-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-g7jgg\" (UID: \"3eaea7a7-ad66-4cd2-9678-df63c825a501\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-g7jgg" Mar 19 16:54:01 crc kubenswrapper[4918]: E0319 16:54:01.924594 4918 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 19 16:54:01 crc kubenswrapper[4918]: E0319 16:54:01.924644 4918 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/3eaea7a7-ad66-4cd2-9678-df63c825a501-tls-key-pair podName:3eaea7a7-ad66-4cd2-9678-df63c825a501 nodeName:}" failed. No retries permitted until 2026-03-19 16:54:02.424624953 +0000 UTC m=+854.546824201 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/3eaea7a7-ad66-4cd2-9678-df63c825a501-tls-key-pair") pod "nmstate-webhook-5f558f5558-g7jgg" (UID: "3eaea7a7-ad66-4cd2-9678-df63c825a501") : secret "openshift-nmstate-webhook" not found Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.924659 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f68d263f-a5cd-4f12-ba53-0179e79cff40-dbus-socket\") pod \"nmstate-handler-c8b48\" (UID: \"f68d263f-a5cd-4f12-ba53-0179e79cff40\") " pod="openshift-nmstate/nmstate-handler-c8b48" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.945933 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjdvv\" (UniqueName: \"kubernetes.io/projected/f68d263f-a5cd-4f12-ba53-0179e79cff40-kube-api-access-vjdvv\") pod \"nmstate-handler-c8b48\" (UID: \"f68d263f-a5cd-4f12-ba53-0179e79cff40\") " pod="openshift-nmstate/nmstate-handler-c8b48" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.950657 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7ps5\" (UniqueName: \"kubernetes.io/projected/8394065e-bb9c-483d-b1bb-5f10bd07d0c4-kube-api-access-n7ps5\") pod \"nmstate-metrics-9b8c8685d-6rwfd\" (UID: \"8394065e-bb9c-483d-b1bb-5f10bd07d0c4\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-6rwfd" Mar 19 16:54:01 crc kubenswrapper[4918]: I0319 16:54:01.952319 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj4t5\" (UniqueName: \"kubernetes.io/projected/3eaea7a7-ad66-4cd2-9678-df63c825a501-kube-api-access-kj4t5\") pod 
\"nmstate-webhook-5f558f5558-g7jgg\" (UID: \"3eaea7a7-ad66-4cd2-9678-df63c825a501\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-g7jgg" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.009122 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-6rwfd" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.025908 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6982f526-6a77-4d0c-91e5-cf2714c78706-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-jd2lp\" (UID: \"6982f526-6a77-4d0c-91e5-cf2714c78706\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-jd2lp" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.026233 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6982f526-6a77-4d0c-91e5-cf2714c78706-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-jd2lp\" (UID: \"6982f526-6a77-4d0c-91e5-cf2714c78706\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-jd2lp" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.026250 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4pwj\" (UniqueName: \"kubernetes.io/projected/6982f526-6a77-4d0c-91e5-cf2714c78706-kube-api-access-w4pwj\") pod \"nmstate-console-plugin-86f58fcf4-jd2lp\" (UID: \"6982f526-6a77-4d0c-91e5-cf2714c78706\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-jd2lp" Mar 19 16:54:02 crc kubenswrapper[4918]: E0319 16:54:02.026693 4918 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 19 16:54:02 crc kubenswrapper[4918]: E0319 16:54:02.026780 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6982f526-6a77-4d0c-91e5-cf2714c78706-plugin-serving-cert 
podName:6982f526-6a77-4d0c-91e5-cf2714c78706 nodeName:}" failed. No retries permitted until 2026-03-19 16:54:02.526759329 +0000 UTC m=+854.648958577 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/6982f526-6a77-4d0c-91e5-cf2714c78706-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-jd2lp" (UID: "6982f526-6a77-4d0c-91e5-cf2714c78706") : secret "plugin-serving-cert" not found Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.027316 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6982f526-6a77-4d0c-91e5-cf2714c78706-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-jd2lp\" (UID: \"6982f526-6a77-4d0c-91e5-cf2714c78706\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-jd2lp" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.057838 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-c8b48" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.058465 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4pwj\" (UniqueName: \"kubernetes.io/projected/6982f526-6a77-4d0c-91e5-cf2714c78706-kube-api-access-w4pwj\") pod \"nmstate-console-plugin-86f58fcf4-jd2lp\" (UID: \"6982f526-6a77-4d0c-91e5-cf2714c78706\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-jd2lp" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.070223 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-557cfffb7b-snjf9"] Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.074926 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-557cfffb7b-snjf9" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.099387 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-557cfffb7b-snjf9"] Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.127231 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc8a485f-cd57-40e5-9b76-43573d35982c-trusted-ca-bundle\") pod \"console-557cfffb7b-snjf9\" (UID: \"dc8a485f-cd57-40e5-9b76-43573d35982c\") " pod="openshift-console/console-557cfffb7b-snjf9" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.127292 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc8a485f-cd57-40e5-9b76-43573d35982c-console-serving-cert\") pod \"console-557cfffb7b-snjf9\" (UID: \"dc8a485f-cd57-40e5-9b76-43573d35982c\") " pod="openshift-console/console-557cfffb7b-snjf9" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.127345 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dc8a485f-cd57-40e5-9b76-43573d35982c-service-ca\") pod \"console-557cfffb7b-snjf9\" (UID: \"dc8a485f-cd57-40e5-9b76-43573d35982c\") " pod="openshift-console/console-557cfffb7b-snjf9" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.127382 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dc8a485f-cd57-40e5-9b76-43573d35982c-console-oauth-config\") pod \"console-557cfffb7b-snjf9\" (UID: \"dc8a485f-cd57-40e5-9b76-43573d35982c\") " pod="openshift-console/console-557cfffb7b-snjf9" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.127412 4918 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r6v2\" (UniqueName: \"kubernetes.io/projected/dc8a485f-cd57-40e5-9b76-43573d35982c-kube-api-access-8r6v2\") pod \"console-557cfffb7b-snjf9\" (UID: \"dc8a485f-cd57-40e5-9b76-43573d35982c\") " pod="openshift-console/console-557cfffb7b-snjf9" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.127448 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dc8a485f-cd57-40e5-9b76-43573d35982c-oauth-serving-cert\") pod \"console-557cfffb7b-snjf9\" (UID: \"dc8a485f-cd57-40e5-9b76-43573d35982c\") " pod="openshift-console/console-557cfffb7b-snjf9" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.127469 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dc8a485f-cd57-40e5-9b76-43573d35982c-console-config\") pod \"console-557cfffb7b-snjf9\" (UID: \"dc8a485f-cd57-40e5-9b76-43573d35982c\") " pod="openshift-console/console-557cfffb7b-snjf9" Mar 19 16:54:02 crc kubenswrapper[4918]: W0319 16:54:02.127668 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf68d263f_a5cd_4f12_ba53_0179e79cff40.slice/crio-ec5aa2550eca5d3c4fcbe811ddb4a462f2798419eaee682e1b028643e89b0065 WatchSource:0}: Error finding container ec5aa2550eca5d3c4fcbe811ddb4a462f2798419eaee682e1b028643e89b0065: Status 404 returned error can't find the container with id ec5aa2550eca5d3c4fcbe811ddb4a462f2798419eaee682e1b028643e89b0065 Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.228427 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc8a485f-cd57-40e5-9b76-43573d35982c-trusted-ca-bundle\") pod \"console-557cfffb7b-snjf9\" (UID: 
\"dc8a485f-cd57-40e5-9b76-43573d35982c\") " pod="openshift-console/console-557cfffb7b-snjf9" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.228482 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc8a485f-cd57-40e5-9b76-43573d35982c-console-serving-cert\") pod \"console-557cfffb7b-snjf9\" (UID: \"dc8a485f-cd57-40e5-9b76-43573d35982c\") " pod="openshift-console/console-557cfffb7b-snjf9" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.228530 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dc8a485f-cd57-40e5-9b76-43573d35982c-service-ca\") pod \"console-557cfffb7b-snjf9\" (UID: \"dc8a485f-cd57-40e5-9b76-43573d35982c\") " pod="openshift-console/console-557cfffb7b-snjf9" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.228560 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dc8a485f-cd57-40e5-9b76-43573d35982c-console-oauth-config\") pod \"console-557cfffb7b-snjf9\" (UID: \"dc8a485f-cd57-40e5-9b76-43573d35982c\") " pod="openshift-console/console-557cfffb7b-snjf9" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.228582 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r6v2\" (UniqueName: \"kubernetes.io/projected/dc8a485f-cd57-40e5-9b76-43573d35982c-kube-api-access-8r6v2\") pod \"console-557cfffb7b-snjf9\" (UID: \"dc8a485f-cd57-40e5-9b76-43573d35982c\") " pod="openshift-console/console-557cfffb7b-snjf9" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.228612 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dc8a485f-cd57-40e5-9b76-43573d35982c-oauth-serving-cert\") pod \"console-557cfffb7b-snjf9\" (UID: 
\"dc8a485f-cd57-40e5-9b76-43573d35982c\") " pod="openshift-console/console-557cfffb7b-snjf9" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.228631 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dc8a485f-cd57-40e5-9b76-43573d35982c-console-config\") pod \"console-557cfffb7b-snjf9\" (UID: \"dc8a485f-cd57-40e5-9b76-43573d35982c\") " pod="openshift-console/console-557cfffb7b-snjf9" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.229489 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dc8a485f-cd57-40e5-9b76-43573d35982c-console-config\") pod \"console-557cfffb7b-snjf9\" (UID: \"dc8a485f-cd57-40e5-9b76-43573d35982c\") " pod="openshift-console/console-557cfffb7b-snjf9" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.229780 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dc8a485f-cd57-40e5-9b76-43573d35982c-service-ca\") pod \"console-557cfffb7b-snjf9\" (UID: \"dc8a485f-cd57-40e5-9b76-43573d35982c\") " pod="openshift-console/console-557cfffb7b-snjf9" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.229935 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc8a485f-cd57-40e5-9b76-43573d35982c-trusted-ca-bundle\") pod \"console-557cfffb7b-snjf9\" (UID: \"dc8a485f-cd57-40e5-9b76-43573d35982c\") " pod="openshift-console/console-557cfffb7b-snjf9" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.230017 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dc8a485f-cd57-40e5-9b76-43573d35982c-oauth-serving-cert\") pod \"console-557cfffb7b-snjf9\" (UID: \"dc8a485f-cd57-40e5-9b76-43573d35982c\") " 
pod="openshift-console/console-557cfffb7b-snjf9" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.255118 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dc8a485f-cd57-40e5-9b76-43573d35982c-console-oauth-config\") pod \"console-557cfffb7b-snjf9\" (UID: \"dc8a485f-cd57-40e5-9b76-43573d35982c\") " pod="openshift-console/console-557cfffb7b-snjf9" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.258900 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc8a485f-cd57-40e5-9b76-43573d35982c-console-serving-cert\") pod \"console-557cfffb7b-snjf9\" (UID: \"dc8a485f-cd57-40e5-9b76-43573d35982c\") " pod="openshift-console/console-557cfffb7b-snjf9" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.279642 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r6v2\" (UniqueName: \"kubernetes.io/projected/dc8a485f-cd57-40e5-9b76-43573d35982c-kube-api-access-8r6v2\") pod \"console-557cfffb7b-snjf9\" (UID: \"dc8a485f-cd57-40e5-9b76-43573d35982c\") " pod="openshift-console/console-557cfffb7b-snjf9" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.379002 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-6rwfd"] Mar 19 16:54:02 crc kubenswrapper[4918]: W0319 16:54:02.386377 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8394065e_bb9c_483d_b1bb_5f10bd07d0c4.slice/crio-e6be55e5aab15855325f03dacc4212691632ea7c73884e93942453f43e9351fb WatchSource:0}: Error finding container e6be55e5aab15855325f03dacc4212691632ea7c73884e93942453f43e9351fb: Status 404 returned error can't find the container with id e6be55e5aab15855325f03dacc4212691632ea7c73884e93942453f43e9351fb Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 
16:54:02.411305 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-557cfffb7b-snjf9" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.431704 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3eaea7a7-ad66-4cd2-9678-df63c825a501-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-g7jgg\" (UID: \"3eaea7a7-ad66-4cd2-9678-df63c825a501\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-g7jgg" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.435712 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3eaea7a7-ad66-4cd2-9678-df63c825a501-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-g7jgg\" (UID: \"3eaea7a7-ad66-4cd2-9678-df63c825a501\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-g7jgg" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.533438 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6982f526-6a77-4d0c-91e5-cf2714c78706-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-jd2lp\" (UID: \"6982f526-6a77-4d0c-91e5-cf2714c78706\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-jd2lp" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.539292 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6982f526-6a77-4d0c-91e5-cf2714c78706-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-jd2lp\" (UID: \"6982f526-6a77-4d0c-91e5-cf2714c78706\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-jd2lp" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.604283 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-557cfffb7b-snjf9"] Mar 19 16:54:02 crc kubenswrapper[4918]: W0319 16:54:02.617571 
4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc8a485f_cd57_40e5_9b76_43573d35982c.slice/crio-86c84b19fa84d892527c24526183f4234aab8fc66a2cf796bd578bbc44fd96bb WatchSource:0}: Error finding container 86c84b19fa84d892527c24526183f4234aab8fc66a2cf796bd578bbc44fd96bb: Status 404 returned error can't find the container with id 86c84b19fa84d892527c24526183f4234aab8fc66a2cf796bd578bbc44fd96bb Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.620032 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-g7jgg" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.767035 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-jd2lp" Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.952847 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-6rwfd" event={"ID":"8394065e-bb9c-483d-b1bb-5f10bd07d0c4","Type":"ContainerStarted","Data":"e6be55e5aab15855325f03dacc4212691632ea7c73884e93942453f43e9351fb"} Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.954589 4918 generic.go:334] "Generic (PLEG): container finished" podID="7d407a1d-5f74-4628-bf4e-47e9fad34bb5" containerID="0f00a8751911b4a10f3eb4db9177ebe9d088cb9b35e95d9a46190cac53dfe477" exitCode=0 Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.954656 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565654-k4nkc" event={"ID":"7d407a1d-5f74-4628-bf4e-47e9fad34bb5","Type":"ContainerDied","Data":"0f00a8751911b4a10f3eb4db9177ebe9d088cb9b35e95d9a46190cac53dfe477"} Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.957646 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-c8b48" 
event={"ID":"f68d263f-a5cd-4f12-ba53-0179e79cff40","Type":"ContainerStarted","Data":"ec5aa2550eca5d3c4fcbe811ddb4a462f2798419eaee682e1b028643e89b0065"} Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.963150 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-557cfffb7b-snjf9" event={"ID":"dc8a485f-cd57-40e5-9b76-43573d35982c","Type":"ContainerStarted","Data":"7b389b3edb9d7f3b8a6b62935db620c723c04f9eb14de5c1cf0531423bab93ab"} Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.963185 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-557cfffb7b-snjf9" event={"ID":"dc8a485f-cd57-40e5-9b76-43573d35982c","Type":"ContainerStarted","Data":"86c84b19fa84d892527c24526183f4234aab8fc66a2cf796bd578bbc44fd96bb"} Mar 19 16:54:02 crc kubenswrapper[4918]: I0319 16:54:02.988125 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-557cfffb7b-snjf9" podStartSLOduration=0.988103008 podStartE2EDuration="988.103008ms" podCreationTimestamp="2026-03-19 16:54:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:54:02.985717043 +0000 UTC m=+855.107916301" watchObservedRunningTime="2026-03-19 16:54:02.988103008 +0000 UTC m=+855.110302256" Mar 19 16:54:03 crc kubenswrapper[4918]: I0319 16:54:03.019339 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-jd2lp"] Mar 19 16:54:03 crc kubenswrapper[4918]: I0319 16:54:03.044312 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-g7jgg"] Mar 19 16:54:03 crc kubenswrapper[4918]: W0319 16:54:03.052004 4918 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3eaea7a7_ad66_4cd2_9678_df63c825a501.slice/crio-9a3001544d897da87033948e5ffcd7a4589d7dd2704bc2b957f08df4be335368 WatchSource:0}: Error finding container 9a3001544d897da87033948e5ffcd7a4589d7dd2704bc2b957f08df4be335368: Status 404 returned error can't find the container with id 9a3001544d897da87033948e5ffcd7a4589d7dd2704bc2b957f08df4be335368 Mar 19 16:54:03 crc kubenswrapper[4918]: I0319 16:54:03.972825 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-g7jgg" event={"ID":"3eaea7a7-ad66-4cd2-9678-df63c825a501","Type":"ContainerStarted","Data":"9a3001544d897da87033948e5ffcd7a4589d7dd2704bc2b957f08df4be335368"} Mar 19 16:54:03 crc kubenswrapper[4918]: I0319 16:54:03.975349 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-jd2lp" event={"ID":"6982f526-6a77-4d0c-91e5-cf2714c78706","Type":"ContainerStarted","Data":"990d24917ce54b6fead898572ba7f5991a68deab91da8358a39fc9877d3754be"} Mar 19 16:54:04 crc kubenswrapper[4918]: I0319 16:54:04.218757 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565654-k4nkc" Mar 19 16:54:04 crc kubenswrapper[4918]: I0319 16:54:04.359120 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55tmr\" (UniqueName: \"kubernetes.io/projected/7d407a1d-5f74-4628-bf4e-47e9fad34bb5-kube-api-access-55tmr\") pod \"7d407a1d-5f74-4628-bf4e-47e9fad34bb5\" (UID: \"7d407a1d-5f74-4628-bf4e-47e9fad34bb5\") " Mar 19 16:54:04 crc kubenswrapper[4918]: I0319 16:54:04.370718 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d407a1d-5f74-4628-bf4e-47e9fad34bb5-kube-api-access-55tmr" (OuterVolumeSpecName: "kube-api-access-55tmr") pod "7d407a1d-5f74-4628-bf4e-47e9fad34bb5" (UID: "7d407a1d-5f74-4628-bf4e-47e9fad34bb5"). InnerVolumeSpecName "kube-api-access-55tmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:54:04 crc kubenswrapper[4918]: I0319 16:54:04.460757 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55tmr\" (UniqueName: \"kubernetes.io/projected/7d407a1d-5f74-4628-bf4e-47e9fad34bb5-kube-api-access-55tmr\") on node \"crc\" DevicePath \"\"" Mar 19 16:54:04 crc kubenswrapper[4918]: I0319 16:54:04.983703 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565654-k4nkc" event={"ID":"7d407a1d-5f74-4628-bf4e-47e9fad34bb5","Type":"ContainerDied","Data":"d79bf83ad3d10909a84a835bc84c4ef2995da8634fae0bb01411cdcebc79cfa4"} Mar 19 16:54:04 crc kubenswrapper[4918]: I0319 16:54:04.983752 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d79bf83ad3d10909a84a835bc84c4ef2995da8634fae0bb01411cdcebc79cfa4" Mar 19 16:54:04 crc kubenswrapper[4918]: I0319 16:54:04.983798 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565654-k4nkc" Mar 19 16:54:05 crc kubenswrapper[4918]: I0319 16:54:05.270472 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565648-6jrtk"] Mar 19 16:54:05 crc kubenswrapper[4918]: I0319 16:54:05.276006 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565648-6jrtk"] Mar 19 16:54:06 crc kubenswrapper[4918]: I0319 16:54:06.004684 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-6rwfd" event={"ID":"8394065e-bb9c-483d-b1bb-5f10bd07d0c4","Type":"ContainerStarted","Data":"cdf765f6a62e7b68af125643be05420158507e29c5d592251e92c1ce5fc8d613"} Mar 19 16:54:06 crc kubenswrapper[4918]: I0319 16:54:06.007924 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-g7jgg" event={"ID":"3eaea7a7-ad66-4cd2-9678-df63c825a501","Type":"ContainerStarted","Data":"f3fd40f150091f84e12820575d1db9f1faf339e2749872f7d3ae7c226e2f13f5"} Mar 19 16:54:06 crc kubenswrapper[4918]: I0319 16:54:06.008116 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-g7jgg" Mar 19 16:54:06 crc kubenswrapper[4918]: I0319 16:54:06.009967 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-c8b48" event={"ID":"f68d263f-a5cd-4f12-ba53-0179e79cff40","Type":"ContainerStarted","Data":"400e226e224440df5c5fab96c53efae2788da8f9917e753cb3b0ba31f31ac6af"} Mar 19 16:54:06 crc kubenswrapper[4918]: I0319 16:54:06.010307 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-c8b48" Mar 19 16:54:06 crc kubenswrapper[4918]: I0319 16:54:06.012287 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-jd2lp" 
event={"ID":"6982f526-6a77-4d0c-91e5-cf2714c78706","Type":"ContainerStarted","Data":"3c3d8b7594af9b0c8a30366fbcf93b2a122486350034309688437e4a1009fddd"} Mar 19 16:54:06 crc kubenswrapper[4918]: I0319 16:54:06.031736 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-g7jgg" podStartSLOduration=2.591860429 podStartE2EDuration="5.031716377s" podCreationTimestamp="2026-03-19 16:54:01 +0000 UTC" firstStartedPulling="2026-03-19 16:54:03.053864419 +0000 UTC m=+855.176063667" lastFinishedPulling="2026-03-19 16:54:05.493720357 +0000 UTC m=+857.615919615" observedRunningTime="2026-03-19 16:54:06.026940565 +0000 UTC m=+858.149139833" watchObservedRunningTime="2026-03-19 16:54:06.031716377 +0000 UTC m=+858.153915625" Mar 19 16:54:06 crc kubenswrapper[4918]: I0319 16:54:06.063989 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-c8b48" podStartSLOduration=1.704107113 podStartE2EDuration="5.06397003s" podCreationTimestamp="2026-03-19 16:54:01 +0000 UTC" firstStartedPulling="2026-03-19 16:54:02.132258557 +0000 UTC m=+854.254457805" lastFinishedPulling="2026-03-19 16:54:05.492121454 +0000 UTC m=+857.614320722" observedRunningTime="2026-03-19 16:54:06.051868828 +0000 UTC m=+858.174068086" watchObservedRunningTime="2026-03-19 16:54:06.06397003 +0000 UTC m=+858.186169278" Mar 19 16:54:06 crc kubenswrapper[4918]: I0319 16:54:06.065928 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-jd2lp" podStartSLOduration=2.600646018 podStartE2EDuration="5.065915742s" podCreationTimestamp="2026-03-19 16:54:01 +0000 UTC" firstStartedPulling="2026-03-19 16:54:03.027020774 +0000 UTC m=+855.149220022" lastFinishedPulling="2026-03-19 16:54:05.492290478 +0000 UTC m=+857.614489746" observedRunningTime="2026-03-19 16:54:06.063829876 +0000 UTC m=+858.186029124" watchObservedRunningTime="2026-03-19 
16:54:06.065915742 +0000 UTC m=+858.188114990" Mar 19 16:54:06 crc kubenswrapper[4918]: I0319 16:54:06.598090 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcff5a82-0812-414f-8918-7db313699c5e" path="/var/lib/kubelet/pods/dcff5a82-0812-414f-8918-7db313699c5e/volumes" Mar 19 16:54:09 crc kubenswrapper[4918]: I0319 16:54:09.036605 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-6rwfd" event={"ID":"8394065e-bb9c-483d-b1bb-5f10bd07d0c4","Type":"ContainerStarted","Data":"40a74e409646e3a1fd9f5764dc8e8b82ec4e98636d4b62fb793a5fdd53a6bf05"} Mar 19 16:54:09 crc kubenswrapper[4918]: I0319 16:54:09.067003 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-6rwfd" podStartSLOduration=2.356598148 podStartE2EDuration="8.066972866s" podCreationTimestamp="2026-03-19 16:54:01 +0000 UTC" firstStartedPulling="2026-03-19 16:54:02.389660575 +0000 UTC m=+854.511859823" lastFinishedPulling="2026-03-19 16:54:08.100035293 +0000 UTC m=+860.222234541" observedRunningTime="2026-03-19 16:54:09.060803707 +0000 UTC m=+861.183003005" watchObservedRunningTime="2026-03-19 16:54:09.066972866 +0000 UTC m=+861.189172144" Mar 19 16:54:12 crc kubenswrapper[4918]: I0319 16:54:12.087727 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-c8b48" Mar 19 16:54:12 crc kubenswrapper[4918]: I0319 16:54:12.412236 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-557cfffb7b-snjf9" Mar 19 16:54:12 crc kubenswrapper[4918]: I0319 16:54:12.412326 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-557cfffb7b-snjf9" Mar 19 16:54:12 crc kubenswrapper[4918]: I0319 16:54:12.419575 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-557cfffb7b-snjf9" Mar 
19 16:54:13 crc kubenswrapper[4918]: I0319 16:54:13.075314 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-557cfffb7b-snjf9" Mar 19 16:54:13 crc kubenswrapper[4918]: I0319 16:54:13.146299 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-td7k5"] Mar 19 16:54:22 crc kubenswrapper[4918]: I0319 16:54:22.629356 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-g7jgg" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.204070 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-td7k5" podUID="5189c318-e4b1-4dd9-9a6d-284425d319cf" containerName="console" containerID="cri-o://83292303409edfc96f98ff86345fedf468cab0b5a7f15bd0c77162802186718e" gracePeriod=15 Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.613737 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-td7k5_5189c318-e4b1-4dd9-9a6d-284425d319cf/console/0.log" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.613957 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-td7k5" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.680386 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5189c318-e4b1-4dd9-9a6d-284425d319cf-console-config\") pod \"5189c318-e4b1-4dd9-9a6d-284425d319cf\" (UID: \"5189c318-e4b1-4dd9-9a6d-284425d319cf\") " Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.680504 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5189c318-e4b1-4dd9-9a6d-284425d319cf-oauth-serving-cert\") pod \"5189c318-e4b1-4dd9-9a6d-284425d319cf\" (UID: \"5189c318-e4b1-4dd9-9a6d-284425d319cf\") " Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.680566 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5189c318-e4b1-4dd9-9a6d-284425d319cf-console-serving-cert\") pod \"5189c318-e4b1-4dd9-9a6d-284425d319cf\" (UID: \"5189c318-e4b1-4dd9-9a6d-284425d319cf\") " Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.680594 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5189c318-e4b1-4dd9-9a6d-284425d319cf-console-oauth-config\") pod \"5189c318-e4b1-4dd9-9a6d-284425d319cf\" (UID: \"5189c318-e4b1-4dd9-9a6d-284425d319cf\") " Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.680636 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5189c318-e4b1-4dd9-9a6d-284425d319cf-trusted-ca-bundle\") pod \"5189c318-e4b1-4dd9-9a6d-284425d319cf\" (UID: \"5189c318-e4b1-4dd9-9a6d-284425d319cf\") " Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.680661 4918 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-q2fpr\" (UniqueName: \"kubernetes.io/projected/5189c318-e4b1-4dd9-9a6d-284425d319cf-kube-api-access-q2fpr\") pod \"5189c318-e4b1-4dd9-9a6d-284425d319cf\" (UID: \"5189c318-e4b1-4dd9-9a6d-284425d319cf\") " Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.680763 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5189c318-e4b1-4dd9-9a6d-284425d319cf-service-ca\") pod \"5189c318-e4b1-4dd9-9a6d-284425d319cf\" (UID: \"5189c318-e4b1-4dd9-9a6d-284425d319cf\") " Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.681299 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5189c318-e4b1-4dd9-9a6d-284425d319cf-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5189c318-e4b1-4dd9-9a6d-284425d319cf" (UID: "5189c318-e4b1-4dd9-9a6d-284425d319cf"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.681335 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5189c318-e4b1-4dd9-9a6d-284425d319cf-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5189c318-e4b1-4dd9-9a6d-284425d319cf" (UID: "5189c318-e4b1-4dd9-9a6d-284425d319cf"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.681648 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5189c318-e4b1-4dd9-9a6d-284425d319cf-console-config" (OuterVolumeSpecName: "console-config") pod "5189c318-e4b1-4dd9-9a6d-284425d319cf" (UID: "5189c318-e4b1-4dd9-9a6d-284425d319cf"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.681670 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5189c318-e4b1-4dd9-9a6d-284425d319cf-service-ca" (OuterVolumeSpecName: "service-ca") pod "5189c318-e4b1-4dd9-9a6d-284425d319cf" (UID: "5189c318-e4b1-4dd9-9a6d-284425d319cf"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.682380 4918 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5189c318-e4b1-4dd9-9a6d-284425d319cf-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.682399 4918 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5189c318-e4b1-4dd9-9a6d-284425d319cf-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.682408 4918 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5189c318-e4b1-4dd9-9a6d-284425d319cf-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.682416 4918 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5189c318-e4b1-4dd9-9a6d-284425d319cf-console-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.687093 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5189c318-e4b1-4dd9-9a6d-284425d319cf-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5189c318-e4b1-4dd9-9a6d-284425d319cf" (UID: "5189c318-e4b1-4dd9-9a6d-284425d319cf"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.687398 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5189c318-e4b1-4dd9-9a6d-284425d319cf-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5189c318-e4b1-4dd9-9a6d-284425d319cf" (UID: "5189c318-e4b1-4dd9-9a6d-284425d319cf"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.691645 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5189c318-e4b1-4dd9-9a6d-284425d319cf-kube-api-access-q2fpr" (OuterVolumeSpecName: "kube-api-access-q2fpr") pod "5189c318-e4b1-4dd9-9a6d-284425d319cf" (UID: "5189c318-e4b1-4dd9-9a6d-284425d319cf"). InnerVolumeSpecName "kube-api-access-q2fpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.783490 4918 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5189c318-e4b1-4dd9-9a6d-284425d319cf-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.783842 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2fpr\" (UniqueName: \"kubernetes.io/projected/5189c318-e4b1-4dd9-9a6d-284425d319cf-kube-api-access-q2fpr\") on node \"crc\" DevicePath \"\"" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.783854 4918 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5189c318-e4b1-4dd9-9a6d-284425d319cf-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.834312 4918 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n"] Mar 19 16:54:38 crc kubenswrapper[4918]: E0319 16:54:38.834593 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5189c318-e4b1-4dd9-9a6d-284425d319cf" containerName="console" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.834611 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="5189c318-e4b1-4dd9-9a6d-284425d319cf" containerName="console" Mar 19 16:54:38 crc kubenswrapper[4918]: E0319 16:54:38.834625 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d407a1d-5f74-4628-bf4e-47e9fad34bb5" containerName="oc" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.834631 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d407a1d-5f74-4628-bf4e-47e9fad34bb5" containerName="oc" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.834737 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d407a1d-5f74-4628-bf4e-47e9fad34bb5" containerName="oc" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.834751 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="5189c318-e4b1-4dd9-9a6d-284425d319cf" containerName="console" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.835578 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.840534 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.849322 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n"] Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.885461 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5xqd\" (UniqueName: \"kubernetes.io/projected/155d539d-8319-43c6-9652-1af4e68bfe13-kube-api-access-m5xqd\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n\" (UID: \"155d539d-8319-43c6-9652-1af4e68bfe13\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.885509 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/155d539d-8319-43c6-9652-1af4e68bfe13-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n\" (UID: \"155d539d-8319-43c6-9652-1af4e68bfe13\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.885557 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/155d539d-8319-43c6-9652-1af4e68bfe13-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n\" (UID: \"155d539d-8319-43c6-9652-1af4e68bfe13\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n" Mar 19 16:54:38 crc kubenswrapper[4918]: 
I0319 16:54:38.987252 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5xqd\" (UniqueName: \"kubernetes.io/projected/155d539d-8319-43c6-9652-1af4e68bfe13-kube-api-access-m5xqd\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n\" (UID: \"155d539d-8319-43c6-9652-1af4e68bfe13\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.987328 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/155d539d-8319-43c6-9652-1af4e68bfe13-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n\" (UID: \"155d539d-8319-43c6-9652-1af4e68bfe13\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.987372 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/155d539d-8319-43c6-9652-1af4e68bfe13-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n\" (UID: \"155d539d-8319-43c6-9652-1af4e68bfe13\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.987794 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/155d539d-8319-43c6-9652-1af4e68bfe13-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n\" (UID: \"155d539d-8319-43c6-9652-1af4e68bfe13\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n" Mar 19 16:54:38 crc kubenswrapper[4918]: I0319 16:54:38.987955 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/155d539d-8319-43c6-9652-1af4e68bfe13-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n\" (UID: \"155d539d-8319-43c6-9652-1af4e68bfe13\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n" Mar 19 16:54:39 crc kubenswrapper[4918]: I0319 16:54:39.003965 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5xqd\" (UniqueName: \"kubernetes.io/projected/155d539d-8319-43c6-9652-1af4e68bfe13-kube-api-access-m5xqd\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n\" (UID: \"155d539d-8319-43c6-9652-1af4e68bfe13\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n" Mar 19 16:54:39 crc kubenswrapper[4918]: I0319 16:54:39.150641 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n" Mar 19 16:54:39 crc kubenswrapper[4918]: I0319 16:54:39.265402 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-td7k5_5189c318-e4b1-4dd9-9a6d-284425d319cf/console/0.log" Mar 19 16:54:39 crc kubenswrapper[4918]: I0319 16:54:39.265461 4918 generic.go:334] "Generic (PLEG): container finished" podID="5189c318-e4b1-4dd9-9a6d-284425d319cf" containerID="83292303409edfc96f98ff86345fedf468cab0b5a7f15bd0c77162802186718e" exitCode=2 Mar 19 16:54:39 crc kubenswrapper[4918]: I0319 16:54:39.265498 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-td7k5" event={"ID":"5189c318-e4b1-4dd9-9a6d-284425d319cf","Type":"ContainerDied","Data":"83292303409edfc96f98ff86345fedf468cab0b5a7f15bd0c77162802186718e"} Mar 19 16:54:39 crc kubenswrapper[4918]: I0319 16:54:39.265547 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-td7k5" 
event={"ID":"5189c318-e4b1-4dd9-9a6d-284425d319cf","Type":"ContainerDied","Data":"2bf9d91ff354780f10ca10b51e4c46dd4d967e6d76a889c72819f102e01d42da"} Mar 19 16:54:39 crc kubenswrapper[4918]: I0319 16:54:39.265569 4918 scope.go:117] "RemoveContainer" containerID="83292303409edfc96f98ff86345fedf468cab0b5a7f15bd0c77162802186718e" Mar 19 16:54:39 crc kubenswrapper[4918]: I0319 16:54:39.265589 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-td7k5" Mar 19 16:54:39 crc kubenswrapper[4918]: I0319 16:54:39.295662 4918 scope.go:117] "RemoveContainer" containerID="83292303409edfc96f98ff86345fedf468cab0b5a7f15bd0c77162802186718e" Mar 19 16:54:39 crc kubenswrapper[4918]: E0319 16:54:39.296901 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83292303409edfc96f98ff86345fedf468cab0b5a7f15bd0c77162802186718e\": container with ID starting with 83292303409edfc96f98ff86345fedf468cab0b5a7f15bd0c77162802186718e not found: ID does not exist" containerID="83292303409edfc96f98ff86345fedf468cab0b5a7f15bd0c77162802186718e" Mar 19 16:54:39 crc kubenswrapper[4918]: I0319 16:54:39.296942 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83292303409edfc96f98ff86345fedf468cab0b5a7f15bd0c77162802186718e"} err="failed to get container status \"83292303409edfc96f98ff86345fedf468cab0b5a7f15bd0c77162802186718e\": rpc error: code = NotFound desc = could not find container \"83292303409edfc96f98ff86345fedf468cab0b5a7f15bd0c77162802186718e\": container with ID starting with 83292303409edfc96f98ff86345fedf468cab0b5a7f15bd0c77162802186718e not found: ID does not exist" Mar 19 16:54:39 crc kubenswrapper[4918]: I0319 16:54:39.305675 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-td7k5"] Mar 19 16:54:39 crc kubenswrapper[4918]: I0319 16:54:39.313723 4918 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-td7k5"] Mar 19 16:54:39 crc kubenswrapper[4918]: I0319 16:54:39.625760 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n"] Mar 19 16:54:40 crc kubenswrapper[4918]: I0319 16:54:40.274715 4918 generic.go:334] "Generic (PLEG): container finished" podID="155d539d-8319-43c6-9652-1af4e68bfe13" containerID="5901083fd3765116c0f9173ccfa4811baa187cfe8f3801393deb86853df3140c" exitCode=0 Mar 19 16:54:40 crc kubenswrapper[4918]: I0319 16:54:40.274803 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n" event={"ID":"155d539d-8319-43c6-9652-1af4e68bfe13","Type":"ContainerDied","Data":"5901083fd3765116c0f9173ccfa4811baa187cfe8f3801393deb86853df3140c"} Mar 19 16:54:40 crc kubenswrapper[4918]: I0319 16:54:40.274843 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n" event={"ID":"155d539d-8319-43c6-9652-1af4e68bfe13","Type":"ContainerStarted","Data":"aef33bba21652387e216f74f6073a9df3163552c18ae1969afb7a64932c9c446"} Mar 19 16:54:40 crc kubenswrapper[4918]: I0319 16:54:40.598579 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5189c318-e4b1-4dd9-9a6d-284425d319cf" path="/var/lib/kubelet/pods/5189c318-e4b1-4dd9-9a6d-284425d319cf/volumes" Mar 19 16:54:42 crc kubenswrapper[4918]: I0319 16:54:42.296755 4918 generic.go:334] "Generic (PLEG): container finished" podID="155d539d-8319-43c6-9652-1af4e68bfe13" containerID="38282cb1e15838a66ec1e8f9a66c7a41d2b5badb9755345ac6f083dd52a00d79" exitCode=0 Mar 19 16:54:42 crc kubenswrapper[4918]: I0319 16:54:42.296866 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n" event={"ID":"155d539d-8319-43c6-9652-1af4e68bfe13","Type":"ContainerDied","Data":"38282cb1e15838a66ec1e8f9a66c7a41d2b5badb9755345ac6f083dd52a00d79"} Mar 19 16:54:43 crc kubenswrapper[4918]: I0319 16:54:43.308426 4918 generic.go:334] "Generic (PLEG): container finished" podID="155d539d-8319-43c6-9652-1af4e68bfe13" containerID="6e8344faa431e399ec331576acc18cbfcc58faa41d149ad5d792b6f5498e4946" exitCode=0 Mar 19 16:54:43 crc kubenswrapper[4918]: I0319 16:54:43.308482 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n" event={"ID":"155d539d-8319-43c6-9652-1af4e68bfe13","Type":"ContainerDied","Data":"6e8344faa431e399ec331576acc18cbfcc58faa41d149ad5d792b6f5498e4946"} Mar 19 16:54:44 crc kubenswrapper[4918]: I0319 16:54:44.666840 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n" Mar 19 16:54:44 crc kubenswrapper[4918]: I0319 16:54:44.766308 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5xqd\" (UniqueName: \"kubernetes.io/projected/155d539d-8319-43c6-9652-1af4e68bfe13-kube-api-access-m5xqd\") pod \"155d539d-8319-43c6-9652-1af4e68bfe13\" (UID: \"155d539d-8319-43c6-9652-1af4e68bfe13\") " Mar 19 16:54:44 crc kubenswrapper[4918]: I0319 16:54:44.766418 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/155d539d-8319-43c6-9652-1af4e68bfe13-util\") pod \"155d539d-8319-43c6-9652-1af4e68bfe13\" (UID: \"155d539d-8319-43c6-9652-1af4e68bfe13\") " Mar 19 16:54:44 crc kubenswrapper[4918]: I0319 16:54:44.766466 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/155d539d-8319-43c6-9652-1af4e68bfe13-bundle\") pod \"155d539d-8319-43c6-9652-1af4e68bfe13\" (UID: \"155d539d-8319-43c6-9652-1af4e68bfe13\") " Mar 19 16:54:44 crc kubenswrapper[4918]: I0319 16:54:44.768285 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/155d539d-8319-43c6-9652-1af4e68bfe13-bundle" (OuterVolumeSpecName: "bundle") pod "155d539d-8319-43c6-9652-1af4e68bfe13" (UID: "155d539d-8319-43c6-9652-1af4e68bfe13"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:54:44 crc kubenswrapper[4918]: I0319 16:54:44.773676 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/155d539d-8319-43c6-9652-1af4e68bfe13-kube-api-access-m5xqd" (OuterVolumeSpecName: "kube-api-access-m5xqd") pod "155d539d-8319-43c6-9652-1af4e68bfe13" (UID: "155d539d-8319-43c6-9652-1af4e68bfe13"). InnerVolumeSpecName "kube-api-access-m5xqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:54:44 crc kubenswrapper[4918]: I0319 16:54:44.791989 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/155d539d-8319-43c6-9652-1af4e68bfe13-util" (OuterVolumeSpecName: "util") pod "155d539d-8319-43c6-9652-1af4e68bfe13" (UID: "155d539d-8319-43c6-9652-1af4e68bfe13"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:54:44 crc kubenswrapper[4918]: I0319 16:54:44.868765 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5xqd\" (UniqueName: \"kubernetes.io/projected/155d539d-8319-43c6-9652-1af4e68bfe13-kube-api-access-m5xqd\") on node \"crc\" DevicePath \"\"" Mar 19 16:54:44 crc kubenswrapper[4918]: I0319 16:54:44.869016 4918 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/155d539d-8319-43c6-9652-1af4e68bfe13-util\") on node \"crc\" DevicePath \"\"" Mar 19 16:54:44 crc kubenswrapper[4918]: I0319 16:54:44.869112 4918 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/155d539d-8319-43c6-9652-1af4e68bfe13-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 16:54:45 crc kubenswrapper[4918]: I0319 16:54:45.325661 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n" event={"ID":"155d539d-8319-43c6-9652-1af4e68bfe13","Type":"ContainerDied","Data":"aef33bba21652387e216f74f6073a9df3163552c18ae1969afb7a64932c9c446"} Mar 19 16:54:45 crc kubenswrapper[4918]: I0319 16:54:45.325702 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aef33bba21652387e216f74f6073a9df3163552c18ae1969afb7a64932c9c446" Mar 19 16:54:45 crc kubenswrapper[4918]: I0319 16:54:45.325757 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n" Mar 19 16:54:50 crc kubenswrapper[4918]: I0319 16:54:50.286109 4918 scope.go:117] "RemoveContainer" containerID="e7e5d8d2f12c8db1eb7a2642acd884b688a04a15efb6c098866fb480ba2c15cd" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.243324 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-67cf8697d8-q2jfk"] Mar 19 16:54:55 crc kubenswrapper[4918]: E0319 16:54:55.244229 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="155d539d-8319-43c6-9652-1af4e68bfe13" containerName="util" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.244247 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="155d539d-8319-43c6-9652-1af4e68bfe13" containerName="util" Mar 19 16:54:55 crc kubenswrapper[4918]: E0319 16:54:55.244259 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="155d539d-8319-43c6-9652-1af4e68bfe13" containerName="extract" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.244267 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="155d539d-8319-43c6-9652-1af4e68bfe13" containerName="extract" Mar 19 16:54:55 crc kubenswrapper[4918]: E0319 16:54:55.244280 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="155d539d-8319-43c6-9652-1af4e68bfe13" containerName="pull" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.244288 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="155d539d-8319-43c6-9652-1af4e68bfe13" containerName="pull" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.244416 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="155d539d-8319-43c6-9652-1af4e68bfe13" containerName="extract" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.244946 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-67cf8697d8-q2jfk" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.246770 4918 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.247567 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.247629 4918 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-78zxx" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.247813 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.248636 4918 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.257197 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-67cf8697d8-q2jfk"] Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.304757 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x47xr\" (UniqueName: \"kubernetes.io/projected/9fce7235-ee0c-4b6f-a51d-418382810cb2-kube-api-access-x47xr\") pod \"metallb-operator-controller-manager-67cf8697d8-q2jfk\" (UID: \"9fce7235-ee0c-4b6f-a51d-418382810cb2\") " pod="metallb-system/metallb-operator-controller-manager-67cf8697d8-q2jfk" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.304850 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9fce7235-ee0c-4b6f-a51d-418382810cb2-apiservice-cert\") pod 
\"metallb-operator-controller-manager-67cf8697d8-q2jfk\" (UID: \"9fce7235-ee0c-4b6f-a51d-418382810cb2\") " pod="metallb-system/metallb-operator-controller-manager-67cf8697d8-q2jfk" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.304875 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9fce7235-ee0c-4b6f-a51d-418382810cb2-webhook-cert\") pod \"metallb-operator-controller-manager-67cf8697d8-q2jfk\" (UID: \"9fce7235-ee0c-4b6f-a51d-418382810cb2\") " pod="metallb-system/metallb-operator-controller-manager-67cf8697d8-q2jfk" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.406248 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9fce7235-ee0c-4b6f-a51d-418382810cb2-apiservice-cert\") pod \"metallb-operator-controller-manager-67cf8697d8-q2jfk\" (UID: \"9fce7235-ee0c-4b6f-a51d-418382810cb2\") " pod="metallb-system/metallb-operator-controller-manager-67cf8697d8-q2jfk" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.406290 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9fce7235-ee0c-4b6f-a51d-418382810cb2-webhook-cert\") pod \"metallb-operator-controller-manager-67cf8697d8-q2jfk\" (UID: \"9fce7235-ee0c-4b6f-a51d-418382810cb2\") " pod="metallb-system/metallb-operator-controller-manager-67cf8697d8-q2jfk" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.406339 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x47xr\" (UniqueName: \"kubernetes.io/projected/9fce7235-ee0c-4b6f-a51d-418382810cb2-kube-api-access-x47xr\") pod \"metallb-operator-controller-manager-67cf8697d8-q2jfk\" (UID: \"9fce7235-ee0c-4b6f-a51d-418382810cb2\") " pod="metallb-system/metallb-operator-controller-manager-67cf8697d8-q2jfk" Mar 19 16:54:55 crc 
kubenswrapper[4918]: I0319 16:54:55.416292 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9fce7235-ee0c-4b6f-a51d-418382810cb2-webhook-cert\") pod \"metallb-operator-controller-manager-67cf8697d8-q2jfk\" (UID: \"9fce7235-ee0c-4b6f-a51d-418382810cb2\") " pod="metallb-system/metallb-operator-controller-manager-67cf8697d8-q2jfk" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.429437 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x47xr\" (UniqueName: \"kubernetes.io/projected/9fce7235-ee0c-4b6f-a51d-418382810cb2-kube-api-access-x47xr\") pod \"metallb-operator-controller-manager-67cf8697d8-q2jfk\" (UID: \"9fce7235-ee0c-4b6f-a51d-418382810cb2\") " pod="metallb-system/metallb-operator-controller-manager-67cf8697d8-q2jfk" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.432454 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9fce7235-ee0c-4b6f-a51d-418382810cb2-apiservice-cert\") pod \"metallb-operator-controller-manager-67cf8697d8-q2jfk\" (UID: \"9fce7235-ee0c-4b6f-a51d-418382810cb2\") " pod="metallb-system/metallb-operator-controller-manager-67cf8697d8-q2jfk" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.559871 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-67cf8697d8-q2jfk" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.610073 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7b4479595-2gx99"] Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.610963 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7b4479595-2gx99" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.612821 4918 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.613129 4918 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-7wwpj" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.613129 4918 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.634134 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7b4479595-2gx99"] Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.709863 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6aa9115f-c65a-4e97-a868-4972a2a730e4-apiservice-cert\") pod \"metallb-operator-webhook-server-7b4479595-2gx99\" (UID: \"6aa9115f-c65a-4e97-a868-4972a2a730e4\") " pod="metallb-system/metallb-operator-webhook-server-7b4479595-2gx99" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.709944 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7vrx\" (UniqueName: \"kubernetes.io/projected/6aa9115f-c65a-4e97-a868-4972a2a730e4-kube-api-access-g7vrx\") pod \"metallb-operator-webhook-server-7b4479595-2gx99\" (UID: \"6aa9115f-c65a-4e97-a868-4972a2a730e4\") " pod="metallb-system/metallb-operator-webhook-server-7b4479595-2gx99" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.709986 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/6aa9115f-c65a-4e97-a868-4972a2a730e4-webhook-cert\") pod \"metallb-operator-webhook-server-7b4479595-2gx99\" (UID: \"6aa9115f-c65a-4e97-a868-4972a2a730e4\") " pod="metallb-system/metallb-operator-webhook-server-7b4479595-2gx99" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.811138 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6aa9115f-c65a-4e97-a868-4972a2a730e4-apiservice-cert\") pod \"metallb-operator-webhook-server-7b4479595-2gx99\" (UID: \"6aa9115f-c65a-4e97-a868-4972a2a730e4\") " pod="metallb-system/metallb-operator-webhook-server-7b4479595-2gx99" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.811433 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7vrx\" (UniqueName: \"kubernetes.io/projected/6aa9115f-c65a-4e97-a868-4972a2a730e4-kube-api-access-g7vrx\") pod \"metallb-operator-webhook-server-7b4479595-2gx99\" (UID: \"6aa9115f-c65a-4e97-a868-4972a2a730e4\") " pod="metallb-system/metallb-operator-webhook-server-7b4479595-2gx99" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.811467 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6aa9115f-c65a-4e97-a868-4972a2a730e4-webhook-cert\") pod \"metallb-operator-webhook-server-7b4479595-2gx99\" (UID: \"6aa9115f-c65a-4e97-a868-4972a2a730e4\") " pod="metallb-system/metallb-operator-webhook-server-7b4479595-2gx99" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.814846 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6aa9115f-c65a-4e97-a868-4972a2a730e4-apiservice-cert\") pod \"metallb-operator-webhook-server-7b4479595-2gx99\" (UID: \"6aa9115f-c65a-4e97-a868-4972a2a730e4\") " pod="metallb-system/metallb-operator-webhook-server-7b4479595-2gx99" Mar 19 16:54:55 crc 
kubenswrapper[4918]: I0319 16:54:55.833586 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6aa9115f-c65a-4e97-a868-4972a2a730e4-webhook-cert\") pod \"metallb-operator-webhook-server-7b4479595-2gx99\" (UID: \"6aa9115f-c65a-4e97-a868-4972a2a730e4\") " pod="metallb-system/metallb-operator-webhook-server-7b4479595-2gx99" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.836224 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7vrx\" (UniqueName: \"kubernetes.io/projected/6aa9115f-c65a-4e97-a868-4972a2a730e4-kube-api-access-g7vrx\") pod \"metallb-operator-webhook-server-7b4479595-2gx99\" (UID: \"6aa9115f-c65a-4e97-a868-4972a2a730e4\") " pod="metallb-system/metallb-operator-webhook-server-7b4479595-2gx99" Mar 19 16:54:55 crc kubenswrapper[4918]: I0319 16:54:55.924978 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7b4479595-2gx99" Mar 19 16:54:56 crc kubenswrapper[4918]: I0319 16:54:56.088328 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-67cf8697d8-q2jfk"] Mar 19 16:54:56 crc kubenswrapper[4918]: W0319 16:54:56.095113 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fce7235_ee0c_4b6f_a51d_418382810cb2.slice/crio-913c1981839efe66b143bfd8aac7f418e1636fe9cd997211c0a25f247552bea5 WatchSource:0}: Error finding container 913c1981839efe66b143bfd8aac7f418e1636fe9cd997211c0a25f247552bea5: Status 404 returned error can't find the container with id 913c1981839efe66b143bfd8aac7f418e1636fe9cd997211c0a25f247552bea5 Mar 19 16:54:56 crc kubenswrapper[4918]: I0319 16:54:56.346215 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7b4479595-2gx99"] Mar 19 16:54:56 crc 
kubenswrapper[4918]: W0319 16:54:56.350899 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6aa9115f_c65a_4e97_a868_4972a2a730e4.slice/crio-44f9fec36baacd33751e6095d5c987e55f9ff2343d07f65677853d7be7b73e5a WatchSource:0}: Error finding container 44f9fec36baacd33751e6095d5c987e55f9ff2343d07f65677853d7be7b73e5a: Status 404 returned error can't find the container with id 44f9fec36baacd33751e6095d5c987e55f9ff2343d07f65677853d7be7b73e5a Mar 19 16:54:56 crc kubenswrapper[4918]: I0319 16:54:56.394216 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-67cf8697d8-q2jfk" event={"ID":"9fce7235-ee0c-4b6f-a51d-418382810cb2","Type":"ContainerStarted","Data":"913c1981839efe66b143bfd8aac7f418e1636fe9cd997211c0a25f247552bea5"} Mar 19 16:54:56 crc kubenswrapper[4918]: I0319 16:54:56.395479 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7b4479595-2gx99" event={"ID":"6aa9115f-c65a-4e97-a868-4972a2a730e4","Type":"ContainerStarted","Data":"44f9fec36baacd33751e6095d5c987e55f9ff2343d07f65677853d7be7b73e5a"} Mar 19 16:55:01 crc kubenswrapper[4918]: I0319 16:55:01.453311 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-67cf8697d8-q2jfk" event={"ID":"9fce7235-ee0c-4b6f-a51d-418382810cb2","Type":"ContainerStarted","Data":"d28bd24fdfc0031e431ef50cdd1849e92e0a3c0d43b701a7317f6e0a6a42363a"} Mar 19 16:55:01 crc kubenswrapper[4918]: I0319 16:55:01.456908 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-67cf8697d8-q2jfk" Mar 19 16:55:01 crc kubenswrapper[4918]: I0319 16:55:01.485958 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-67cf8697d8-q2jfk" 
podStartSLOduration=2.2787016749999998 podStartE2EDuration="6.485941761s" podCreationTimestamp="2026-03-19 16:54:55 +0000 UTC" firstStartedPulling="2026-03-19 16:54:56.096976852 +0000 UTC m=+908.219176100" lastFinishedPulling="2026-03-19 16:55:00.304216938 +0000 UTC m=+912.426416186" observedRunningTime="2026-03-19 16:55:01.485215971 +0000 UTC m=+913.607415219" watchObservedRunningTime="2026-03-19 16:55:01.485941761 +0000 UTC m=+913.608141029" Mar 19 16:55:02 crc kubenswrapper[4918]: I0319 16:55:02.461061 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7b4479595-2gx99" event={"ID":"6aa9115f-c65a-4e97-a868-4972a2a730e4","Type":"ContainerStarted","Data":"212d6febabd91a1765e020fd448a8be728a476826385e0b1b28f32de9965d96f"} Mar 19 16:55:02 crc kubenswrapper[4918]: I0319 16:55:02.497972 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7b4479595-2gx99" podStartSLOduration=1.6592421960000001 podStartE2EDuration="7.497949548s" podCreationTimestamp="2026-03-19 16:54:55 +0000 UTC" firstStartedPulling="2026-03-19 16:54:56.356051275 +0000 UTC m=+908.478250523" lastFinishedPulling="2026-03-19 16:55:02.194758627 +0000 UTC m=+914.316957875" observedRunningTime="2026-03-19 16:55:02.49544837 +0000 UTC m=+914.617647618" watchObservedRunningTime="2026-03-19 16:55:02.497949548 +0000 UTC m=+914.620148806" Mar 19 16:55:03 crc kubenswrapper[4918]: I0319 16:55:03.468092 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7b4479595-2gx99" Mar 19 16:55:15 crc kubenswrapper[4918]: I0319 16:55:15.932932 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7b4479595-2gx99" Mar 19 16:55:35 crc kubenswrapper[4918]: I0319 16:55:35.563218 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/metallb-operator-controller-manager-67cf8697d8-q2jfk" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.324592 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-zffq8"] Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.327171 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-zffq8" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.329046 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-pdrw9"] Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.329669 4918 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-kwvcq" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.330293 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.330425 4918 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.331325 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pdrw9" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.332753 4918 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.351431 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-pdrw9"] Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.390028 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/991062e5-113f-40ff-9980-02cc5d5f70e0-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-pdrw9\" (UID: \"991062e5-113f-40ff-9980-02cc5d5f70e0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pdrw9" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.390074 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b5336426-61ac-4019-ac70-31c129c9a939-frr-sockets\") pod \"frr-k8s-zffq8\" (UID: \"b5336426-61ac-4019-ac70-31c129c9a939\") " pod="metallb-system/frr-k8s-zffq8" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.390090 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b5336426-61ac-4019-ac70-31c129c9a939-frr-conf\") pod \"frr-k8s-zffq8\" (UID: \"b5336426-61ac-4019-ac70-31c129c9a939\") " pod="metallb-system/frr-k8s-zffq8" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.390144 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b5336426-61ac-4019-ac70-31c129c9a939-metrics\") pod \"frr-k8s-zffq8\" (UID: \"b5336426-61ac-4019-ac70-31c129c9a939\") " pod="metallb-system/frr-k8s-zffq8" Mar 19 16:55:36 crc 
kubenswrapper[4918]: I0319 16:55:36.390207 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvw9w\" (UniqueName: \"kubernetes.io/projected/991062e5-113f-40ff-9980-02cc5d5f70e0-kube-api-access-nvw9w\") pod \"frr-k8s-webhook-server-bcc4b6f68-pdrw9\" (UID: \"991062e5-113f-40ff-9980-02cc5d5f70e0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pdrw9" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.390257 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b5336426-61ac-4019-ac70-31c129c9a939-metrics-certs\") pod \"frr-k8s-zffq8\" (UID: \"b5336426-61ac-4019-ac70-31c129c9a939\") " pod="metallb-system/frr-k8s-zffq8" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.390282 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b5336426-61ac-4019-ac70-31c129c9a939-reloader\") pod \"frr-k8s-zffq8\" (UID: \"b5336426-61ac-4019-ac70-31c129c9a939\") " pod="metallb-system/frr-k8s-zffq8" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.390315 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b5336426-61ac-4019-ac70-31c129c9a939-frr-startup\") pod \"frr-k8s-zffq8\" (UID: \"b5336426-61ac-4019-ac70-31c129c9a939\") " pod="metallb-system/frr-k8s-zffq8" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.390370 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4m57\" (UniqueName: \"kubernetes.io/projected/b5336426-61ac-4019-ac70-31c129c9a939-kube-api-access-q4m57\") pod \"frr-k8s-zffq8\" (UID: \"b5336426-61ac-4019-ac70-31c129c9a939\") " pod="metallb-system/frr-k8s-zffq8" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 
16:55:36.418685 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-6jjlw"] Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.419500 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-6jjlw" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.420829 4918 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.421404 4918 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.421412 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.421867 4918 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-rsrg6" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.440698 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-xvdsx"] Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.441636 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-xvdsx" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.443635 4918 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.458290 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-xvdsx"] Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.491653 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8xqj\" (UniqueName: \"kubernetes.io/projected/9786a14d-5680-49c8-9e93-764b32a73202-kube-api-access-w8xqj\") pod \"controller-7bb4cc7c98-xvdsx\" (UID: \"9786a14d-5680-49c8-9e93-764b32a73202\") " pod="metallb-system/controller-7bb4cc7c98-xvdsx" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.491701 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b5336426-61ac-4019-ac70-31c129c9a939-frr-sockets\") pod \"frr-k8s-zffq8\" (UID: \"b5336426-61ac-4019-ac70-31c129c9a939\") " pod="metallb-system/frr-k8s-zffq8" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.491721 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/991062e5-113f-40ff-9980-02cc5d5f70e0-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-pdrw9\" (UID: \"991062e5-113f-40ff-9980-02cc5d5f70e0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pdrw9" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.491737 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpwck\" (UniqueName: \"kubernetes.io/projected/9d3715c6-f9c8-4863-9929-d804880ae4f7-kube-api-access-kpwck\") pod \"speaker-6jjlw\" (UID: \"9d3715c6-f9c8-4863-9929-d804880ae4f7\") " pod="metallb-system/speaker-6jjlw" Mar 19 
16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.491759 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b5336426-61ac-4019-ac70-31c129c9a939-frr-conf\") pod \"frr-k8s-zffq8\" (UID: \"b5336426-61ac-4019-ac70-31c129c9a939\") " pod="metallb-system/frr-k8s-zffq8" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.491776 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b5336426-61ac-4019-ac70-31c129c9a939-metrics\") pod \"frr-k8s-zffq8\" (UID: \"b5336426-61ac-4019-ac70-31c129c9a939\") " pod="metallb-system/frr-k8s-zffq8" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.491792 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvw9w\" (UniqueName: \"kubernetes.io/projected/991062e5-113f-40ff-9980-02cc5d5f70e0-kube-api-access-nvw9w\") pod \"frr-k8s-webhook-server-bcc4b6f68-pdrw9\" (UID: \"991062e5-113f-40ff-9980-02cc5d5f70e0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pdrw9" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.491808 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9786a14d-5680-49c8-9e93-764b32a73202-cert\") pod \"controller-7bb4cc7c98-xvdsx\" (UID: \"9786a14d-5680-49c8-9e93-764b32a73202\") " pod="metallb-system/controller-7bb4cc7c98-xvdsx" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.491824 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d3715c6-f9c8-4863-9929-d804880ae4f7-metrics-certs\") pod \"speaker-6jjlw\" (UID: \"9d3715c6-f9c8-4863-9929-d804880ae4f7\") " pod="metallb-system/speaker-6jjlw" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.491851 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b5336426-61ac-4019-ac70-31c129c9a939-metrics-certs\") pod \"frr-k8s-zffq8\" (UID: \"b5336426-61ac-4019-ac70-31c129c9a939\") " pod="metallb-system/frr-k8s-zffq8" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.491868 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9d3715c6-f9c8-4863-9929-d804880ae4f7-memberlist\") pod \"speaker-6jjlw\" (UID: \"9d3715c6-f9c8-4863-9929-d804880ae4f7\") " pod="metallb-system/speaker-6jjlw" Mar 19 16:55:36 crc kubenswrapper[4918]: E0319 16:55:36.491870 4918 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 19 16:55:36 crc kubenswrapper[4918]: E0319 16:55:36.491945 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/991062e5-113f-40ff-9980-02cc5d5f70e0-cert podName:991062e5-113f-40ff-9980-02cc5d5f70e0 nodeName:}" failed. No retries permitted until 2026-03-19 16:55:36.991927325 +0000 UTC m=+949.114126573 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/991062e5-113f-40ff-9980-02cc5d5f70e0-cert") pod "frr-k8s-webhook-server-bcc4b6f68-pdrw9" (UID: "991062e5-113f-40ff-9980-02cc5d5f70e0") : secret "frr-k8s-webhook-server-cert" not found Mar 19 16:55:36 crc kubenswrapper[4918]: E0319 16:55:36.491991 4918 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 19 16:55:36 crc kubenswrapper[4918]: E0319 16:55:36.492064 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5336426-61ac-4019-ac70-31c129c9a939-metrics-certs podName:b5336426-61ac-4019-ac70-31c129c9a939 nodeName:}" failed. 
No retries permitted until 2026-03-19 16:55:36.992036968 +0000 UTC m=+949.114236216 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b5336426-61ac-4019-ac70-31c129c9a939-metrics-certs") pod "frr-k8s-zffq8" (UID: "b5336426-61ac-4019-ac70-31c129c9a939") : secret "frr-k8s-certs-secret" not found Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.492088 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b5336426-61ac-4019-ac70-31c129c9a939-reloader\") pod \"frr-k8s-zffq8\" (UID: \"b5336426-61ac-4019-ac70-31c129c9a939\") " pod="metallb-system/frr-k8s-zffq8" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.492146 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b5336426-61ac-4019-ac70-31c129c9a939-metrics\") pod \"frr-k8s-zffq8\" (UID: \"b5336426-61ac-4019-ac70-31c129c9a939\") " pod="metallb-system/frr-k8s-zffq8" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.492174 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b5336426-61ac-4019-ac70-31c129c9a939-frr-startup\") pod \"frr-k8s-zffq8\" (UID: \"b5336426-61ac-4019-ac70-31c129c9a939\") " pod="metallb-system/frr-k8s-zffq8" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.492215 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9786a14d-5680-49c8-9e93-764b32a73202-metrics-certs\") pod \"controller-7bb4cc7c98-xvdsx\" (UID: \"9786a14d-5680-49c8-9e93-764b32a73202\") " pod="metallb-system/controller-7bb4cc7c98-xvdsx" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.492246 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: 
\"kubernetes.io/empty-dir/b5336426-61ac-4019-ac70-31c129c9a939-frr-sockets\") pod \"frr-k8s-zffq8\" (UID: \"b5336426-61ac-4019-ac70-31c129c9a939\") " pod="metallb-system/frr-k8s-zffq8" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.492277 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4m57\" (UniqueName: \"kubernetes.io/projected/b5336426-61ac-4019-ac70-31c129c9a939-kube-api-access-q4m57\") pod \"frr-k8s-zffq8\" (UID: \"b5336426-61ac-4019-ac70-31c129c9a939\") " pod="metallb-system/frr-k8s-zffq8" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.492300 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/9d3715c6-f9c8-4863-9929-d804880ae4f7-metallb-excludel2\") pod \"speaker-6jjlw\" (UID: \"9d3715c6-f9c8-4863-9929-d804880ae4f7\") " pod="metallb-system/speaker-6jjlw" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.492327 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b5336426-61ac-4019-ac70-31c129c9a939-frr-conf\") pod \"frr-k8s-zffq8\" (UID: \"b5336426-61ac-4019-ac70-31c129c9a939\") " pod="metallb-system/frr-k8s-zffq8" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.492516 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b5336426-61ac-4019-ac70-31c129c9a939-reloader\") pod \"frr-k8s-zffq8\" (UID: \"b5336426-61ac-4019-ac70-31c129c9a939\") " pod="metallb-system/frr-k8s-zffq8" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.493124 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b5336426-61ac-4019-ac70-31c129c9a939-frr-startup\") pod \"frr-k8s-zffq8\" (UID: \"b5336426-61ac-4019-ac70-31c129c9a939\") " pod="metallb-system/frr-k8s-zffq8" Mar 19 
16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.511203 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4m57\" (UniqueName: \"kubernetes.io/projected/b5336426-61ac-4019-ac70-31c129c9a939-kube-api-access-q4m57\") pod \"frr-k8s-zffq8\" (UID: \"b5336426-61ac-4019-ac70-31c129c9a939\") " pod="metallb-system/frr-k8s-zffq8" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.511461 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvw9w\" (UniqueName: \"kubernetes.io/projected/991062e5-113f-40ff-9980-02cc5d5f70e0-kube-api-access-nvw9w\") pod \"frr-k8s-webhook-server-bcc4b6f68-pdrw9\" (UID: \"991062e5-113f-40ff-9980-02cc5d5f70e0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pdrw9" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.592696 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9786a14d-5680-49c8-9e93-764b32a73202-metrics-certs\") pod \"controller-7bb4cc7c98-xvdsx\" (UID: \"9786a14d-5680-49c8-9e93-764b32a73202\") " pod="metallb-system/controller-7bb4cc7c98-xvdsx" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.592757 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/9d3715c6-f9c8-4863-9929-d804880ae4f7-metallb-excludel2\") pod \"speaker-6jjlw\" (UID: \"9d3715c6-f9c8-4863-9929-d804880ae4f7\") " pod="metallb-system/speaker-6jjlw" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.592834 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8xqj\" (UniqueName: \"kubernetes.io/projected/9786a14d-5680-49c8-9e93-764b32a73202-kube-api-access-w8xqj\") pod \"controller-7bb4cc7c98-xvdsx\" (UID: \"9786a14d-5680-49c8-9e93-764b32a73202\") " pod="metallb-system/controller-7bb4cc7c98-xvdsx" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 
16:55:36.592869 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpwck\" (UniqueName: \"kubernetes.io/projected/9d3715c6-f9c8-4863-9929-d804880ae4f7-kube-api-access-kpwck\") pod \"speaker-6jjlw\" (UID: \"9d3715c6-f9c8-4863-9929-d804880ae4f7\") " pod="metallb-system/speaker-6jjlw" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.592897 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9786a14d-5680-49c8-9e93-764b32a73202-cert\") pod \"controller-7bb4cc7c98-xvdsx\" (UID: \"9786a14d-5680-49c8-9e93-764b32a73202\") " pod="metallb-system/controller-7bb4cc7c98-xvdsx" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.592919 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d3715c6-f9c8-4863-9929-d804880ae4f7-metrics-certs\") pod \"speaker-6jjlw\" (UID: \"9d3715c6-f9c8-4863-9929-d804880ae4f7\") " pod="metallb-system/speaker-6jjlw" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.592974 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9d3715c6-f9c8-4863-9929-d804880ae4f7-memberlist\") pod \"speaker-6jjlw\" (UID: \"9d3715c6-f9c8-4863-9929-d804880ae4f7\") " pod="metallb-system/speaker-6jjlw" Mar 19 16:55:36 crc kubenswrapper[4918]: E0319 16:55:36.593453 4918 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 19 16:55:36 crc kubenswrapper[4918]: E0319 16:55:36.593612 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d3715c6-f9c8-4863-9929-d804880ae4f7-memberlist podName:9d3715c6-f9c8-4863-9929-d804880ae4f7 nodeName:}" failed. No retries permitted until 2026-03-19 16:55:37.093507235 +0000 UTC m=+949.215706483 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/9d3715c6-f9c8-4863-9929-d804880ae4f7-memberlist") pod "speaker-6jjlw" (UID: "9d3715c6-f9c8-4863-9929-d804880ae4f7") : secret "metallb-memberlist" not found Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.593649 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/9d3715c6-f9c8-4863-9929-d804880ae4f7-metallb-excludel2\") pod \"speaker-6jjlw\" (UID: \"9d3715c6-f9c8-4863-9929-d804880ae4f7\") " pod="metallb-system/speaker-6jjlw" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.597182 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d3715c6-f9c8-4863-9929-d804880ae4f7-metrics-certs\") pod \"speaker-6jjlw\" (UID: \"9d3715c6-f9c8-4863-9929-d804880ae4f7\") " pod="metallb-system/speaker-6jjlw" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.597338 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9786a14d-5680-49c8-9e93-764b32a73202-metrics-certs\") pod \"controller-7bb4cc7c98-xvdsx\" (UID: \"9786a14d-5680-49c8-9e93-764b32a73202\") " pod="metallb-system/controller-7bb4cc7c98-xvdsx" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.597493 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9786a14d-5680-49c8-9e93-764b32a73202-cert\") pod \"controller-7bb4cc7c98-xvdsx\" (UID: \"9786a14d-5680-49c8-9e93-764b32a73202\") " pod="metallb-system/controller-7bb4cc7c98-xvdsx" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.609098 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8xqj\" (UniqueName: \"kubernetes.io/projected/9786a14d-5680-49c8-9e93-764b32a73202-kube-api-access-w8xqj\") pod \"controller-7bb4cc7c98-xvdsx\" 
(UID: \"9786a14d-5680-49c8-9e93-764b32a73202\") " pod="metallb-system/controller-7bb4cc7c98-xvdsx" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.611746 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpwck\" (UniqueName: \"kubernetes.io/projected/9d3715c6-f9c8-4863-9929-d804880ae4f7-kube-api-access-kpwck\") pod \"speaker-6jjlw\" (UID: \"9d3715c6-f9c8-4863-9929-d804880ae4f7\") " pod="metallb-system/speaker-6jjlw" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.757213 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-xvdsx" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.997495 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/991062e5-113f-40ff-9980-02cc5d5f70e0-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-pdrw9\" (UID: \"991062e5-113f-40ff-9980-02cc5d5f70e0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pdrw9" Mar 19 16:55:36 crc kubenswrapper[4918]: I0319 16:55:36.997904 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b5336426-61ac-4019-ac70-31c129c9a939-metrics-certs\") pod \"frr-k8s-zffq8\" (UID: \"b5336426-61ac-4019-ac70-31c129c9a939\") " pod="metallb-system/frr-k8s-zffq8" Mar 19 16:55:37 crc kubenswrapper[4918]: I0319 16:55:37.001555 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/991062e5-113f-40ff-9980-02cc5d5f70e0-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-pdrw9\" (UID: \"991062e5-113f-40ff-9980-02cc5d5f70e0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pdrw9" Mar 19 16:55:37 crc kubenswrapper[4918]: I0319 16:55:37.002323 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/b5336426-61ac-4019-ac70-31c129c9a939-metrics-certs\") pod \"frr-k8s-zffq8\" (UID: \"b5336426-61ac-4019-ac70-31c129c9a939\") " pod="metallb-system/frr-k8s-zffq8" Mar 19 16:55:37 crc kubenswrapper[4918]: I0319 16:55:37.099676 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9d3715c6-f9c8-4863-9929-d804880ae4f7-memberlist\") pod \"speaker-6jjlw\" (UID: \"9d3715c6-f9c8-4863-9929-d804880ae4f7\") " pod="metallb-system/speaker-6jjlw" Mar 19 16:55:37 crc kubenswrapper[4918]: E0319 16:55:37.099916 4918 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 19 16:55:37 crc kubenswrapper[4918]: E0319 16:55:37.100025 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d3715c6-f9c8-4863-9929-d804880ae4f7-memberlist podName:9d3715c6-f9c8-4863-9929-d804880ae4f7 nodeName:}" failed. No retries permitted until 2026-03-19 16:55:38.100000502 +0000 UTC m=+950.222199760 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/9d3715c6-f9c8-4863-9929-d804880ae4f7-memberlist") pod "speaker-6jjlw" (UID: "9d3715c6-f9c8-4863-9929-d804880ae4f7") : secret "metallb-memberlist" not found Mar 19 16:55:37 crc kubenswrapper[4918]: I0319 16:55:37.169800 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-xvdsx"] Mar 19 16:55:37 crc kubenswrapper[4918]: I0319 16:55:37.254485 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-zffq8" Mar 19 16:55:37 crc kubenswrapper[4918]: I0319 16:55:37.300658 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pdrw9" Mar 19 16:55:37 crc kubenswrapper[4918]: I0319 16:55:37.721115 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zffq8" event={"ID":"b5336426-61ac-4019-ac70-31c129c9a939","Type":"ContainerStarted","Data":"9d3694d371d0ce93785003702536eec8de5f1004a0c1b0a72e118d84778c30cc"} Mar 19 16:55:37 crc kubenswrapper[4918]: I0319 16:55:37.723143 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-xvdsx" event={"ID":"9786a14d-5680-49c8-9e93-764b32a73202","Type":"ContainerStarted","Data":"df45c4ad7faf6bd8b67fbc08d2a2b0f1243e3e9dc5c4cf31732ed0319204716f"} Mar 19 16:55:37 crc kubenswrapper[4918]: I0319 16:55:37.723188 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-xvdsx" event={"ID":"9786a14d-5680-49c8-9e93-764b32a73202","Type":"ContainerStarted","Data":"47d1e6feb660ae60876b9aa0603f8a0258bedc8cb781aaeedd00597670423952"} Mar 19 16:55:37 crc kubenswrapper[4918]: I0319 16:55:37.723200 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-xvdsx" event={"ID":"9786a14d-5680-49c8-9e93-764b32a73202","Type":"ContainerStarted","Data":"48916999ac58c609806df5a8ee25698d4a8153e01bff9801ed4a84169da9eebf"} Mar 19 16:55:37 crc kubenswrapper[4918]: I0319 16:55:37.723467 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-xvdsx" Mar 19 16:55:37 crc kubenswrapper[4918]: I0319 16:55:37.745722 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-xvdsx" podStartSLOduration=1.74570118 podStartE2EDuration="1.74570118s" podCreationTimestamp="2026-03-19 16:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:55:37.737054733 +0000 UTC 
m=+949.859253991" watchObservedRunningTime="2026-03-19 16:55:37.74570118 +0000 UTC m=+949.867900428" Mar 19 16:55:37 crc kubenswrapper[4918]: I0319 16:55:37.832899 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-pdrw9"] Mar 19 16:55:38 crc kubenswrapper[4918]: I0319 16:55:38.112347 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9d3715c6-f9c8-4863-9929-d804880ae4f7-memberlist\") pod \"speaker-6jjlw\" (UID: \"9d3715c6-f9c8-4863-9929-d804880ae4f7\") " pod="metallb-system/speaker-6jjlw" Mar 19 16:55:38 crc kubenswrapper[4918]: I0319 16:55:38.119466 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9d3715c6-f9c8-4863-9929-d804880ae4f7-memberlist\") pod \"speaker-6jjlw\" (UID: \"9d3715c6-f9c8-4863-9929-d804880ae4f7\") " pod="metallb-system/speaker-6jjlw" Mar 19 16:55:38 crc kubenswrapper[4918]: I0319 16:55:38.232848 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-6jjlw" Mar 19 16:55:38 crc kubenswrapper[4918]: W0319 16:55:38.260950 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d3715c6_f9c8_4863_9929_d804880ae4f7.slice/crio-7dce416b349eb909ac5316d877c1fac5bc7baa10f6b4ab9ca5a9233bfd824410 WatchSource:0}: Error finding container 7dce416b349eb909ac5316d877c1fac5bc7baa10f6b4ab9ca5a9233bfd824410: Status 404 returned error can't find the container with id 7dce416b349eb909ac5316d877c1fac5bc7baa10f6b4ab9ca5a9233bfd824410 Mar 19 16:55:38 crc kubenswrapper[4918]: I0319 16:55:38.734198 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pdrw9" event={"ID":"991062e5-113f-40ff-9980-02cc5d5f70e0","Type":"ContainerStarted","Data":"da1e5a506c2a32759cf9172f23ad5d5dbaa4fcec6f93953f8f731e036daac26f"} Mar 19 16:55:38 crc kubenswrapper[4918]: I0319 16:55:38.735692 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-6jjlw" event={"ID":"9d3715c6-f9c8-4863-9929-d804880ae4f7","Type":"ContainerStarted","Data":"a366bfff7266cd4179e4cc4d62cda271384b2e30865ba68aeaaa90c06a6dcf31"} Mar 19 16:55:38 crc kubenswrapper[4918]: I0319 16:55:38.735722 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-6jjlw" event={"ID":"9d3715c6-f9c8-4863-9929-d804880ae4f7","Type":"ContainerStarted","Data":"7dce416b349eb909ac5316d877c1fac5bc7baa10f6b4ab9ca5a9233bfd824410"} Mar 19 16:55:39 crc kubenswrapper[4918]: I0319 16:55:39.762771 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-6jjlw" event={"ID":"9d3715c6-f9c8-4863-9929-d804880ae4f7","Type":"ContainerStarted","Data":"1a12f0fc98971a39be965b41fd12ee7623b1278d28d10d701d84069e0e1b1158"} Mar 19 16:55:39 crc kubenswrapper[4918]: I0319 16:55:39.763829 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/speaker-6jjlw" Mar 19 16:55:39 crc kubenswrapper[4918]: I0319 16:55:39.796667 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-6jjlw" podStartSLOduration=3.7966524010000002 podStartE2EDuration="3.796652401s" podCreationTimestamp="2026-03-19 16:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:55:39.778114234 +0000 UTC m=+951.900313482" watchObservedRunningTime="2026-03-19 16:55:39.796652401 +0000 UTC m=+951.918851649" Mar 19 16:55:45 crc kubenswrapper[4918]: I0319 16:55:45.805705 4918 generic.go:334] "Generic (PLEG): container finished" podID="b5336426-61ac-4019-ac70-31c129c9a939" containerID="fb206a714cffc6b30084948bbd5b3452d3bfff521e7d3ee834bb6583354882ad" exitCode=0 Mar 19 16:55:45 crc kubenswrapper[4918]: I0319 16:55:45.805776 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zffq8" event={"ID":"b5336426-61ac-4019-ac70-31c129c9a939","Type":"ContainerDied","Data":"fb206a714cffc6b30084948bbd5b3452d3bfff521e7d3ee834bb6583354882ad"} Mar 19 16:55:45 crc kubenswrapper[4918]: I0319 16:55:45.807756 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pdrw9" event={"ID":"991062e5-113f-40ff-9980-02cc5d5f70e0","Type":"ContainerStarted","Data":"73e73967888b2ba59742a7e14f261385b7281baf6b7f16991543d782f1fa7eb2"} Mar 19 16:55:45 crc kubenswrapper[4918]: I0319 16:55:45.808486 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pdrw9" Mar 19 16:55:45 crc kubenswrapper[4918]: I0319 16:55:45.856508 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pdrw9" podStartSLOduration=3.085430099 podStartE2EDuration="9.856493194s" podCreationTimestamp="2026-03-19 16:55:36 +0000 UTC" 
firstStartedPulling="2026-03-19 16:55:37.843503498 +0000 UTC m=+949.965702746" lastFinishedPulling="2026-03-19 16:55:44.614566593 +0000 UTC m=+956.736765841" observedRunningTime="2026-03-19 16:55:45.855985301 +0000 UTC m=+957.978184569" watchObservedRunningTime="2026-03-19 16:55:45.856493194 +0000 UTC m=+957.978692442" Mar 19 16:55:46 crc kubenswrapper[4918]: I0319 16:55:46.815000 4918 generic.go:334] "Generic (PLEG): container finished" podID="b5336426-61ac-4019-ac70-31c129c9a939" containerID="4565cb48c32173259b5ef764ab5deb7ae878d8eba9583100c4f6a4872af40532" exitCode=0 Mar 19 16:55:46 crc kubenswrapper[4918]: I0319 16:55:46.816024 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zffq8" event={"ID":"b5336426-61ac-4019-ac70-31c129c9a939","Type":"ContainerDied","Data":"4565cb48c32173259b5ef764ab5deb7ae878d8eba9583100c4f6a4872af40532"} Mar 19 16:55:47 crc kubenswrapper[4918]: I0319 16:55:47.826298 4918 generic.go:334] "Generic (PLEG): container finished" podID="b5336426-61ac-4019-ac70-31c129c9a939" containerID="456acd369e1e59d213434bb9485048afad58dd17c9e9c56a820aca2ddda7a446" exitCode=0 Mar 19 16:55:47 crc kubenswrapper[4918]: I0319 16:55:47.826409 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zffq8" event={"ID":"b5336426-61ac-4019-ac70-31c129c9a939","Type":"ContainerDied","Data":"456acd369e1e59d213434bb9485048afad58dd17c9e9c56a820aca2ddda7a446"} Mar 19 16:55:48 crc kubenswrapper[4918]: I0319 16:55:48.254026 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-6jjlw" Mar 19 16:55:48 crc kubenswrapper[4918]: I0319 16:55:48.839984 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zffq8" event={"ID":"b5336426-61ac-4019-ac70-31c129c9a939","Type":"ContainerStarted","Data":"85931e55c02326783df12791adde44eb2349645a0143a472b7b5ce55b11bf874"} Mar 19 16:55:48 crc kubenswrapper[4918]: I0319 16:55:48.840023 4918 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zffq8" event={"ID":"b5336426-61ac-4019-ac70-31c129c9a939","Type":"ContainerStarted","Data":"68dd101309d442d33d7b559c2d1979ba8572e668f90f18e8eb82805e05881ef5"} Mar 19 16:55:48 crc kubenswrapper[4918]: I0319 16:55:48.840032 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zffq8" event={"ID":"b5336426-61ac-4019-ac70-31c129c9a939","Type":"ContainerStarted","Data":"fdc3917ad8d28b7fffeb238e679aa6f069dd6c380d3dd98d13059b0228791ca6"} Mar 19 16:55:48 crc kubenswrapper[4918]: I0319 16:55:48.840041 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zffq8" event={"ID":"b5336426-61ac-4019-ac70-31c129c9a939","Type":"ContainerStarted","Data":"6a52789e4c7842a99a4c5889a665bd8872d34a5480fb0e237677193ecf99f774"} Mar 19 16:55:48 crc kubenswrapper[4918]: I0319 16:55:48.840051 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zffq8" event={"ID":"b5336426-61ac-4019-ac70-31c129c9a939","Type":"ContainerStarted","Data":"e644eb4403e0defb051140e41fb8dcdf13e1d1740416cf1b97d2285013d42a4c"} Mar 19 16:55:49 crc kubenswrapper[4918]: I0319 16:55:49.851913 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zffq8" event={"ID":"b5336426-61ac-4019-ac70-31c129c9a939","Type":"ContainerStarted","Data":"90b57420d982e65fd84be49969f6561fd88da3fbda542e6dec81e820baec0020"} Mar 19 16:55:49 crc kubenswrapper[4918]: I0319 16:55:49.852145 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-zffq8" Mar 19 16:55:49 crc kubenswrapper[4918]: I0319 16:55:49.894124 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-zffq8" podStartSLOduration=6.654833382 podStartE2EDuration="13.894105156s" podCreationTimestamp="2026-03-19 16:55:36 +0000 UTC" firstStartedPulling="2026-03-19 16:55:37.378310942 +0000 UTC m=+949.500510190" 
lastFinishedPulling="2026-03-19 16:55:44.617582676 +0000 UTC m=+956.739781964" observedRunningTime="2026-03-19 16:55:49.877393209 +0000 UTC m=+961.999592457" watchObservedRunningTime="2026-03-19 16:55:49.894105156 +0000 UTC m=+962.016304404" Mar 19 16:55:50 crc kubenswrapper[4918]: I0319 16:55:50.816969 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-97srs"] Mar 19 16:55:50 crc kubenswrapper[4918]: I0319 16:55:50.817771 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-97srs" Mar 19 16:55:50 crc kubenswrapper[4918]: I0319 16:55:50.821196 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-5jt79" Mar 19 16:55:50 crc kubenswrapper[4918]: I0319 16:55:50.821257 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 19 16:55:50 crc kubenswrapper[4918]: I0319 16:55:50.821265 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 19 16:55:50 crc kubenswrapper[4918]: I0319 16:55:50.836047 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-97srs"] Mar 19 16:55:50 crc kubenswrapper[4918]: I0319 16:55:50.920337 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmsgk\" (UniqueName: \"kubernetes.io/projected/b4d51188-42f2-494c-bfab-e6e7a0475b51-kube-api-access-tmsgk\") pod \"openstack-operator-index-97srs\" (UID: \"b4d51188-42f2-494c-bfab-e6e7a0475b51\") " pod="openstack-operators/openstack-operator-index-97srs" Mar 19 16:55:51 crc kubenswrapper[4918]: I0319 16:55:51.021926 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmsgk\" (UniqueName: 
\"kubernetes.io/projected/b4d51188-42f2-494c-bfab-e6e7a0475b51-kube-api-access-tmsgk\") pod \"openstack-operator-index-97srs\" (UID: \"b4d51188-42f2-494c-bfab-e6e7a0475b51\") " pod="openstack-operators/openstack-operator-index-97srs" Mar 19 16:55:51 crc kubenswrapper[4918]: I0319 16:55:51.050387 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmsgk\" (UniqueName: \"kubernetes.io/projected/b4d51188-42f2-494c-bfab-e6e7a0475b51-kube-api-access-tmsgk\") pod \"openstack-operator-index-97srs\" (UID: \"b4d51188-42f2-494c-bfab-e6e7a0475b51\") " pod="openstack-operators/openstack-operator-index-97srs" Mar 19 16:55:51 crc kubenswrapper[4918]: I0319 16:55:51.134153 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-97srs" Mar 19 16:55:51 crc kubenswrapper[4918]: I0319 16:55:51.367759 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-97srs"] Mar 19 16:55:51 crc kubenswrapper[4918]: I0319 16:55:51.867329 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-97srs" event={"ID":"b4d51188-42f2-494c-bfab-e6e7a0475b51","Type":"ContainerStarted","Data":"5e8cb92a1d8472e54d4dbc1d2478649913cd587da4497c6f95da065de681b4e6"} Mar 19 16:55:52 crc kubenswrapper[4918]: I0319 16:55:52.255356 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-zffq8" Mar 19 16:55:52 crc kubenswrapper[4918]: I0319 16:55:52.294332 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-zffq8" Mar 19 16:55:54 crc kubenswrapper[4918]: I0319 16:55:54.211771 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-97srs"] Mar 19 16:55:54 crc kubenswrapper[4918]: I0319 16:55:54.803825 4918 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-index-qg8xl"] Mar 19 16:55:54 crc kubenswrapper[4918]: I0319 16:55:54.805119 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qg8xl" Mar 19 16:55:54 crc kubenswrapper[4918]: I0319 16:55:54.811770 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qg8xl"] Mar 19 16:55:54 crc kubenswrapper[4918]: I0319 16:55:54.893336 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-97srs" event={"ID":"b4d51188-42f2-494c-bfab-e6e7a0475b51","Type":"ContainerStarted","Data":"0dd40ef54209db15302b074f3cc2825467077b5363dc9578c4d4d5af0929829c"} Mar 19 16:55:54 crc kubenswrapper[4918]: I0319 16:55:54.893467 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-97srs" podUID="b4d51188-42f2-494c-bfab-e6e7a0475b51" containerName="registry-server" containerID="cri-o://0dd40ef54209db15302b074f3cc2825467077b5363dc9578c4d4d5af0929829c" gracePeriod=2 Mar 19 16:55:54 crc kubenswrapper[4918]: I0319 16:55:54.914009 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-97srs" podStartSLOduration=1.709574821 podStartE2EDuration="4.91398347s" podCreationTimestamp="2026-03-19 16:55:50 +0000 UTC" firstStartedPulling="2026-03-19 16:55:51.380885782 +0000 UTC m=+963.503085030" lastFinishedPulling="2026-03-19 16:55:54.585294431 +0000 UTC m=+966.707493679" observedRunningTime="2026-03-19 16:55:54.911325167 +0000 UTC m=+967.033524435" watchObservedRunningTime="2026-03-19 16:55:54.91398347 +0000 UTC m=+967.036182748" Mar 19 16:55:54 crc kubenswrapper[4918]: I0319 16:55:54.978872 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbrn7\" (UniqueName: 
\"kubernetes.io/projected/673a75db-e104-419e-921d-99ba515af652-kube-api-access-kbrn7\") pod \"openstack-operator-index-qg8xl\" (UID: \"673a75db-e104-419e-921d-99ba515af652\") " pod="openstack-operators/openstack-operator-index-qg8xl" Mar 19 16:55:55 crc kubenswrapper[4918]: I0319 16:55:55.079746 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbrn7\" (UniqueName: \"kubernetes.io/projected/673a75db-e104-419e-921d-99ba515af652-kube-api-access-kbrn7\") pod \"openstack-operator-index-qg8xl\" (UID: \"673a75db-e104-419e-921d-99ba515af652\") " pod="openstack-operators/openstack-operator-index-qg8xl" Mar 19 16:55:55 crc kubenswrapper[4918]: I0319 16:55:55.114747 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbrn7\" (UniqueName: \"kubernetes.io/projected/673a75db-e104-419e-921d-99ba515af652-kube-api-access-kbrn7\") pod \"openstack-operator-index-qg8xl\" (UID: \"673a75db-e104-419e-921d-99ba515af652\") " pod="openstack-operators/openstack-operator-index-qg8xl" Mar 19 16:55:55 crc kubenswrapper[4918]: I0319 16:55:55.126331 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-qg8xl" Mar 19 16:55:55 crc kubenswrapper[4918]: I0319 16:55:55.377675 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qg8xl"] Mar 19 16:55:55 crc kubenswrapper[4918]: I0319 16:55:55.921827 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qg8xl" event={"ID":"673a75db-e104-419e-921d-99ba515af652","Type":"ContainerStarted","Data":"3a55437cb093fa4293317653f8a3a0d2b72a5524fe13a3b7fcfc57062f636acf"} Mar 19 16:55:55 crc kubenswrapper[4918]: I0319 16:55:55.927317 4918 generic.go:334] "Generic (PLEG): container finished" podID="b4d51188-42f2-494c-bfab-e6e7a0475b51" containerID="0dd40ef54209db15302b074f3cc2825467077b5363dc9578c4d4d5af0929829c" exitCode=0 Mar 19 16:55:55 crc kubenswrapper[4918]: I0319 16:55:55.927357 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-97srs" event={"ID":"b4d51188-42f2-494c-bfab-e6e7a0475b51","Type":"ContainerDied","Data":"0dd40ef54209db15302b074f3cc2825467077b5363dc9578c4d4d5af0929829c"} Mar 19 16:55:55 crc kubenswrapper[4918]: I0319 16:55:55.927381 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-97srs" event={"ID":"b4d51188-42f2-494c-bfab-e6e7a0475b51","Type":"ContainerDied","Data":"5e8cb92a1d8472e54d4dbc1d2478649913cd587da4497c6f95da065de681b4e6"} Mar 19 16:55:55 crc kubenswrapper[4918]: I0319 16:55:55.927391 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e8cb92a1d8472e54d4dbc1d2478649913cd587da4497c6f95da065de681b4e6" Mar 19 16:55:55 crc kubenswrapper[4918]: I0319 16:55:55.938171 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-97srs" Mar 19 16:55:56 crc kubenswrapper[4918]: I0319 16:55:56.105506 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmsgk\" (UniqueName: \"kubernetes.io/projected/b4d51188-42f2-494c-bfab-e6e7a0475b51-kube-api-access-tmsgk\") pod \"b4d51188-42f2-494c-bfab-e6e7a0475b51\" (UID: \"b4d51188-42f2-494c-bfab-e6e7a0475b51\") " Mar 19 16:55:56 crc kubenswrapper[4918]: I0319 16:55:56.110820 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4d51188-42f2-494c-bfab-e6e7a0475b51-kube-api-access-tmsgk" (OuterVolumeSpecName: "kube-api-access-tmsgk") pod "b4d51188-42f2-494c-bfab-e6e7a0475b51" (UID: "b4d51188-42f2-494c-bfab-e6e7a0475b51"). InnerVolumeSpecName "kube-api-access-tmsgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:55:56 crc kubenswrapper[4918]: I0319 16:55:56.207756 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmsgk\" (UniqueName: \"kubernetes.io/projected/b4d51188-42f2-494c-bfab-e6e7a0475b51-kube-api-access-tmsgk\") on node \"crc\" DevicePath \"\"" Mar 19 16:55:56 crc kubenswrapper[4918]: I0319 16:55:56.761855 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-xvdsx" Mar 19 16:55:56 crc kubenswrapper[4918]: I0319 16:55:56.935274 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-97srs" Mar 19 16:55:56 crc kubenswrapper[4918]: I0319 16:55:56.935287 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qg8xl" event={"ID":"673a75db-e104-419e-921d-99ba515af652","Type":"ContainerStarted","Data":"45c4298d3ddea26ef5dd8bce8b205e30c1a9cb640a22d3fc559e53dc1864284e"} Mar 19 16:55:56 crc kubenswrapper[4918]: I0319 16:55:56.956009 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-qg8xl" podStartSLOduration=2.690246601 podStartE2EDuration="2.955987696s" podCreationTimestamp="2026-03-19 16:55:54 +0000 UTC" firstStartedPulling="2026-03-19 16:55:55.391985377 +0000 UTC m=+967.514184645" lastFinishedPulling="2026-03-19 16:55:55.657726462 +0000 UTC m=+967.779925740" observedRunningTime="2026-03-19 16:55:56.952623634 +0000 UTC m=+969.074822892" watchObservedRunningTime="2026-03-19 16:55:56.955987696 +0000 UTC m=+969.078186944" Mar 19 16:55:56 crc kubenswrapper[4918]: I0319 16:55:56.988411 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-97srs"] Mar 19 16:55:56 crc kubenswrapper[4918]: I0319 16:55:56.998233 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-97srs"] Mar 19 16:55:57 crc kubenswrapper[4918]: I0319 16:55:57.265698 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-zffq8" Mar 19 16:55:57 crc kubenswrapper[4918]: I0319 16:55:57.310265 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pdrw9" Mar 19 16:55:58 crc kubenswrapper[4918]: I0319 16:55:58.211899 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 16:55:58 crc kubenswrapper[4918]: I0319 16:55:58.212360 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 16:55:58 crc kubenswrapper[4918]: I0319 16:55:58.601634 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4d51188-42f2-494c-bfab-e6e7a0475b51" path="/var/lib/kubelet/pods/b4d51188-42f2-494c-bfab-e6e7a0475b51/volumes" Mar 19 16:56:00 crc kubenswrapper[4918]: I0319 16:56:00.132676 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565656-p46j9"] Mar 19 16:56:00 crc kubenswrapper[4918]: E0319 16:56:00.132989 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4d51188-42f2-494c-bfab-e6e7a0475b51" containerName="registry-server" Mar 19 16:56:00 crc kubenswrapper[4918]: I0319 16:56:00.133005 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4d51188-42f2-494c-bfab-e6e7a0475b51" containerName="registry-server" Mar 19 16:56:00 crc kubenswrapper[4918]: I0319 16:56:00.133140 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4d51188-42f2-494c-bfab-e6e7a0475b51" containerName="registry-server" Mar 19 16:56:00 crc kubenswrapper[4918]: I0319 16:56:00.133665 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565656-p46j9" Mar 19 16:56:00 crc kubenswrapper[4918]: I0319 16:56:00.136363 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 16:56:00 crc kubenswrapper[4918]: I0319 16:56:00.136538 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 16:56:00 crc kubenswrapper[4918]: I0319 16:56:00.143675 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565656-p46j9"] Mar 19 16:56:00 crc kubenswrapper[4918]: I0319 16:56:00.146678 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 16:56:00 crc kubenswrapper[4918]: I0319 16:56:00.158769 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfbcr\" (UniqueName: \"kubernetes.io/projected/cb680de7-217b-493c-89c3-81f0b3e801fb-kube-api-access-vfbcr\") pod \"auto-csr-approver-29565656-p46j9\" (UID: \"cb680de7-217b-493c-89c3-81f0b3e801fb\") " pod="openshift-infra/auto-csr-approver-29565656-p46j9" Mar 19 16:56:00 crc kubenswrapper[4918]: I0319 16:56:00.259488 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfbcr\" (UniqueName: \"kubernetes.io/projected/cb680de7-217b-493c-89c3-81f0b3e801fb-kube-api-access-vfbcr\") pod \"auto-csr-approver-29565656-p46j9\" (UID: \"cb680de7-217b-493c-89c3-81f0b3e801fb\") " pod="openshift-infra/auto-csr-approver-29565656-p46j9" Mar 19 16:56:00 crc kubenswrapper[4918]: I0319 16:56:00.284029 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfbcr\" (UniqueName: \"kubernetes.io/projected/cb680de7-217b-493c-89c3-81f0b3e801fb-kube-api-access-vfbcr\") pod \"auto-csr-approver-29565656-p46j9\" (UID: \"cb680de7-217b-493c-89c3-81f0b3e801fb\") " 
pod="openshift-infra/auto-csr-approver-29565656-p46j9" Mar 19 16:56:00 crc kubenswrapper[4918]: I0319 16:56:00.458685 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565656-p46j9" Mar 19 16:56:00 crc kubenswrapper[4918]: I0319 16:56:00.928701 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565656-p46j9"] Mar 19 16:56:00 crc kubenswrapper[4918]: I0319 16:56:00.966811 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565656-p46j9" event={"ID":"cb680de7-217b-493c-89c3-81f0b3e801fb","Type":"ContainerStarted","Data":"8897206d063ced23985ee839ab454e3d0212aa5106c5c667138414684b933a10"} Mar 19 16:56:02 crc kubenswrapper[4918]: I0319 16:56:02.988229 4918 generic.go:334] "Generic (PLEG): container finished" podID="cb680de7-217b-493c-89c3-81f0b3e801fb" containerID="fcbc594eafaba70bde98144b561c69eddfd531a5c6fba68f94194ebd03008415" exitCode=0 Mar 19 16:56:02 crc kubenswrapper[4918]: I0319 16:56:02.988327 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565656-p46j9" event={"ID":"cb680de7-217b-493c-89c3-81f0b3e801fb","Type":"ContainerDied","Data":"fcbc594eafaba70bde98144b561c69eddfd531a5c6fba68f94194ebd03008415"} Mar 19 16:56:04 crc kubenswrapper[4918]: I0319 16:56:04.308100 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565656-p46j9" Mar 19 16:56:04 crc kubenswrapper[4918]: I0319 16:56:04.320121 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfbcr\" (UniqueName: \"kubernetes.io/projected/cb680de7-217b-493c-89c3-81f0b3e801fb-kube-api-access-vfbcr\") pod \"cb680de7-217b-493c-89c3-81f0b3e801fb\" (UID: \"cb680de7-217b-493c-89c3-81f0b3e801fb\") " Mar 19 16:56:04 crc kubenswrapper[4918]: I0319 16:56:04.340712 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb680de7-217b-493c-89c3-81f0b3e801fb-kube-api-access-vfbcr" (OuterVolumeSpecName: "kube-api-access-vfbcr") pod "cb680de7-217b-493c-89c3-81f0b3e801fb" (UID: "cb680de7-217b-493c-89c3-81f0b3e801fb"). InnerVolumeSpecName "kube-api-access-vfbcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:56:04 crc kubenswrapper[4918]: I0319 16:56:04.421590 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfbcr\" (UniqueName: \"kubernetes.io/projected/cb680de7-217b-493c-89c3-81f0b3e801fb-kube-api-access-vfbcr\") on node \"crc\" DevicePath \"\"" Mar 19 16:56:05 crc kubenswrapper[4918]: I0319 16:56:05.011356 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565656-p46j9" event={"ID":"cb680de7-217b-493c-89c3-81f0b3e801fb","Type":"ContainerDied","Data":"8897206d063ced23985ee839ab454e3d0212aa5106c5c667138414684b933a10"} Mar 19 16:56:05 crc kubenswrapper[4918]: I0319 16:56:05.011393 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8897206d063ced23985ee839ab454e3d0212aa5106c5c667138414684b933a10" Mar 19 16:56:05 crc kubenswrapper[4918]: I0319 16:56:05.011509 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565656-p46j9" Mar 19 16:56:05 crc kubenswrapper[4918]: I0319 16:56:05.126503 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-qg8xl" Mar 19 16:56:05 crc kubenswrapper[4918]: I0319 16:56:05.126754 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-qg8xl" Mar 19 16:56:05 crc kubenswrapper[4918]: I0319 16:56:05.162650 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-qg8xl" Mar 19 16:56:05 crc kubenswrapper[4918]: I0319 16:56:05.383816 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565650-qv7bd"] Mar 19 16:56:05 crc kubenswrapper[4918]: I0319 16:56:05.392816 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565650-qv7bd"] Mar 19 16:56:06 crc kubenswrapper[4918]: I0319 16:56:06.047975 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-qg8xl" Mar 19 16:56:06 crc kubenswrapper[4918]: I0319 16:56:06.597032 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec881118-0ccc-40c5-aecd-4e1fb8691288" path="/var/lib/kubelet/pods/ec881118-0ccc-40c5-aecd-4e1fb8691288/volumes" Mar 19 16:56:13 crc kubenswrapper[4918]: I0319 16:56:13.784874 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w"] Mar 19 16:56:13 crc kubenswrapper[4918]: E0319 16:56:13.785649 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb680de7-217b-493c-89c3-81f0b3e801fb" containerName="oc" Mar 19 16:56:13 crc kubenswrapper[4918]: I0319 16:56:13.785664 4918 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cb680de7-217b-493c-89c3-81f0b3e801fb" containerName="oc" Mar 19 16:56:13 crc kubenswrapper[4918]: I0319 16:56:13.785947 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb680de7-217b-493c-89c3-81f0b3e801fb" containerName="oc" Mar 19 16:56:13 crc kubenswrapper[4918]: I0319 16:56:13.788006 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w" Mar 19 16:56:13 crc kubenswrapper[4918]: I0319 16:56:13.790916 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-ct9mq" Mar 19 16:56:13 crc kubenswrapper[4918]: I0319 16:56:13.798582 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w"] Mar 19 16:56:13 crc kubenswrapper[4918]: I0319 16:56:13.987961 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e13a0a8b-c7bc-4065-9a26-86034a00f0ac-util\") pod \"1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w\" (UID: \"e13a0a8b-c7bc-4065-9a26-86034a00f0ac\") " pod="openstack-operators/1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w" Mar 19 16:56:13 crc kubenswrapper[4918]: I0319 16:56:13.988041 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml9r2\" (UniqueName: \"kubernetes.io/projected/e13a0a8b-c7bc-4065-9a26-86034a00f0ac-kube-api-access-ml9r2\") pod \"1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w\" (UID: \"e13a0a8b-c7bc-4065-9a26-86034a00f0ac\") " pod="openstack-operators/1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w" Mar 19 16:56:13 crc kubenswrapper[4918]: I0319 16:56:13.988111 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e13a0a8b-c7bc-4065-9a26-86034a00f0ac-bundle\") pod \"1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w\" (UID: \"e13a0a8b-c7bc-4065-9a26-86034a00f0ac\") " pod="openstack-operators/1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w" Mar 19 16:56:14 crc kubenswrapper[4918]: I0319 16:56:14.089656 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e13a0a8b-c7bc-4065-9a26-86034a00f0ac-util\") pod \"1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w\" (UID: \"e13a0a8b-c7bc-4065-9a26-86034a00f0ac\") " pod="openstack-operators/1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w" Mar 19 16:56:14 crc kubenswrapper[4918]: I0319 16:56:14.089777 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml9r2\" (UniqueName: \"kubernetes.io/projected/e13a0a8b-c7bc-4065-9a26-86034a00f0ac-kube-api-access-ml9r2\") pod \"1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w\" (UID: \"e13a0a8b-c7bc-4065-9a26-86034a00f0ac\") " pod="openstack-operators/1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w" Mar 19 16:56:14 crc kubenswrapper[4918]: I0319 16:56:14.089858 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e13a0a8b-c7bc-4065-9a26-86034a00f0ac-bundle\") pod \"1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w\" (UID: \"e13a0a8b-c7bc-4065-9a26-86034a00f0ac\") " pod="openstack-operators/1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w" Mar 19 16:56:14 crc kubenswrapper[4918]: I0319 16:56:14.090372 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e13a0a8b-c7bc-4065-9a26-86034a00f0ac-util\") pod 
\"1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w\" (UID: \"e13a0a8b-c7bc-4065-9a26-86034a00f0ac\") " pod="openstack-operators/1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w" Mar 19 16:56:14 crc kubenswrapper[4918]: I0319 16:56:14.090471 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e13a0a8b-c7bc-4065-9a26-86034a00f0ac-bundle\") pod \"1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w\" (UID: \"e13a0a8b-c7bc-4065-9a26-86034a00f0ac\") " pod="openstack-operators/1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w" Mar 19 16:56:14 crc kubenswrapper[4918]: I0319 16:56:14.124413 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml9r2\" (UniqueName: \"kubernetes.io/projected/e13a0a8b-c7bc-4065-9a26-86034a00f0ac-kube-api-access-ml9r2\") pod \"1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w\" (UID: \"e13a0a8b-c7bc-4065-9a26-86034a00f0ac\") " pod="openstack-operators/1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w" Mar 19 16:56:14 crc kubenswrapper[4918]: I0319 16:56:14.410750 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w" Mar 19 16:56:14 crc kubenswrapper[4918]: I0319 16:56:14.724597 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w"] Mar 19 16:56:14 crc kubenswrapper[4918]: W0319 16:56:14.734466 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode13a0a8b_c7bc_4065_9a26_86034a00f0ac.slice/crio-83b04499eda92ad0a0ab0332f6b1eaff1385f6d4ad20a18359c4be1a23dcc92c WatchSource:0}: Error finding container 83b04499eda92ad0a0ab0332f6b1eaff1385f6d4ad20a18359c4be1a23dcc92c: Status 404 returned error can't find the container with id 83b04499eda92ad0a0ab0332f6b1eaff1385f6d4ad20a18359c4be1a23dcc92c Mar 19 16:56:15 crc kubenswrapper[4918]: I0319 16:56:15.086685 4918 generic.go:334] "Generic (PLEG): container finished" podID="e13a0a8b-c7bc-4065-9a26-86034a00f0ac" containerID="d815ce4ac83ab943b3f04da6847c4d9f3aa06f1fa282cd4fafdd97e426c9a04f" exitCode=0 Mar 19 16:56:15 crc kubenswrapper[4918]: I0319 16:56:15.086761 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w" event={"ID":"e13a0a8b-c7bc-4065-9a26-86034a00f0ac","Type":"ContainerDied","Data":"d815ce4ac83ab943b3f04da6847c4d9f3aa06f1fa282cd4fafdd97e426c9a04f"} Mar 19 16:56:15 crc kubenswrapper[4918]: I0319 16:56:15.086981 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w" event={"ID":"e13a0a8b-c7bc-4065-9a26-86034a00f0ac","Type":"ContainerStarted","Data":"83b04499eda92ad0a0ab0332f6b1eaff1385f6d4ad20a18359c4be1a23dcc92c"} Mar 19 16:56:16 crc kubenswrapper[4918]: I0319 16:56:16.098488 4918 generic.go:334] "Generic (PLEG): container finished" 
podID="e13a0a8b-c7bc-4065-9a26-86034a00f0ac" containerID="b6f7dab68f86f71497afd3643ba5f3dec11be96846ddd15ab5986e7c7fde94bf" exitCode=0 Mar 19 16:56:16 crc kubenswrapper[4918]: I0319 16:56:16.098557 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w" event={"ID":"e13a0a8b-c7bc-4065-9a26-86034a00f0ac","Type":"ContainerDied","Data":"b6f7dab68f86f71497afd3643ba5f3dec11be96846ddd15ab5986e7c7fde94bf"} Mar 19 16:56:17 crc kubenswrapper[4918]: I0319 16:56:17.111105 4918 generic.go:334] "Generic (PLEG): container finished" podID="e13a0a8b-c7bc-4065-9a26-86034a00f0ac" containerID="fdec0b1a6556563994bf10110751af6f279fecd5f67b4f17e57884f8b28cc26d" exitCode=0 Mar 19 16:56:17 crc kubenswrapper[4918]: I0319 16:56:17.111165 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w" event={"ID":"e13a0a8b-c7bc-4065-9a26-86034a00f0ac","Type":"ContainerDied","Data":"fdec0b1a6556563994bf10110751af6f279fecd5f67b4f17e57884f8b28cc26d"} Mar 19 16:56:18 crc kubenswrapper[4918]: I0319 16:56:18.473846 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w" Mar 19 16:56:18 crc kubenswrapper[4918]: I0319 16:56:18.599128 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ml9r2\" (UniqueName: \"kubernetes.io/projected/e13a0a8b-c7bc-4065-9a26-86034a00f0ac-kube-api-access-ml9r2\") pod \"e13a0a8b-c7bc-4065-9a26-86034a00f0ac\" (UID: \"e13a0a8b-c7bc-4065-9a26-86034a00f0ac\") " Mar 19 16:56:18 crc kubenswrapper[4918]: I0319 16:56:18.599367 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e13a0a8b-c7bc-4065-9a26-86034a00f0ac-bundle\") pod \"e13a0a8b-c7bc-4065-9a26-86034a00f0ac\" (UID: \"e13a0a8b-c7bc-4065-9a26-86034a00f0ac\") " Mar 19 16:56:18 crc kubenswrapper[4918]: I0319 16:56:18.599452 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e13a0a8b-c7bc-4065-9a26-86034a00f0ac-util\") pod \"e13a0a8b-c7bc-4065-9a26-86034a00f0ac\" (UID: \"e13a0a8b-c7bc-4065-9a26-86034a00f0ac\") " Mar 19 16:56:18 crc kubenswrapper[4918]: I0319 16:56:18.600723 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e13a0a8b-c7bc-4065-9a26-86034a00f0ac-bundle" (OuterVolumeSpecName: "bundle") pod "e13a0a8b-c7bc-4065-9a26-86034a00f0ac" (UID: "e13a0a8b-c7bc-4065-9a26-86034a00f0ac"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:56:18 crc kubenswrapper[4918]: I0319 16:56:18.605692 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e13a0a8b-c7bc-4065-9a26-86034a00f0ac-kube-api-access-ml9r2" (OuterVolumeSpecName: "kube-api-access-ml9r2") pod "e13a0a8b-c7bc-4065-9a26-86034a00f0ac" (UID: "e13a0a8b-c7bc-4065-9a26-86034a00f0ac"). InnerVolumeSpecName "kube-api-access-ml9r2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:56:18 crc kubenswrapper[4918]: I0319 16:56:18.614138 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e13a0a8b-c7bc-4065-9a26-86034a00f0ac-util" (OuterVolumeSpecName: "util") pod "e13a0a8b-c7bc-4065-9a26-86034a00f0ac" (UID: "e13a0a8b-c7bc-4065-9a26-86034a00f0ac"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:56:18 crc kubenswrapper[4918]: I0319 16:56:18.701762 4918 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e13a0a8b-c7bc-4065-9a26-86034a00f0ac-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 16:56:18 crc kubenswrapper[4918]: I0319 16:56:18.701820 4918 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e13a0a8b-c7bc-4065-9a26-86034a00f0ac-util\") on node \"crc\" DevicePath \"\"" Mar 19 16:56:18 crc kubenswrapper[4918]: I0319 16:56:18.701842 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ml9r2\" (UniqueName: \"kubernetes.io/projected/e13a0a8b-c7bc-4065-9a26-86034a00f0ac-kube-api-access-ml9r2\") on node \"crc\" DevicePath \"\"" Mar 19 16:56:19 crc kubenswrapper[4918]: I0319 16:56:19.128949 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w" Mar 19 16:56:19 crc kubenswrapper[4918]: I0319 16:56:19.128850 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w" event={"ID":"e13a0a8b-c7bc-4065-9a26-86034a00f0ac","Type":"ContainerDied","Data":"83b04499eda92ad0a0ab0332f6b1eaff1385f6d4ad20a18359c4be1a23dcc92c"} Mar 19 16:56:19 crc kubenswrapper[4918]: I0319 16:56:19.132744 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83b04499eda92ad0a0ab0332f6b1eaff1385f6d4ad20a18359c4be1a23dcc92c" Mar 19 16:56:26 crc kubenswrapper[4918]: I0319 16:56:26.385613 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-7658474f4d-mb9mz"] Mar 19 16:56:26 crc kubenswrapper[4918]: E0319 16:56:26.386440 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e13a0a8b-c7bc-4065-9a26-86034a00f0ac" containerName="util" Mar 19 16:56:26 crc kubenswrapper[4918]: I0319 16:56:26.386455 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="e13a0a8b-c7bc-4065-9a26-86034a00f0ac" containerName="util" Mar 19 16:56:26 crc kubenswrapper[4918]: E0319 16:56:26.386470 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e13a0a8b-c7bc-4065-9a26-86034a00f0ac" containerName="extract" Mar 19 16:56:26 crc kubenswrapper[4918]: I0319 16:56:26.386477 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="e13a0a8b-c7bc-4065-9a26-86034a00f0ac" containerName="extract" Mar 19 16:56:26 crc kubenswrapper[4918]: E0319 16:56:26.386494 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e13a0a8b-c7bc-4065-9a26-86034a00f0ac" containerName="pull" Mar 19 16:56:26 crc kubenswrapper[4918]: I0319 16:56:26.386502 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="e13a0a8b-c7bc-4065-9a26-86034a00f0ac" 
containerName="pull" Mar 19 16:56:26 crc kubenswrapper[4918]: I0319 16:56:26.386702 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="e13a0a8b-c7bc-4065-9a26-86034a00f0ac" containerName="extract" Mar 19 16:56:26 crc kubenswrapper[4918]: I0319 16:56:26.387296 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7658474f4d-mb9mz" Mar 19 16:56:26 crc kubenswrapper[4918]: I0319 16:56:26.389328 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-6hzw7" Mar 19 16:56:26 crc kubenswrapper[4918]: I0319 16:56:26.423672 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7658474f4d-mb9mz"] Mar 19 16:56:26 crc kubenswrapper[4918]: I0319 16:56:26.552251 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhzms\" (UniqueName: \"kubernetes.io/projected/13ca5024-984f-45b0-8235-f41f91664ef9-kube-api-access-jhzms\") pod \"openstack-operator-controller-init-7658474f4d-mb9mz\" (UID: \"13ca5024-984f-45b0-8235-f41f91664ef9\") " pod="openstack-operators/openstack-operator-controller-init-7658474f4d-mb9mz" Mar 19 16:56:26 crc kubenswrapper[4918]: I0319 16:56:26.653554 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhzms\" (UniqueName: \"kubernetes.io/projected/13ca5024-984f-45b0-8235-f41f91664ef9-kube-api-access-jhzms\") pod \"openstack-operator-controller-init-7658474f4d-mb9mz\" (UID: \"13ca5024-984f-45b0-8235-f41f91664ef9\") " pod="openstack-operators/openstack-operator-controller-init-7658474f4d-mb9mz" Mar 19 16:56:26 crc kubenswrapper[4918]: I0319 16:56:26.675723 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhzms\" (UniqueName: 
\"kubernetes.io/projected/13ca5024-984f-45b0-8235-f41f91664ef9-kube-api-access-jhzms\") pod \"openstack-operator-controller-init-7658474f4d-mb9mz\" (UID: \"13ca5024-984f-45b0-8235-f41f91664ef9\") " pod="openstack-operators/openstack-operator-controller-init-7658474f4d-mb9mz" Mar 19 16:56:26 crc kubenswrapper[4918]: I0319 16:56:26.707263 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7658474f4d-mb9mz" Mar 19 16:56:27 crc kubenswrapper[4918]: I0319 16:56:27.025373 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7658474f4d-mb9mz"] Mar 19 16:56:27 crc kubenswrapper[4918]: I0319 16:56:27.203973 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7658474f4d-mb9mz" event={"ID":"13ca5024-984f-45b0-8235-f41f91664ef9","Type":"ContainerStarted","Data":"0d9167b53adbc77f84f4681fce57cb6796568f86089b2a80dacb3ffc1ad30707"} Mar 19 16:56:28 crc kubenswrapper[4918]: I0319 16:56:28.212101 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 16:56:28 crc kubenswrapper[4918]: I0319 16:56:28.212149 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 16:56:31 crc kubenswrapper[4918]: I0319 16:56:31.233643 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7658474f4d-mb9mz" 
event={"ID":"13ca5024-984f-45b0-8235-f41f91664ef9","Type":"ContainerStarted","Data":"9d03a40abbf0dffa73564016b8871c6accababcbca2c70dd83e4d17b3a8ade0f"} Mar 19 16:56:31 crc kubenswrapper[4918]: I0319 16:56:31.234285 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-7658474f4d-mb9mz" Mar 19 16:56:31 crc kubenswrapper[4918]: I0319 16:56:31.273196 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-7658474f4d-mb9mz" podStartSLOduration=1.421766624 podStartE2EDuration="5.273173716s" podCreationTimestamp="2026-03-19 16:56:26 +0000 UTC" firstStartedPulling="2026-03-19 16:56:27.037354163 +0000 UTC m=+999.159553421" lastFinishedPulling="2026-03-19 16:56:30.888761255 +0000 UTC m=+1003.010960513" observedRunningTime="2026-03-19 16:56:31.269195267 +0000 UTC m=+1003.391394525" watchObservedRunningTime="2026-03-19 16:56:31.273173716 +0000 UTC m=+1003.395372974" Mar 19 16:56:36 crc kubenswrapper[4918]: I0319 16:56:36.713882 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-7658474f4d-mb9mz" Mar 19 16:56:50 crc kubenswrapper[4918]: I0319 16:56:50.369748 4918 scope.go:117] "RemoveContainer" containerID="31a9104c699d65221949aa6a3dddec1ec55176353c8887000abb5b39d8efa122" Mar 19 16:56:58 crc kubenswrapper[4918]: I0319 16:56:58.212313 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 16:56:58 crc kubenswrapper[4918]: I0319 16:56:58.212940 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" 
podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 16:56:58 crc kubenswrapper[4918]: I0319 16:56:58.213001 4918 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 16:56:58 crc kubenswrapper[4918]: I0319 16:56:58.213785 4918 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ef900c9cacbbbaa6c19a9d710e828147883364fff3c1249c3116a090d326556c"} pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 16:56:58 crc kubenswrapper[4918]: I0319 16:56:58.213849 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" containerID="cri-o://ef900c9cacbbbaa6c19a9d710e828147883364fff3c1249c3116a090d326556c" gracePeriod=600 Mar 19 16:56:58 crc kubenswrapper[4918]: I0319 16:56:58.447324 4918 generic.go:334] "Generic (PLEG): container finished" podID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerID="ef900c9cacbbbaa6c19a9d710e828147883364fff3c1249c3116a090d326556c" exitCode=0 Mar 19 16:56:58 crc kubenswrapper[4918]: I0319 16:56:58.447404 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerDied","Data":"ef900c9cacbbbaa6c19a9d710e828147883364fff3c1249c3116a090d326556c"} Mar 19 16:56:58 crc kubenswrapper[4918]: I0319 16:56:58.447682 4918 scope.go:117] "RemoveContainer" 
containerID="565e47777606eec4fea1871e422ec1703bb3c3550c00f538a28da566b1063407" Mar 19 16:56:59 crc kubenswrapper[4918]: I0319 16:56:59.455207 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerStarted","Data":"d175bcf8fa8bff1bfb04d3a219eb7c4c6847a1adae22fbf62149bc4b8894f0f0"} Mar 19 16:57:12 crc kubenswrapper[4918]: I0319 16:57:12.943201 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-wg98k"] Mar 19 16:57:12 crc kubenswrapper[4918]: I0319 16:57:12.944416 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-wg98k" Mar 19 16:57:12 crc kubenswrapper[4918]: I0319 16:57:12.946392 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-vrnw2" Mar 19 16:57:12 crc kubenswrapper[4918]: I0319 16:57:12.962052 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-w78t5"] Mar 19 16:57:12 crc kubenswrapper[4918]: I0319 16:57:12.963009 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-w78t5" Mar 19 16:57:12 crc kubenswrapper[4918]: I0319 16:57:12.965233 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-9vxbb" Mar 19 16:57:12 crc kubenswrapper[4918]: I0319 16:57:12.976270 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-nmkqj"] Mar 19 16:57:12 crc kubenswrapper[4918]: I0319 16:57:12.977375 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-nmkqj" Mar 19 16:57:12 crc kubenswrapper[4918]: I0319 16:57:12.980010 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-5tln2" Mar 19 16:57:12 crc kubenswrapper[4918]: I0319 16:57:12.982810 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-wg98k"] Mar 19 16:57:12 crc kubenswrapper[4918]: I0319 16:57:12.990345 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-nmkqj"] Mar 19 16:57:12 crc kubenswrapper[4918]: I0319 16:57:12.995615 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-w78t5"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.000431 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-ql64p"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.001272 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-ql64p" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.003668 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-htnml" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.011444 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-ql64p"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.039322 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-xl6zs"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.040417 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-xl6zs" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.046317 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-gplqc" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.046343 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-k2f4b"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.049485 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-k2f4b" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.056816 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-78gkx" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.089932 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-msx8p"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.090888 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-msx8p" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.109856 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-xl6zs"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.116514 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-jhtwc" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.116723 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.122061 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-k2f4b"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.126550 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt7ls\" (UniqueName: \"kubernetes.io/projected/b059bb35-4870-4982-b62f-e70ffd0270d2-kube-api-access-gt7ls\") pod \"cinder-operator-controller-manager-8d58dc466-nmkqj\" (UID: \"b059bb35-4870-4982-b62f-e70ffd0270d2\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-nmkqj" Mar 19 16:57:13 
crc kubenswrapper[4918]: I0319 16:57:13.126948 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd5gb\" (UniqueName: \"kubernetes.io/projected/2644d9c5-c386-4d63-9cf9-7517f4fd6cb0-kube-api-access-hd5gb\") pod \"heat-operator-controller-manager-67dd5f86f5-xl6zs\" (UID: \"2644d9c5-c386-4d63-9cf9-7517f4fd6cb0\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-xl6zs" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.127044 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjv5j\" (UniqueName: \"kubernetes.io/projected/e8c75c9e-0913-485c-a5fe-9c9bf6e4bc53-kube-api-access-jjv5j\") pod \"horizon-operator-controller-manager-8464cc45fb-k2f4b\" (UID: \"e8c75c9e-0913-485c-a5fe-9c9bf6e4bc53\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-k2f4b" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.127151 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2flp\" (UniqueName: \"kubernetes.io/projected/cf3083b2-86ce-4d01-97c5-9005f683ff62-kube-api-access-h2flp\") pod \"glance-operator-controller-manager-79df6bcc97-ql64p\" (UID: \"cf3083b2-86ce-4d01-97c5-9005f683ff62\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-ql64p" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.127231 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwmcd\" (UniqueName: \"kubernetes.io/projected/3d624150-7673-43db-b503-ec532c7c00ca-kube-api-access-wwmcd\") pod \"infra-operator-controller-manager-7b9c774f96-msx8p\" (UID: \"3d624150-7673-43db-b503-ec532c7c00ca\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-msx8p" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.127320 4918 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj59x\" (UniqueName: \"kubernetes.io/projected/e9f74e44-e78b-4b23-b409-89af31c2dc82-kube-api-access-bj59x\") pod \"designate-operator-controller-manager-588d4d986b-w78t5\" (UID: \"e9f74e44-e78b-4b23-b409-89af31c2dc82\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-w78t5" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.127560 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bcz2\" (UniqueName: \"kubernetes.io/projected/4ed700e4-0a35-4c6c-b57a-cde49d5f816c-kube-api-access-6bcz2\") pod \"barbican-operator-controller-manager-59bc569d95-wg98k\" (UID: \"4ed700e4-0a35-4c6c-b57a-cde49d5f816c\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-wg98k" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.128476 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d624150-7673-43db-b503-ec532c7c00ca-cert\") pod \"infra-operator-controller-manager-7b9c774f96-msx8p\" (UID: \"3d624150-7673-43db-b503-ec532c7c00ca\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-msx8p" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.140615 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-msx8p"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.152046 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-8tqxl"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.153018 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8tqxl" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.155579 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-xpglc" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.188123 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-8b6nq"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.189324 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-8b6nq" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.195640 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-twm8r" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.198339 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-sgwx5"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.199330 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-sgwx5" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.203101 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-zjb4d" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.220660 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-8tqxl"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.229327 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjv5j\" (UniqueName: \"kubernetes.io/projected/e8c75c9e-0913-485c-a5fe-9c9bf6e4bc53-kube-api-access-jjv5j\") pod \"horizon-operator-controller-manager-8464cc45fb-k2f4b\" (UID: \"e8c75c9e-0913-485c-a5fe-9c9bf6e4bc53\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-k2f4b" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.229399 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2flp\" (UniqueName: \"kubernetes.io/projected/cf3083b2-86ce-4d01-97c5-9005f683ff62-kube-api-access-h2flp\") pod \"glance-operator-controller-manager-79df6bcc97-ql64p\" (UID: \"cf3083b2-86ce-4d01-97c5-9005f683ff62\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-ql64p" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.229427 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwmcd\" (UniqueName: \"kubernetes.io/projected/3d624150-7673-43db-b503-ec532c7c00ca-kube-api-access-wwmcd\") pod \"infra-operator-controller-manager-7b9c774f96-msx8p\" (UID: \"3d624150-7673-43db-b503-ec532c7c00ca\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-msx8p" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.229455 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bj59x\" (UniqueName: \"kubernetes.io/projected/e9f74e44-e78b-4b23-b409-89af31c2dc82-kube-api-access-bj59x\") pod \"designate-operator-controller-manager-588d4d986b-w78t5\" (UID: \"e9f74e44-e78b-4b23-b409-89af31c2dc82\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-w78t5" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.229486 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bcz2\" (UniqueName: \"kubernetes.io/projected/4ed700e4-0a35-4c6c-b57a-cde49d5f816c-kube-api-access-6bcz2\") pod \"barbican-operator-controller-manager-59bc569d95-wg98k\" (UID: \"4ed700e4-0a35-4c6c-b57a-cde49d5f816c\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-wg98k" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.229538 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d624150-7673-43db-b503-ec532c7c00ca-cert\") pod \"infra-operator-controller-manager-7b9c774f96-msx8p\" (UID: \"3d624150-7673-43db-b503-ec532c7c00ca\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-msx8p" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.229576 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skt2s\" (UniqueName: \"kubernetes.io/projected/64edd3c9-61ea-4fc0-9a74-95a27c4bffc9-kube-api-access-skt2s\") pod \"ironic-operator-controller-manager-6f787dddc9-8tqxl\" (UID: \"64edd3c9-61ea-4fc0-9a74-95a27c4bffc9\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8tqxl" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.229607 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt7ls\" (UniqueName: 
\"kubernetes.io/projected/b059bb35-4870-4982-b62f-e70ffd0270d2-kube-api-access-gt7ls\") pod \"cinder-operator-controller-manager-8d58dc466-nmkqj\" (UID: \"b059bb35-4870-4982-b62f-e70ffd0270d2\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-nmkqj" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.229633 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd5gb\" (UniqueName: \"kubernetes.io/projected/2644d9c5-c386-4d63-9cf9-7517f4fd6cb0-kube-api-access-hd5gb\") pod \"heat-operator-controller-manager-67dd5f86f5-xl6zs\" (UID: \"2644d9c5-c386-4d63-9cf9-7517f4fd6cb0\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-xl6zs" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.229663 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2bz7\" (UniqueName: \"kubernetes.io/projected/34d2b03c-e63e-425a-a48a-6c9c97508add-kube-api-access-l2bz7\") pod \"manila-operator-controller-manager-55f864c847-sgwx5\" (UID: \"34d2b03c-e63e-425a-a48a-6c9c97508add\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-sgwx5" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.229689 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qdfj\" (UniqueName: \"kubernetes.io/projected/b8633109-a56d-4535-b603-f75c257cb093-kube-api-access-4qdfj\") pod \"keystone-operator-controller-manager-768b96df4c-8b6nq\" (UID: \"b8633109-a56d-4535-b603-f75c257cb093\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-8b6nq" Mar 19 16:57:13 crc kubenswrapper[4918]: E0319 16:57:13.229803 4918 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 16:57:13 crc kubenswrapper[4918]: E0319 16:57:13.229864 4918 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d624150-7673-43db-b503-ec532c7c00ca-cert podName:3d624150-7673-43db-b503-ec532c7c00ca nodeName:}" failed. No retries permitted until 2026-03-19 16:57:13.729843742 +0000 UTC m=+1045.852043070 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d624150-7673-43db-b503-ec532c7c00ca-cert") pod "infra-operator-controller-manager-7b9c774f96-msx8p" (UID: "3d624150-7673-43db-b503-ec532c7c00ca") : secret "infra-operator-webhook-server-cert" not found Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.248893 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-8b6nq"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.261459 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5dbck"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.262613 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5dbck" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.270507 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-55f2d" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.274772 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-sgwx5"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.304622 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2flp\" (UniqueName: \"kubernetes.io/projected/cf3083b2-86ce-4d01-97c5-9005f683ff62-kube-api-access-h2flp\") pod \"glance-operator-controller-manager-79df6bcc97-ql64p\" (UID: \"cf3083b2-86ce-4d01-97c5-9005f683ff62\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-ql64p" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.309726 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjv5j\" (UniqueName: \"kubernetes.io/projected/e8c75c9e-0913-485c-a5fe-9c9bf6e4bc53-kube-api-access-jjv5j\") pod \"horizon-operator-controller-manager-8464cc45fb-k2f4b\" (UID: \"e8c75c9e-0913-485c-a5fe-9c9bf6e4bc53\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-k2f4b" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.311465 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bcz2\" (UniqueName: \"kubernetes.io/projected/4ed700e4-0a35-4c6c-b57a-cde49d5f816c-kube-api-access-6bcz2\") pod \"barbican-operator-controller-manager-59bc569d95-wg98k\" (UID: \"4ed700e4-0a35-4c6c-b57a-cde49d5f816c\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-wg98k" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.312383 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-wwmcd\" (UniqueName: \"kubernetes.io/projected/3d624150-7673-43db-b503-ec532c7c00ca-kube-api-access-wwmcd\") pod \"infra-operator-controller-manager-7b9c774f96-msx8p\" (UID: \"3d624150-7673-43db-b503-ec532c7c00ca\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-msx8p" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.313752 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt7ls\" (UniqueName: \"kubernetes.io/projected/b059bb35-4870-4982-b62f-e70ffd0270d2-kube-api-access-gt7ls\") pod \"cinder-operator-controller-manager-8d58dc466-nmkqj\" (UID: \"b059bb35-4870-4982-b62f-e70ffd0270d2\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-nmkqj" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.318734 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd5gb\" (UniqueName: \"kubernetes.io/projected/2644d9c5-c386-4d63-9cf9-7517f4fd6cb0-kube-api-access-hd5gb\") pod \"heat-operator-controller-manager-67dd5f86f5-xl6zs\" (UID: \"2644d9c5-c386-4d63-9cf9-7517f4fd6cb0\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-xl6zs" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.333883 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-ql64p" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.335071 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skt2s\" (UniqueName: \"kubernetes.io/projected/64edd3c9-61ea-4fc0-9a74-95a27c4bffc9-kube-api-access-skt2s\") pod \"ironic-operator-controller-manager-6f787dddc9-8tqxl\" (UID: \"64edd3c9-61ea-4fc0-9a74-95a27c4bffc9\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8tqxl" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.335110 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2bz7\" (UniqueName: \"kubernetes.io/projected/34d2b03c-e63e-425a-a48a-6c9c97508add-kube-api-access-l2bz7\") pod \"manila-operator-controller-manager-55f864c847-sgwx5\" (UID: \"34d2b03c-e63e-425a-a48a-6c9c97508add\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-sgwx5" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.335138 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qdfj\" (UniqueName: \"kubernetes.io/projected/b8633109-a56d-4535-b603-f75c257cb093-kube-api-access-4qdfj\") pod \"keystone-operator-controller-manager-768b96df4c-8b6nq\" (UID: \"b8633109-a56d-4535-b603-f75c257cb093\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-8b6nq" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.335140 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj59x\" (UniqueName: \"kubernetes.io/projected/e9f74e44-e78b-4b23-b409-89af31c2dc82-kube-api-access-bj59x\") pod \"designate-operator-controller-manager-588d4d986b-w78t5\" (UID: \"e9f74e44-e78b-4b23-b409-89af31c2dc82\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-w78t5" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 
16:57:13.342709 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5dbck"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.367690 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-xl6zs" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.377374 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-k2f4b" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.381620 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-rkhtw"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.383637 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-rkhtw" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.386189 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-22z2x" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.396222 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-w7rjd"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.422716 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-w7rjd" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.432271 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qdfj\" (UniqueName: \"kubernetes.io/projected/b8633109-a56d-4535-b603-f75c257cb093-kube-api-access-4qdfj\") pod \"keystone-operator-controller-manager-768b96df4c-8b6nq\" (UID: \"b8633109-a56d-4535-b603-f75c257cb093\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-8b6nq" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.440391 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-87jz7" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.449878 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv2xz\" (UniqueName: \"kubernetes.io/projected/c2eafc06-6df4-440d-820f-aad17b6061d7-kube-api-access-mv2xz\") pod \"mariadb-operator-controller-manager-67ccfc9778-5dbck\" (UID: \"c2eafc06-6df4-440d-820f-aad17b6061d7\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5dbck" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.454964 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2bz7\" (UniqueName: \"kubernetes.io/projected/34d2b03c-e63e-425a-a48a-6c9c97508add-kube-api-access-l2bz7\") pod \"manila-operator-controller-manager-55f864c847-sgwx5\" (UID: \"34d2b03c-e63e-425a-a48a-6c9c97508add\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-sgwx5" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.478268 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skt2s\" (UniqueName: \"kubernetes.io/projected/64edd3c9-61ea-4fc0-9a74-95a27c4bffc9-kube-api-access-skt2s\") pod 
\"ironic-operator-controller-manager-6f787dddc9-8tqxl\" (UID: \"64edd3c9-61ea-4fc0-9a74-95a27c4bffc9\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8tqxl" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.542016 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-sgwx5" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.542741 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-8b6nq" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.551784 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv2xz\" (UniqueName: \"kubernetes.io/projected/c2eafc06-6df4-440d-820f-aad17b6061d7-kube-api-access-mv2xz\") pod \"mariadb-operator-controller-manager-67ccfc9778-5dbck\" (UID: \"c2eafc06-6df4-440d-820f-aad17b6061d7\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5dbck" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.552140 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7zh7\" (UniqueName: \"kubernetes.io/projected/230477fa-ce49-4e4d-a0a0-5bf2538c5192-kube-api-access-h7zh7\") pod \"neutron-operator-controller-manager-767865f676-rkhtw\" (UID: \"230477fa-ce49-4e4d-a0a0-5bf2538c5192\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-rkhtw" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.552321 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8lfk\" (UniqueName: \"kubernetes.io/projected/d9f56510-e25f-4de5-85b6-3030e989d13d-kube-api-access-t8lfk\") pod \"nova-operator-controller-manager-5d488d59fb-w7rjd\" (UID: \"d9f56510-e25f-4de5-85b6-3030e989d13d\") " 
pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-w7rjd" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.556839 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-rkhtw"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.572837 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-wg98k" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.579626 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv2xz\" (UniqueName: \"kubernetes.io/projected/c2eafc06-6df4-440d-820f-aad17b6061d7-kube-api-access-mv2xz\") pod \"mariadb-operator-controller-manager-67ccfc9778-5dbck\" (UID: \"c2eafc06-6df4-440d-820f-aad17b6061d7\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5dbck" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.586015 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-26r58"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.587277 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-26r58" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.595158 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-w78t5" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.602117 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-hgs5j" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.603077 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5dbck" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.606966 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-nmkqj" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.607141 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-w7rjd"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.624532 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-26r58"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.631943 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-c98pr"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.632908 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-c98pr" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.639752 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.639763 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-hl9jb" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.655908 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ghpt\" (UniqueName: \"kubernetes.io/projected/7be38652-3021-4349-be08-4759ee13141b-kube-api-access-7ghpt\") pod \"octavia-operator-controller-manager-5b9f45d989-26r58\" (UID: \"7be38652-3021-4349-be08-4759ee13141b\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-26r58" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.656156 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7zh7\" (UniqueName: \"kubernetes.io/projected/230477fa-ce49-4e4d-a0a0-5bf2538c5192-kube-api-access-h7zh7\") pod \"neutron-operator-controller-manager-767865f676-rkhtw\" (UID: \"230477fa-ce49-4e4d-a0a0-5bf2538c5192\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-rkhtw" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.656200 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8lfk\" (UniqueName: \"kubernetes.io/projected/d9f56510-e25f-4de5-85b6-3030e989d13d-kube-api-access-t8lfk\") pod \"nova-operator-controller-manager-5d488d59fb-w7rjd\" (UID: \"d9f56510-e25f-4de5-85b6-3030e989d13d\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-w7rjd" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.682102 4918 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8lfk\" (UniqueName: \"kubernetes.io/projected/d9f56510-e25f-4de5-85b6-3030e989d13d-kube-api-access-t8lfk\") pod \"nova-operator-controller-manager-5d488d59fb-w7rjd\" (UID: \"d9f56510-e25f-4de5-85b6-3030e989d13d\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-w7rjd" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.682457 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7zh7\" (UniqueName: \"kubernetes.io/projected/230477fa-ce49-4e4d-a0a0-5bf2538c5192-kube-api-access-h7zh7\") pod \"neutron-operator-controller-manager-767865f676-rkhtw\" (UID: \"230477fa-ce49-4e4d-a0a0-5bf2538c5192\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-rkhtw" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.682616 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-c98pr"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.714358 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-svspw"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.717341 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-svspw" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.720981 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-nlvkb" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.721957 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-rkhtw" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.736307 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-svspw"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.745191 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-6jtnt"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.746311 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6jtnt" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.749024 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-mm4pb" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.753357 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-8sxgk"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.754460 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-8sxgk" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.756055 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-cb8hb" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.758421 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnn9p\" (UniqueName: \"kubernetes.io/projected/d06a3c13-6323-4fef-9aec-101be98e242b-kube-api-access-qnn9p\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-c98pr\" (UID: \"d06a3c13-6323-4fef-9aec-101be98e242b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-c98pr" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.758500 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d624150-7673-43db-b503-ec532c7c00ca-cert\") pod \"infra-operator-controller-manager-7b9c774f96-msx8p\" (UID: \"3d624150-7673-43db-b503-ec532c7c00ca\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-msx8p" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.758587 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ghpt\" (UniqueName: \"kubernetes.io/projected/7be38652-3021-4349-be08-4759ee13141b-kube-api-access-7ghpt\") pod \"octavia-operator-controller-manager-5b9f45d989-26r58\" (UID: \"7be38652-3021-4349-be08-4759ee13141b\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-26r58" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.758616 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d06a3c13-6323-4fef-9aec-101be98e242b-cert\") pod 
\"openstack-baremetal-operator-controller-manager-89d64c458-c98pr\" (UID: \"d06a3c13-6323-4fef-9aec-101be98e242b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-c98pr" Mar 19 16:57:13 crc kubenswrapper[4918]: E0319 16:57:13.758802 4918 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 16:57:13 crc kubenswrapper[4918]: E0319 16:57:13.758853 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d624150-7673-43db-b503-ec532c7c00ca-cert podName:3d624150-7673-43db-b503-ec532c7c00ca nodeName:}" failed. No retries permitted until 2026-03-19 16:57:14.758835396 +0000 UTC m=+1046.881034634 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d624150-7673-43db-b503-ec532c7c00ca-cert") pod "infra-operator-controller-manager-7b9c774f96-msx8p" (UID: "3d624150-7673-43db-b503-ec532c7c00ca") : secret "infra-operator-webhook-server-cert" not found Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.771591 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-78877dc965-4vmdw"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.772698 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-78877dc965-4vmdw" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.774257 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8tqxl" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.777785 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-2dtmg" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.783813 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-6jtnt"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.794669 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-78877dc965-4vmdw"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.810850 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ghpt\" (UniqueName: \"kubernetes.io/projected/7be38652-3021-4349-be08-4759ee13141b-kube-api-access-7ghpt\") pod \"octavia-operator-controller-manager-5b9f45d989-26r58\" (UID: \"7be38652-3021-4349-be08-4759ee13141b\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-26r58" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.835157 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-9w6lf"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.836145 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-9w6lf" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.838660 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-r6pvr" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.844479 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-w7rjd" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.845549 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-9w6lf"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.853938 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-8sxgk"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.861535 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d06a3c13-6323-4fef-9aec-101be98e242b-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-c98pr\" (UID: \"d06a3c13-6323-4fef-9aec-101be98e242b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-c98pr" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.861591 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz5qq\" (UniqueName: \"kubernetes.io/projected/c77d9bca-3548-4c60-aa31-ad1a70dac2f1-kube-api-access-cz5qq\") pod \"telemetry-operator-controller-manager-78877dc965-4vmdw\" (UID: \"c77d9bca-3548-4c60-aa31-ad1a70dac2f1\") " pod="openstack-operators/telemetry-operator-controller-manager-78877dc965-4vmdw" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.861668 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnn9p\" (UniqueName: \"kubernetes.io/projected/d06a3c13-6323-4fef-9aec-101be98e242b-kube-api-access-qnn9p\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-c98pr\" (UID: \"d06a3c13-6323-4fef-9aec-101be98e242b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-c98pr" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.861694 4918 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgx6q\" (UniqueName: \"kubernetes.io/projected/f0824526-4903-49f0-bfcb-17298cc84eb6-kube-api-access-xgx6q\") pod \"placement-operator-controller-manager-5784578c99-8sxgk\" (UID: \"f0824526-4903-49f0-bfcb-17298cc84eb6\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-8sxgk" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.861906 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qb2q\" (UniqueName: \"kubernetes.io/projected/85790727-be18-4730-81e1-84022d4cead2-kube-api-access-4qb2q\") pod \"ovn-operator-controller-manager-884679f54-svspw\" (UID: \"85790727-be18-4730-81e1-84022d4cead2\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-svspw" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.861946 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpqc9\" (UniqueName: \"kubernetes.io/projected/6d031265-e265-412e-931e-52ca3bb940b6-kube-api-access-lpqc9\") pod \"swift-operator-controller-manager-c674c5965-6jtnt\" (UID: \"6d031265-e265-412e-931e-52ca3bb940b6\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-6jtnt" Mar 19 16:57:13 crc kubenswrapper[4918]: E0319 16:57:13.861982 4918 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 16:57:13 crc kubenswrapper[4918]: E0319 16:57:13.862141 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d06a3c13-6323-4fef-9aec-101be98e242b-cert podName:d06a3c13-6323-4fef-9aec-101be98e242b nodeName:}" failed. No retries permitted until 2026-03-19 16:57:14.362090279 +0000 UTC m=+1046.484289577 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d06a3c13-6323-4fef-9aec-101be98e242b-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-c98pr" (UID: "d06a3c13-6323-4fef-9aec-101be98e242b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.876010 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-fv2wb"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.877128 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-fv2wb" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.880147 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-glkrc" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.894758 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnn9p\" (UniqueName: \"kubernetes.io/projected/d06a3c13-6323-4fef-9aec-101be98e242b-kube-api-access-qnn9p\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-c98pr\" (UID: \"d06a3c13-6323-4fef-9aec-101be98e242b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-c98pr" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.902052 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-fv2wb"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.916446 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-26r58" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.942038 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.943785 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.947001 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-72t2b" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.947172 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.949118 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.949878 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8"] Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.966324 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgx6q\" (UniqueName: \"kubernetes.io/projected/f0824526-4903-49f0-bfcb-17298cc84eb6-kube-api-access-xgx6q\") pod \"placement-operator-controller-manager-5784578c99-8sxgk\" (UID: \"f0824526-4903-49f0-bfcb-17298cc84eb6\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-8sxgk" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.966460 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpqc9\" (UniqueName: 
\"kubernetes.io/projected/6d031265-e265-412e-931e-52ca3bb940b6-kube-api-access-lpqc9\") pod \"swift-operator-controller-manager-c674c5965-6jtnt\" (UID: \"6d031265-e265-412e-931e-52ca3bb940b6\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-6jtnt" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.966500 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qb2q\" (UniqueName: \"kubernetes.io/projected/85790727-be18-4730-81e1-84022d4cead2-kube-api-access-4qb2q\") pod \"ovn-operator-controller-manager-884679f54-svspw\" (UID: \"85790727-be18-4730-81e1-84022d4cead2\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-svspw" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.981351 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz5qq\" (UniqueName: \"kubernetes.io/projected/c77d9bca-3548-4c60-aa31-ad1a70dac2f1-kube-api-access-cz5qq\") pod \"telemetry-operator-controller-manager-78877dc965-4vmdw\" (UID: \"c77d9bca-3548-4c60-aa31-ad1a70dac2f1\") " pod="openstack-operators/telemetry-operator-controller-manager-78877dc965-4vmdw" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.981495 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkq7q\" (UniqueName: \"kubernetes.io/projected/35067b62-32eb-4cb2-8fbd-91b82c4a38cb-kube-api-access-jkq7q\") pod \"test-operator-controller-manager-5c5cb9c4d7-9w6lf\" (UID: \"35067b62-32eb-4cb2-8fbd-91b82c4a38cb\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-9w6lf" Mar 19 16:57:13 crc kubenswrapper[4918]: I0319 16:57:13.981558 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwptb\" (UniqueName: \"kubernetes.io/projected/e54e9cfd-b6fe-4b00-a12f-20b153f05710-kube-api-access-rwptb\") pod 
\"watcher-operator-controller-manager-6c4d75f7f9-fv2wb\" (UID: \"e54e9cfd-b6fe-4b00-a12f-20b153f05710\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-fv2wb" Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.008621 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgx6q\" (UniqueName: \"kubernetes.io/projected/f0824526-4903-49f0-bfcb-17298cc84eb6-kube-api-access-xgx6q\") pod \"placement-operator-controller-manager-5784578c99-8sxgk\" (UID: \"f0824526-4903-49f0-bfcb-17298cc84eb6\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-8sxgk" Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.023124 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qb2q\" (UniqueName: \"kubernetes.io/projected/85790727-be18-4730-81e1-84022d4cead2-kube-api-access-4qb2q\") pod \"ovn-operator-controller-manager-884679f54-svspw\" (UID: \"85790727-be18-4730-81e1-84022d4cead2\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-svspw" Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.032344 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpqc9\" (UniqueName: \"kubernetes.io/projected/6d031265-e265-412e-931e-52ca3bb940b6-kube-api-access-lpqc9\") pod \"swift-operator-controller-manager-c674c5965-6jtnt\" (UID: \"6d031265-e265-412e-931e-52ca3bb940b6\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-6jtnt" Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.035774 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-smt77"] Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.037354 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz5qq\" (UniqueName: \"kubernetes.io/projected/c77d9bca-3548-4c60-aa31-ad1a70dac2f1-kube-api-access-cz5qq\") pod 
\"telemetry-operator-controller-manager-78877dc965-4vmdw\" (UID: \"c77d9bca-3548-4c60-aa31-ad1a70dac2f1\") " pod="openstack-operators/telemetry-operator-controller-manager-78877dc965-4vmdw" Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.040385 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-smt77" Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.043515 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-92r9n" Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.064139 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-smt77"] Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.082597 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-webhook-certs\") pod \"openstack-operator-controller-manager-6c7d9f85c5-vtcc8\" (UID: \"2f1da636-ea7f-4828-b896-ec1c81c92623\") " pod="openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8" Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.082766 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w74kb\" (UniqueName: \"kubernetes.io/projected/2f1da636-ea7f-4828-b896-ec1c81c92623-kube-api-access-w74kb\") pod \"openstack-operator-controller-manager-6c7d9f85c5-vtcc8\" (UID: \"2f1da636-ea7f-4828-b896-ec1c81c92623\") " pod="openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8" Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.082831 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-metrics-certs\") pod \"openstack-operator-controller-manager-6c7d9f85c5-vtcc8\" (UID: \"2f1da636-ea7f-4828-b896-ec1c81c92623\") " pod="openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8" Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.082853 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkq7q\" (UniqueName: \"kubernetes.io/projected/35067b62-32eb-4cb2-8fbd-91b82c4a38cb-kube-api-access-jkq7q\") pod \"test-operator-controller-manager-5c5cb9c4d7-9w6lf\" (UID: \"35067b62-32eb-4cb2-8fbd-91b82c4a38cb\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-9w6lf" Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.082885 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwptb\" (UniqueName: \"kubernetes.io/projected/e54e9cfd-b6fe-4b00-a12f-20b153f05710-kube-api-access-rwptb\") pod \"watcher-operator-controller-manager-6c4d75f7f9-fv2wb\" (UID: \"e54e9cfd-b6fe-4b00-a12f-20b153f05710\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-fv2wb" Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.100939 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-svspw" Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.109224 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwptb\" (UniqueName: \"kubernetes.io/projected/e54e9cfd-b6fe-4b00-a12f-20b153f05710-kube-api-access-rwptb\") pod \"watcher-operator-controller-manager-6c4d75f7f9-fv2wb\" (UID: \"e54e9cfd-b6fe-4b00-a12f-20b153f05710\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-fv2wb" Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.109713 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkq7q\" (UniqueName: \"kubernetes.io/projected/35067b62-32eb-4cb2-8fbd-91b82c4a38cb-kube-api-access-jkq7q\") pod \"test-operator-controller-manager-5c5cb9c4d7-9w6lf\" (UID: \"35067b62-32eb-4cb2-8fbd-91b82c4a38cb\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-9w6lf" Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.118117 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6jtnt" Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.169639 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-8sxgk" Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.183945 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-78877dc965-4vmdw" Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.184174 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-webhook-certs\") pod \"openstack-operator-controller-manager-6c7d9f85c5-vtcc8\" (UID: \"2f1da636-ea7f-4828-b896-ec1c81c92623\") " pod="openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8" Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.184228 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w74kb\" (UniqueName: \"kubernetes.io/projected/2f1da636-ea7f-4828-b896-ec1c81c92623-kube-api-access-w74kb\") pod \"openstack-operator-controller-manager-6c7d9f85c5-vtcc8\" (UID: \"2f1da636-ea7f-4828-b896-ec1c81c92623\") " pod="openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8" Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.184279 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzqdc\" (UniqueName: \"kubernetes.io/projected/fea29376-0fd1-419c-ae47-68b1c7a355e3-kube-api-access-gzqdc\") pod \"rabbitmq-cluster-operator-manager-668c99d594-smt77\" (UID: \"fea29376-0fd1-419c-ae47-68b1c7a355e3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-smt77" Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.184309 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-metrics-certs\") pod \"openstack-operator-controller-manager-6c7d9f85c5-vtcc8\" (UID: \"2f1da636-ea7f-4828-b896-ec1c81c92623\") " pod="openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8" Mar 19 16:57:14 crc kubenswrapper[4918]: E0319 
16:57:14.184434 4918 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 16:57:14 crc kubenswrapper[4918]: E0319 16:57:14.184479 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-metrics-certs podName:2f1da636-ea7f-4828-b896-ec1c81c92623 nodeName:}" failed. No retries permitted until 2026-03-19 16:57:14.684463913 +0000 UTC m=+1046.806663161 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-metrics-certs") pod "openstack-operator-controller-manager-6c7d9f85c5-vtcc8" (UID: "2f1da636-ea7f-4828-b896-ec1c81c92623") : secret "metrics-server-cert" not found Mar 19 16:57:14 crc kubenswrapper[4918]: E0319 16:57:14.184740 4918 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 16:57:14 crc kubenswrapper[4918]: E0319 16:57:14.184803 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-webhook-certs podName:2f1da636-ea7f-4828-b896-ec1c81c92623 nodeName:}" failed. No retries permitted until 2026-03-19 16:57:14.684783292 +0000 UTC m=+1046.806982600 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-webhook-certs") pod "openstack-operator-controller-manager-6c7d9f85c5-vtcc8" (UID: "2f1da636-ea7f-4828-b896-ec1c81c92623") : secret "webhook-server-cert" not found Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.208550 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-9w6lf" Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.215736 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w74kb\" (UniqueName: \"kubernetes.io/projected/2f1da636-ea7f-4828-b896-ec1c81c92623-kube-api-access-w74kb\") pod \"openstack-operator-controller-manager-6c7d9f85c5-vtcc8\" (UID: \"2f1da636-ea7f-4828-b896-ec1c81c92623\") " pod="openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8" Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.227195 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-fv2wb" Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.286833 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzqdc\" (UniqueName: \"kubernetes.io/projected/fea29376-0fd1-419c-ae47-68b1c7a355e3-kube-api-access-gzqdc\") pod \"rabbitmq-cluster-operator-manager-668c99d594-smt77\" (UID: \"fea29376-0fd1-419c-ae47-68b1c7a355e3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-smt77" Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.299677 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-ql64p"] Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.314218 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-xl6zs"] Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.342645 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzqdc\" (UniqueName: \"kubernetes.io/projected/fea29376-0fd1-419c-ae47-68b1c7a355e3-kube-api-access-gzqdc\") pod \"rabbitmq-cluster-operator-manager-668c99d594-smt77\" (UID: \"fea29376-0fd1-419c-ae47-68b1c7a355e3\") " 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-smt77" Mar 19 16:57:14 crc kubenswrapper[4918]: W0319 16:57:14.387926 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2644d9c5_c386_4d63_9cf9_7517f4fd6cb0.slice/crio-6c3d5b58e30eb9975cd4776ba769266cd10686db2a8b3253c548d96613df0752 WatchSource:0}: Error finding container 6c3d5b58e30eb9975cd4776ba769266cd10686db2a8b3253c548d96613df0752: Status 404 returned error can't find the container with id 6c3d5b58e30eb9975cd4776ba769266cd10686db2a8b3253c548d96613df0752 Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.388687 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d06a3c13-6323-4fef-9aec-101be98e242b-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-c98pr\" (UID: \"d06a3c13-6323-4fef-9aec-101be98e242b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-c98pr" Mar 19 16:57:14 crc kubenswrapper[4918]: E0319 16:57:14.388898 4918 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 16:57:14 crc kubenswrapper[4918]: E0319 16:57:14.389001 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d06a3c13-6323-4fef-9aec-101be98e242b-cert podName:d06a3c13-6323-4fef-9aec-101be98e242b nodeName:}" failed. No retries permitted until 2026-03-19 16:57:15.388978875 +0000 UTC m=+1047.511178123 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d06a3c13-6323-4fef-9aec-101be98e242b-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-c98pr" (UID: "d06a3c13-6323-4fef-9aec-101be98e242b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.417752 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-smt77" Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.472085 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-k2f4b"] Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.602082 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-ql64p" event={"ID":"cf3083b2-86ce-4d01-97c5-9005f683ff62","Type":"ContainerStarted","Data":"c5391eb61b7e936fafb86eb387eebad46f164c1721df7f6160b3b5b5f6150c27"} Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.602971 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-xl6zs" event={"ID":"2644d9c5-c386-4d63-9cf9-7517f4fd6cb0","Type":"ContainerStarted","Data":"6c3d5b58e30eb9975cd4776ba769266cd10686db2a8b3253c548d96613df0752"} Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.603706 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-k2f4b" event={"ID":"e8c75c9e-0913-485c-a5fe-9c9bf6e4bc53","Type":"ContainerStarted","Data":"3ec0325ed8a3206289aa09d89b3be007fcd016c1100a74a1f09a3dc0da6a29ec"} Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.710598 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-webhook-certs\") pod \"openstack-operator-controller-manager-6c7d9f85c5-vtcc8\" (UID: \"2f1da636-ea7f-4828-b896-ec1c81c92623\") " pod="openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8" Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.710960 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-metrics-certs\") pod \"openstack-operator-controller-manager-6c7d9f85c5-vtcc8\" (UID: \"2f1da636-ea7f-4828-b896-ec1c81c92623\") " pod="openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8" Mar 19 16:57:14 crc kubenswrapper[4918]: E0319 16:57:14.711156 4918 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 16:57:14 crc kubenswrapper[4918]: E0319 16:57:14.711213 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-metrics-certs podName:2f1da636-ea7f-4828-b896-ec1c81c92623 nodeName:}" failed. No retries permitted until 2026-03-19 16:57:15.711194975 +0000 UTC m=+1047.833394233 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-metrics-certs") pod "openstack-operator-controller-manager-6c7d9f85c5-vtcc8" (UID: "2f1da636-ea7f-4828-b896-ec1c81c92623") : secret "metrics-server-cert" not found Mar 19 16:57:14 crc kubenswrapper[4918]: E0319 16:57:14.713123 4918 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 16:57:14 crc kubenswrapper[4918]: E0319 16:57:14.713163 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-webhook-certs podName:2f1da636-ea7f-4828-b896-ec1c81c92623 nodeName:}" failed. No retries permitted until 2026-03-19 16:57:15.713152228 +0000 UTC m=+1047.835351476 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-webhook-certs") pod "openstack-operator-controller-manager-6c7d9f85c5-vtcc8" (UID: "2f1da636-ea7f-4828-b896-ec1c81c92623") : secret "webhook-server-cert" not found Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.790161 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-wg98k"] Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.810399 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-sgwx5"] Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.812389 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d624150-7673-43db-b503-ec532c7c00ca-cert\") pod \"infra-operator-controller-manager-7b9c774f96-msx8p\" (UID: \"3d624150-7673-43db-b503-ec532c7c00ca\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-msx8p" Mar 19 16:57:14 crc 
kubenswrapper[4918]: E0319 16:57:14.812545 4918 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 16:57:14 crc kubenswrapper[4918]: E0319 16:57:14.812611 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d624150-7673-43db-b503-ec532c7c00ca-cert podName:3d624150-7673-43db-b503-ec532c7c00ca nodeName:}" failed. No retries permitted until 2026-03-19 16:57:16.812590296 +0000 UTC m=+1048.934789614 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d624150-7673-43db-b503-ec532c7c00ca-cert") pod "infra-operator-controller-manager-7b9c774f96-msx8p" (UID: "3d624150-7673-43db-b503-ec532c7c00ca") : secret "infra-operator-webhook-server-cert" not found Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.827152 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-nmkqj"] Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.838171 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-8b6nq"] Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.844082 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-w7rjd"] Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.850404 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5dbck"] Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.890425 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-rkhtw"] Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.896489 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-8tqxl"] Mar 19 16:57:14 crc kubenswrapper[4918]: I0319 16:57:14.901606 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-w78t5"] Mar 19 16:57:14 crc kubenswrapper[4918]: W0319 16:57:14.903397 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod230477fa_ce49_4e4d_a0a0_5bf2538c5192.slice/crio-04638d51c1c9f14109628167c77a8fb2ed4496ec89cbd9476aecac3c00b1a856 WatchSource:0}: Error finding container 04638d51c1c9f14109628167c77a8fb2ed4496ec89cbd9476aecac3c00b1a856: Status 404 returned error can't find the container with id 04638d51c1c9f14109628167c77a8fb2ed4496ec89cbd9476aecac3c00b1a856 Mar 19 16:57:14 crc kubenswrapper[4918]: W0319 16:57:14.905270 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9f74e44_e78b_4b23_b409_89af31c2dc82.slice/crio-9fc9e7f0d7dd944fbd1abccb628582d9e1afc7b6cecbcd5355530bb9a54ce12f WatchSource:0}: Error finding container 9fc9e7f0d7dd944fbd1abccb628582d9e1afc7b6cecbcd5355530bb9a54ce12f: Status 404 returned error can't find the container with id 9fc9e7f0d7dd944fbd1abccb628582d9e1afc7b6cecbcd5355530bb9a54ce12f Mar 19 16:57:15 crc kubenswrapper[4918]: I0319 16:57:15.208238 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-26r58"] Mar 19 16:57:15 crc kubenswrapper[4918]: I0319 16:57:15.225774 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-svspw"] Mar 19 16:57:15 crc kubenswrapper[4918]: W0319 16:57:15.228953 4918 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85790727_be18_4730_81e1_84022d4cead2.slice/crio-c7cb73d46915e846fb98b7125e58123b31b3bae672beda25019f6c58fe3f4bd4 WatchSource:0}: Error finding container c7cb73d46915e846fb98b7125e58123b31b3bae672beda25019f6c58fe3f4bd4: Status 404 returned error can't find the container with id c7cb73d46915e846fb98b7125e58123b31b3bae672beda25019f6c58fe3f4bd4 Mar 19 16:57:15 crc kubenswrapper[4918]: I0319 16:57:15.235831 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-8sxgk"] Mar 19 16:57:15 crc kubenswrapper[4918]: I0319 16:57:15.245027 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-78877dc965-4vmdw"] Mar 19 16:57:15 crc kubenswrapper[4918]: E0319 16:57:15.246536 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jkq7q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-9w6lf_openstack-operators(35067b62-32eb-4cb2-8fbd-91b82c4a38cb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 16:57:15 crc kubenswrapper[4918]: E0319 16:57:15.248536 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-9w6lf" podUID="35067b62-32eb-4cb2-8fbd-91b82c4a38cb" Mar 19 16:57:15 crc 
kubenswrapper[4918]: I0319 16:57:15.256212 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-9w6lf"] Mar 19 16:57:15 crc kubenswrapper[4918]: E0319 16:57:15.260049 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7ghpt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5b9f45d989-26r58_openstack-operators(7be38652-3021-4349-be08-4759ee13141b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 16:57:15 crc kubenswrapper[4918]: E0319 16:57:15.261191 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-26r58" podUID="7be38652-3021-4349-be08-4759ee13141b" Mar 19 16:57:15 crc kubenswrapper[4918]: I0319 16:57:15.265711 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-fv2wb"] Mar 19 16:57:15 crc kubenswrapper[4918]: W0319 16:57:15.276941 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d031265_e265_412e_931e_52ca3bb940b6.slice/crio-ab6016f41be20e3832c7afd36c439cf47b5d08d603ffb70d4f618de3c9790f0a WatchSource:0}: Error finding container ab6016f41be20e3832c7afd36c439cf47b5d08d603ffb70d4f618de3c9790f0a: Status 404 returned error can't find the container with id 
ab6016f41be20e3832c7afd36c439cf47b5d08d603ffb70d4f618de3c9790f0a Mar 19 16:57:15 crc kubenswrapper[4918]: E0319 16:57:15.279503 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lpqc9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-6jtnt_openstack-operators(6d031265-e265-412e-931e-52ca3bb940b6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 16:57:15 crc kubenswrapper[4918]: E0319 16:57:15.280078 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xgx6q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5784578c99-8sxgk_openstack-operators(f0824526-4903-49f0-bfcb-17298cc84eb6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 16:57:15 crc kubenswrapper[4918]: I0319 16:57:15.280264 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-6jtnt"] Mar 19 16:57:15 crc kubenswrapper[4918]: E0319 16:57:15.280966 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6jtnt" podUID="6d031265-e265-412e-931e-52ca3bb940b6" Mar 19 16:57:15 crc kubenswrapper[4918]: E0319 16:57:15.282413 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-8sxgk" podUID="f0824526-4903-49f0-bfcb-17298cc84eb6" Mar 19 16:57:15 crc kubenswrapper[4918]: I0319 16:57:15.288672 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-smt77"] Mar 19 16:57:15 crc kubenswrapper[4918]: E0319 16:57:15.298467 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rwptb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-fv2wb_openstack-operators(e54e9cfd-b6fe-4b00-a12f-20b153f05710): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 16:57:15 crc kubenswrapper[4918]: E0319 16:57:15.298629 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gzqdc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-smt77_openstack-operators(fea29376-0fd1-419c-ae47-68b1c7a355e3): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 16:57:15 crc kubenswrapper[4918]: E0319 16:57:15.306745 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-smt77" podUID="fea29376-0fd1-419c-ae47-68b1c7a355e3" Mar 19 16:57:15 crc kubenswrapper[4918]: E0319 16:57:15.306749 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-fv2wb" podUID="e54e9cfd-b6fe-4b00-a12f-20b153f05710" Mar 19 16:57:15 crc kubenswrapper[4918]: I0319 16:57:15.424303 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d06a3c13-6323-4fef-9aec-101be98e242b-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-c98pr\" (UID: \"d06a3c13-6323-4fef-9aec-101be98e242b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-c98pr" Mar 19 16:57:15 crc kubenswrapper[4918]: E0319 16:57:15.424425 4918 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 16:57:15 crc kubenswrapper[4918]: E0319 16:57:15.424482 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d06a3c13-6323-4fef-9aec-101be98e242b-cert podName:d06a3c13-6323-4fef-9aec-101be98e242b nodeName:}" failed. No retries permitted until 2026-03-19 16:57:17.424467006 +0000 UTC m=+1049.546666254 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d06a3c13-6323-4fef-9aec-101be98e242b-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-c98pr" (UID: "d06a3c13-6323-4fef-9aec-101be98e242b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 16:57:15 crc kubenswrapper[4918]: I0319 16:57:15.614427 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-8b6nq" event={"ID":"b8633109-a56d-4535-b603-f75c257cb093","Type":"ContainerStarted","Data":"4114e4a63fd52ffcb1be02ab62c8e43b7c18da6f17f67ab9df54455da4563a9b"} Mar 19 16:57:15 crc kubenswrapper[4918]: I0319 16:57:15.616547 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-sgwx5" event={"ID":"34d2b03c-e63e-425a-a48a-6c9c97508add","Type":"ContainerStarted","Data":"8477c3801abcf389342931bfc93b16d7aefb5f50447a98798500a103b3362d0d"} Mar 19 16:57:15 crc kubenswrapper[4918]: I0319 16:57:15.618158 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6jtnt" event={"ID":"6d031265-e265-412e-931e-52ca3bb940b6","Type":"ContainerStarted","Data":"ab6016f41be20e3832c7afd36c439cf47b5d08d603ffb70d4f618de3c9790f0a"} Mar 19 16:57:15 crc kubenswrapper[4918]: E0319 16:57:15.628501 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6jtnt" podUID="6d031265-e265-412e-931e-52ca3bb940b6" Mar 19 16:57:15 crc kubenswrapper[4918]: I0319 16:57:15.635026 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-26r58" event={"ID":"7be38652-3021-4349-be08-4759ee13141b","Type":"ContainerStarted","Data":"54a4215a8dea7619f6832c25cdf5bc5525466e12c911c6eb127f43134b002c57"} Mar 19 16:57:15 crc kubenswrapper[4918]: E0319 16:57:15.637012 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-26r58" podUID="7be38652-3021-4349-be08-4759ee13141b" Mar 19 16:57:15 crc kubenswrapper[4918]: I0319 16:57:15.641976 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-svspw" event={"ID":"85790727-be18-4730-81e1-84022d4cead2","Type":"ContainerStarted","Data":"c7cb73d46915e846fb98b7125e58123b31b3bae672beda25019f6c58fe3f4bd4"} Mar 19 16:57:15 crc kubenswrapper[4918]: I0319 16:57:15.646907 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-fv2wb" event={"ID":"e54e9cfd-b6fe-4b00-a12f-20b153f05710","Type":"ContainerStarted","Data":"3dedfa8276909131f4358ba0aa296308d4d3d9fcdbe322929949214c49a6ea0b"} Mar 19 16:57:15 crc kubenswrapper[4918]: E0319 16:57:15.653708 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-fv2wb" podUID="e54e9cfd-b6fe-4b00-a12f-20b153f05710" Mar 19 16:57:15 crc kubenswrapper[4918]: I0319 16:57:15.654207 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-smt77" event={"ID":"fea29376-0fd1-419c-ae47-68b1c7a355e3","Type":"ContainerStarted","Data":"3be07110304bbb6884d8b2c18d63c4b9e20ad5976a5f57fb5c63cc41664cd0f4"} Mar 19 16:57:15 crc kubenswrapper[4918]: E0319 16:57:15.655379 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-smt77" podUID="fea29376-0fd1-419c-ae47-68b1c7a355e3" Mar 19 16:57:15 crc kubenswrapper[4918]: I0319 16:57:15.655703 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-9w6lf" event={"ID":"35067b62-32eb-4cb2-8fbd-91b82c4a38cb","Type":"ContainerStarted","Data":"a33bdb98aa19752737d5e3957da634e9ccc9c80000255dfafc31ee25f08ddb9f"} Mar 19 16:57:15 crc kubenswrapper[4918]: E0319 16:57:15.657216 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-9w6lf" podUID="35067b62-32eb-4cb2-8fbd-91b82c4a38cb" Mar 19 16:57:15 crc kubenswrapper[4918]: I0319 16:57:15.664283 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-wg98k" event={"ID":"4ed700e4-0a35-4c6c-b57a-cde49d5f816c","Type":"ContainerStarted","Data":"00ff42334f51a2dbe76354a386e5c9d20d4bc90a6a4e452682f2daf65be08f89"} Mar 19 16:57:15 crc kubenswrapper[4918]: I0319 16:57:15.666633 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5dbck" event={"ID":"c2eafc06-6df4-440d-820f-aad17b6061d7","Type":"ContainerStarted","Data":"1be0cde1740586a668d0dea084a5cdb0d1d274f1239fa17cff7c8e53d8f74a48"} Mar 19 16:57:15 crc kubenswrapper[4918]: I0319 16:57:15.668057 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-rkhtw" event={"ID":"230477fa-ce49-4e4d-a0a0-5bf2538c5192","Type":"ContainerStarted","Data":"04638d51c1c9f14109628167c77a8fb2ed4496ec89cbd9476aecac3c00b1a856"} Mar 19 16:57:15 crc kubenswrapper[4918]: I0319 16:57:15.684283 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-nmkqj" event={"ID":"b059bb35-4870-4982-b62f-e70ffd0270d2","Type":"ContainerStarted","Data":"57fbfcbda70b58d6cc542564c573faa9b98f22c10d9fcbcb9d5a735a516aa9cc"} Mar 19 16:57:15 crc kubenswrapper[4918]: I0319 16:57:15.686025 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8tqxl" event={"ID":"64edd3c9-61ea-4fc0-9a74-95a27c4bffc9","Type":"ContainerStarted","Data":"c4ef4f9f68679a540c6093832a4c15e7a3c4dcdc8bbfdc7280528897edf040ca"} Mar 19 16:57:15 crc kubenswrapper[4918]: I0319 16:57:15.690193 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-w78t5" event={"ID":"e9f74e44-e78b-4b23-b409-89af31c2dc82","Type":"ContainerStarted","Data":"9fc9e7f0d7dd944fbd1abccb628582d9e1afc7b6cecbcd5355530bb9a54ce12f"} Mar 19 16:57:15 crc kubenswrapper[4918]: I0319 16:57:15.693631 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-8sxgk" event={"ID":"f0824526-4903-49f0-bfcb-17298cc84eb6","Type":"ContainerStarted","Data":"68be9e58974facd395c070553bcd918f231c2440b28d9211a2af9c3d8189948e"} Mar 19 16:57:15 crc 
kubenswrapper[4918]: E0319 16:57:15.695916 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-8sxgk" podUID="f0824526-4903-49f0-bfcb-17298cc84eb6" Mar 19 16:57:15 crc kubenswrapper[4918]: I0319 16:57:15.696642 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-78877dc965-4vmdw" event={"ID":"c77d9bca-3548-4c60-aa31-ad1a70dac2f1","Type":"ContainerStarted","Data":"e73c8d5c13bd3a13f86086991999e8a8b876b6098e1d240612e054a0f92a742a"} Mar 19 16:57:15 crc kubenswrapper[4918]: I0319 16:57:15.698943 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-w7rjd" event={"ID":"d9f56510-e25f-4de5-85b6-3030e989d13d","Type":"ContainerStarted","Data":"494cb35b18a3075a9d168efb3948dabcd3162a66cb8c6859e21dd52f2e29b333"} Mar 19 16:57:15 crc kubenswrapper[4918]: I0319 16:57:15.730412 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-webhook-certs\") pod \"openstack-operator-controller-manager-6c7d9f85c5-vtcc8\" (UID: \"2f1da636-ea7f-4828-b896-ec1c81c92623\") " pod="openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8" Mar 19 16:57:15 crc kubenswrapper[4918]: I0319 16:57:15.730594 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-metrics-certs\") pod \"openstack-operator-controller-manager-6c7d9f85c5-vtcc8\" (UID: \"2f1da636-ea7f-4828-b896-ec1c81c92623\") " 
pod="openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8" Mar 19 16:57:15 crc kubenswrapper[4918]: E0319 16:57:15.732236 4918 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 16:57:15 crc kubenswrapper[4918]: E0319 16:57:15.732300 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-webhook-certs podName:2f1da636-ea7f-4828-b896-ec1c81c92623 nodeName:}" failed. No retries permitted until 2026-03-19 16:57:17.732281182 +0000 UTC m=+1049.854480510 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-webhook-certs") pod "openstack-operator-controller-manager-6c7d9f85c5-vtcc8" (UID: "2f1da636-ea7f-4828-b896-ec1c81c92623") : secret "webhook-server-cert" not found Mar 19 16:57:15 crc kubenswrapper[4918]: E0319 16:57:15.733127 4918 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 16:57:15 crc kubenswrapper[4918]: E0319 16:57:15.733181 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-metrics-certs podName:2f1da636-ea7f-4828-b896-ec1c81c92623 nodeName:}" failed. No retries permitted until 2026-03-19 16:57:17.733168817 +0000 UTC m=+1049.855368065 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-metrics-certs") pod "openstack-operator-controller-manager-6c7d9f85c5-vtcc8" (UID: "2f1da636-ea7f-4828-b896-ec1c81c92623") : secret "metrics-server-cert" not found Mar 19 16:57:16 crc kubenswrapper[4918]: E0319 16:57:16.713961 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-fv2wb" podUID="e54e9cfd-b6fe-4b00-a12f-20b153f05710" Mar 19 16:57:16 crc kubenswrapper[4918]: E0319 16:57:16.714334 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-9w6lf" podUID="35067b62-32eb-4cb2-8fbd-91b82c4a38cb" Mar 19 16:57:16 crc kubenswrapper[4918]: E0319 16:57:16.714433 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-smt77" podUID="fea29376-0fd1-419c-ae47-68b1c7a355e3" Mar 19 16:57:16 crc kubenswrapper[4918]: E0319 16:57:16.714886 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6jtnt" podUID="6d031265-e265-412e-931e-52ca3bb940b6" Mar 19 16:57:16 crc kubenswrapper[4918]: E0319 16:57:16.719198 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-8sxgk" podUID="f0824526-4903-49f0-bfcb-17298cc84eb6" Mar 19 16:57:16 crc kubenswrapper[4918]: E0319 16:57:16.719683 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-26r58" podUID="7be38652-3021-4349-be08-4759ee13141b" Mar 19 16:57:16 crc kubenswrapper[4918]: I0319 16:57:16.858044 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d624150-7673-43db-b503-ec532c7c00ca-cert\") pod \"infra-operator-controller-manager-7b9c774f96-msx8p\" (UID: \"3d624150-7673-43db-b503-ec532c7c00ca\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-msx8p" Mar 19 16:57:16 crc kubenswrapper[4918]: E0319 16:57:16.859280 4918 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 16:57:16 crc kubenswrapper[4918]: E0319 16:57:16.859319 4918 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/3d624150-7673-43db-b503-ec532c7c00ca-cert podName:3d624150-7673-43db-b503-ec532c7c00ca nodeName:}" failed. No retries permitted until 2026-03-19 16:57:20.859306507 +0000 UTC m=+1052.981505755 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d624150-7673-43db-b503-ec532c7c00ca-cert") pod "infra-operator-controller-manager-7b9c774f96-msx8p" (UID: "3d624150-7673-43db-b503-ec532c7c00ca") : secret "infra-operator-webhook-server-cert" not found Mar 19 16:57:17 crc kubenswrapper[4918]: I0319 16:57:17.471434 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d06a3c13-6323-4fef-9aec-101be98e242b-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-c98pr\" (UID: \"d06a3c13-6323-4fef-9aec-101be98e242b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-c98pr" Mar 19 16:57:17 crc kubenswrapper[4918]: E0319 16:57:17.471651 4918 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 16:57:17 crc kubenswrapper[4918]: E0319 16:57:17.471732 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d06a3c13-6323-4fef-9aec-101be98e242b-cert podName:d06a3c13-6323-4fef-9aec-101be98e242b nodeName:}" failed. No retries permitted until 2026-03-19 16:57:21.471714652 +0000 UTC m=+1053.593913890 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d06a3c13-6323-4fef-9aec-101be98e242b-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-c98pr" (UID: "d06a3c13-6323-4fef-9aec-101be98e242b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 16:57:17 crc kubenswrapper[4918]: I0319 16:57:17.776429 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-webhook-certs\") pod \"openstack-operator-controller-manager-6c7d9f85c5-vtcc8\" (UID: \"2f1da636-ea7f-4828-b896-ec1c81c92623\") " pod="openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8" Mar 19 16:57:17 crc kubenswrapper[4918]: I0319 16:57:17.776563 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-metrics-certs\") pod \"openstack-operator-controller-manager-6c7d9f85c5-vtcc8\" (UID: \"2f1da636-ea7f-4828-b896-ec1c81c92623\") " pod="openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8" Mar 19 16:57:17 crc kubenswrapper[4918]: E0319 16:57:17.776618 4918 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 16:57:17 crc kubenswrapper[4918]: E0319 16:57:17.776680 4918 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 16:57:17 crc kubenswrapper[4918]: E0319 16:57:17.776711 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-webhook-certs podName:2f1da636-ea7f-4828-b896-ec1c81c92623 nodeName:}" failed. No retries permitted until 2026-03-19 16:57:21.776687149 +0000 UTC m=+1053.898886427 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-webhook-certs") pod "openstack-operator-controller-manager-6c7d9f85c5-vtcc8" (UID: "2f1da636-ea7f-4828-b896-ec1c81c92623") : secret "webhook-server-cert" not found Mar 19 16:57:17 crc kubenswrapper[4918]: E0319 16:57:17.776736 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-metrics-certs podName:2f1da636-ea7f-4828-b896-ec1c81c92623 nodeName:}" failed. No retries permitted until 2026-03-19 16:57:21.77672457 +0000 UTC m=+1053.898923858 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-metrics-certs") pod "openstack-operator-controller-manager-6c7d9f85c5-vtcc8" (UID: "2f1da636-ea7f-4828-b896-ec1c81c92623") : secret "metrics-server-cert" not found Mar 19 16:57:20 crc kubenswrapper[4918]: I0319 16:57:20.923409 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d624150-7673-43db-b503-ec532c7c00ca-cert\") pod \"infra-operator-controller-manager-7b9c774f96-msx8p\" (UID: \"3d624150-7673-43db-b503-ec532c7c00ca\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-msx8p" Mar 19 16:57:20 crc kubenswrapper[4918]: E0319 16:57:20.923617 4918 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 16:57:20 crc kubenswrapper[4918]: E0319 16:57:20.923694 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d624150-7673-43db-b503-ec532c7c00ca-cert podName:3d624150-7673-43db-b503-ec532c7c00ca nodeName:}" failed. No retries permitted until 2026-03-19 16:57:28.923676883 +0000 UTC m=+1061.045876131 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d624150-7673-43db-b503-ec532c7c00ca-cert") pod "infra-operator-controller-manager-7b9c774f96-msx8p" (UID: "3d624150-7673-43db-b503-ec532c7c00ca") : secret "infra-operator-webhook-server-cert" not found Mar 19 16:57:21 crc kubenswrapper[4918]: I0319 16:57:21.536638 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d06a3c13-6323-4fef-9aec-101be98e242b-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-c98pr\" (UID: \"d06a3c13-6323-4fef-9aec-101be98e242b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-c98pr" Mar 19 16:57:21 crc kubenswrapper[4918]: E0319 16:57:21.536857 4918 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 16:57:21 crc kubenswrapper[4918]: E0319 16:57:21.536923 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d06a3c13-6323-4fef-9aec-101be98e242b-cert podName:d06a3c13-6323-4fef-9aec-101be98e242b nodeName:}" failed. No retries permitted until 2026-03-19 16:57:29.53690704 +0000 UTC m=+1061.659106298 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d06a3c13-6323-4fef-9aec-101be98e242b-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-c98pr" (UID: "d06a3c13-6323-4fef-9aec-101be98e242b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 16:57:21 crc kubenswrapper[4918]: I0319 16:57:21.841569 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-metrics-certs\") pod \"openstack-operator-controller-manager-6c7d9f85c5-vtcc8\" (UID: \"2f1da636-ea7f-4828-b896-ec1c81c92623\") " pod="openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8" Mar 19 16:57:21 crc kubenswrapper[4918]: E0319 16:57:21.841725 4918 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 16:57:21 crc kubenswrapper[4918]: I0319 16:57:21.841749 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-webhook-certs\") pod \"openstack-operator-controller-manager-6c7d9f85c5-vtcc8\" (UID: \"2f1da636-ea7f-4828-b896-ec1c81c92623\") " pod="openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8" Mar 19 16:57:21 crc kubenswrapper[4918]: E0319 16:57:21.841792 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-metrics-certs podName:2f1da636-ea7f-4828-b896-ec1c81c92623 nodeName:}" failed. No retries permitted until 2026-03-19 16:57:29.841768495 +0000 UTC m=+1061.963967743 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-metrics-certs") pod "openstack-operator-controller-manager-6c7d9f85c5-vtcc8" (UID: "2f1da636-ea7f-4828-b896-ec1c81c92623") : secret "metrics-server-cert" not found Mar 19 16:57:21 crc kubenswrapper[4918]: E0319 16:57:21.841875 4918 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 16:57:21 crc kubenswrapper[4918]: E0319 16:57:21.841911 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-webhook-certs podName:2f1da636-ea7f-4828-b896-ec1c81c92623 nodeName:}" failed. No retries permitted until 2026-03-19 16:57:29.841900169 +0000 UTC m=+1061.964099427 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-webhook-certs") pod "openstack-operator-controller-manager-6c7d9f85c5-vtcc8" (UID: "2f1da636-ea7f-4828-b896-ec1c81c92623") : secret "webhook-server-cert" not found Mar 19 16:57:28 crc kubenswrapper[4918]: I0319 16:57:28.998258 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d624150-7673-43db-b503-ec532c7c00ca-cert\") pod \"infra-operator-controller-manager-7b9c774f96-msx8p\" (UID: \"3d624150-7673-43db-b503-ec532c7c00ca\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-msx8p" Mar 19 16:57:29 crc kubenswrapper[4918]: I0319 16:57:29.005145 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d624150-7673-43db-b503-ec532c7c00ca-cert\") pod \"infra-operator-controller-manager-7b9c774f96-msx8p\" (UID: \"3d624150-7673-43db-b503-ec532c7c00ca\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-msx8p" Mar 19 16:57:29 crc 
kubenswrapper[4918]: I0319 16:57:29.017998 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-msx8p" Mar 19 16:57:29 crc kubenswrapper[4918]: I0319 16:57:29.607766 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d06a3c13-6323-4fef-9aec-101be98e242b-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-c98pr\" (UID: \"d06a3c13-6323-4fef-9aec-101be98e242b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-c98pr" Mar 19 16:57:29 crc kubenswrapper[4918]: I0319 16:57:29.622265 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d06a3c13-6323-4fef-9aec-101be98e242b-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-c98pr\" (UID: \"d06a3c13-6323-4fef-9aec-101be98e242b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-c98pr" Mar 19 16:57:29 crc kubenswrapper[4918]: I0319 16:57:29.858084 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-c98pr" Mar 19 16:57:29 crc kubenswrapper[4918]: I0319 16:57:29.911850 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-metrics-certs\") pod \"openstack-operator-controller-manager-6c7d9f85c5-vtcc8\" (UID: \"2f1da636-ea7f-4828-b896-ec1c81c92623\") " pod="openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8" Mar 19 16:57:29 crc kubenswrapper[4918]: I0319 16:57:29.911937 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-webhook-certs\") pod \"openstack-operator-controller-manager-6c7d9f85c5-vtcc8\" (UID: \"2f1da636-ea7f-4828-b896-ec1c81c92623\") " pod="openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8" Mar 19 16:57:29 crc kubenswrapper[4918]: E0319 16:57:29.912067 4918 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 16:57:29 crc kubenswrapper[4918]: E0319 16:57:29.912124 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-webhook-certs podName:2f1da636-ea7f-4828-b896-ec1c81c92623 nodeName:}" failed. No retries permitted until 2026-03-19 16:57:45.91210741 +0000 UTC m=+1078.034306658 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-webhook-certs") pod "openstack-operator-controller-manager-6c7d9f85c5-vtcc8" (UID: "2f1da636-ea7f-4828-b896-ec1c81c92623") : secret "webhook-server-cert" not found Mar 19 16:57:29 crc kubenswrapper[4918]: I0319 16:57:29.927482 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-metrics-certs\") pod \"openstack-operator-controller-manager-6c7d9f85c5-vtcc8\" (UID: \"2f1da636-ea7f-4828-b896-ec1c81c92623\") " pod="openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8" Mar 19 16:57:30 crc kubenswrapper[4918]: E0319 16:57:30.615458 4918 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.75:5001/openstack-k8s-operators/telemetry-operator:15c2ffcfe08e13a1dec28232b4ee653042564ac3" Mar 19 16:57:30 crc kubenswrapper[4918]: E0319 16:57:30.615512 4918 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.75:5001/openstack-k8s-operators/telemetry-operator:15c2ffcfe08e13a1dec28232b4ee653042564ac3" Mar 19 16:57:30 crc kubenswrapper[4918]: E0319 16:57:30.615654 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.75:5001/openstack-k8s-operators/telemetry-operator:15c2ffcfe08e13a1dec28232b4ee653042564ac3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cz5qq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-78877dc965-4vmdw_openstack-operators(c77d9bca-3548-4c60-aa31-ad1a70dac2f1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 16:57:30 crc kubenswrapper[4918]: E0319 16:57:30.616984 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-78877dc965-4vmdw" podUID="c77d9bca-3548-4c60-aa31-ad1a70dac2f1" Mar 19 16:57:30 crc kubenswrapper[4918]: E0319 16:57:30.835139 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.75:5001/openstack-k8s-operators/telemetry-operator:15c2ffcfe08e13a1dec28232b4ee653042564ac3\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-78877dc965-4vmdw" podUID="c77d9bca-3548-4c60-aa31-ad1a70dac2f1" Mar 19 16:57:31 crc kubenswrapper[4918]: E0319 16:57:31.328823 4918 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55" Mar 19 16:57:31 crc kubenswrapper[4918]: E0319 16:57:31.329003 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4qb2q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-884679f54-svspw_openstack-operators(85790727-be18-4730-81e1-84022d4cead2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 16:57:31 crc kubenswrapper[4918]: E0319 16:57:31.330153 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-svspw" podUID="85790727-be18-4730-81e1-84022d4cead2" Mar 19 16:57:31 crc kubenswrapper[4918]: E0319 16:57:31.840930 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-svspw" podUID="85790727-be18-4730-81e1-84022d4cead2" Mar 19 16:57:31 crc kubenswrapper[4918]: E0319 16:57:31.869510 4918 log.go:32] "PullImage from image service failed" err="rpc error: 
code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a" Mar 19 16:57:31 crc kubenswrapper[4918]: E0319 16:57:31.869739 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t8lfk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-w7rjd_openstack-operators(d9f56510-e25f-4de5-85b6-3030e989d13d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 16:57:31 crc kubenswrapper[4918]: E0319 16:57:31.871282 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-w7rjd" podUID="d9f56510-e25f-4de5-85b6-3030e989d13d" Mar 19 16:57:32 crc kubenswrapper[4918]: E0319 16:57:32.851672 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-w7rjd" podUID="d9f56510-e25f-4de5-85b6-3030e989d13d" Mar 19 16:57:36 crc kubenswrapper[4918]: I0319 16:57:36.214080 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-c98pr"] Mar 19 16:57:36 crc kubenswrapper[4918]: I0319 16:57:36.265443 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-msx8p"] Mar 19 16:57:36 crc kubenswrapper[4918]: W0319 16:57:36.340697 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d624150_7673_43db_b503_ec532c7c00ca.slice/crio-64973bd8ea008cec79d5780fae9c17794d4334826bc512d052858f0519e9c155 WatchSource:0}: Error finding container 64973bd8ea008cec79d5780fae9c17794d4334826bc512d052858f0519e9c155: Status 404 returned error can't find the container with id 64973bd8ea008cec79d5780fae9c17794d4334826bc512d052858f0519e9c155 Mar 19 16:57:36 crc kubenswrapper[4918]: I0319 16:57:36.925789 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-ql64p" event={"ID":"cf3083b2-86ce-4d01-97c5-9005f683ff62","Type":"ContainerStarted","Data":"2692d8a2ff48d3fa7f2cf00751d072d7de30699024d7e8456178b0772a7d0026"} Mar 19 16:57:36 crc kubenswrapper[4918]: I0319 16:57:36.926908 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-ql64p" Mar 19 16:57:36 crc kubenswrapper[4918]: I0319 16:57:36.938436 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-xl6zs" event={"ID":"2644d9c5-c386-4d63-9cf9-7517f4fd6cb0","Type":"ContainerStarted","Data":"164115f462e118967d7505b81a3af133591c25d10e65baf55b4d818beac8cc21"} Mar 19 16:57:36 crc kubenswrapper[4918]: I0319 16:57:36.939167 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-xl6zs" Mar 19 16:57:36 crc kubenswrapper[4918]: I0319 
16:57:36.950340 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-msx8p" event={"ID":"3d624150-7673-43db-b503-ec532c7c00ca","Type":"ContainerStarted","Data":"64973bd8ea008cec79d5780fae9c17794d4334826bc512d052858f0519e9c155"} Mar 19 16:57:36 crc kubenswrapper[4918]: I0319 16:57:36.957616 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-ql64p" podStartSLOduration=6.986697504 podStartE2EDuration="24.957569804s" podCreationTimestamp="2026-03-19 16:57:12 +0000 UTC" firstStartedPulling="2026-03-19 16:57:14.353471384 +0000 UTC m=+1046.475670632" lastFinishedPulling="2026-03-19 16:57:32.324343674 +0000 UTC m=+1064.446542932" observedRunningTime="2026-03-19 16:57:36.943796717 +0000 UTC m=+1069.065995965" watchObservedRunningTime="2026-03-19 16:57:36.957569804 +0000 UTC m=+1069.079769062" Mar 19 16:57:36 crc kubenswrapper[4918]: I0319 16:57:36.965044 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-rkhtw" Mar 19 16:57:36 crc kubenswrapper[4918]: I0319 16:57:36.965786 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-xl6zs" podStartSLOduration=7.028756925 podStartE2EDuration="23.965768387s" podCreationTimestamp="2026-03-19 16:57:13 +0000 UTC" firstStartedPulling="2026-03-19 16:57:14.381774198 +0000 UTC m=+1046.503973446" lastFinishedPulling="2026-03-19 16:57:31.31878566 +0000 UTC m=+1063.440984908" observedRunningTime="2026-03-19 16:57:36.964182794 +0000 UTC m=+1069.086382042" watchObservedRunningTime="2026-03-19 16:57:36.965768387 +0000 UTC m=+1069.087967635" Mar 19 16:57:36 crc kubenswrapper[4918]: I0319 16:57:36.967803 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-nmkqj" event={"ID":"b059bb35-4870-4982-b62f-e70ffd0270d2","Type":"ContainerStarted","Data":"8eb03d4dd1034e8391e9ee36480637dc001f268e5bfcaa3348e31e0149be4e71"} Mar 19 16:57:36 crc kubenswrapper[4918]: I0319 16:57:36.969072 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-nmkqj" Mar 19 16:57:36 crc kubenswrapper[4918]: I0319 16:57:36.985030 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8tqxl" event={"ID":"64edd3c9-61ea-4fc0-9a74-95a27c4bffc9","Type":"ContainerStarted","Data":"c7b45cfcc9da70e38d904684be097b27443c159ab5f69f4549ffc73ae53d01de"} Mar 19 16:57:36 crc kubenswrapper[4918]: I0319 16:57:36.985340 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8tqxl" Mar 19 16:57:36 crc kubenswrapper[4918]: I0319 16:57:36.992195 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-sgwx5" event={"ID":"34d2b03c-e63e-425a-a48a-6c9c97508add","Type":"ContainerStarted","Data":"d895032f59bbc9c9a3ed5169d664a42d91617eaab8ee7eefee69ae2391294751"} Mar 19 16:57:36 crc kubenswrapper[4918]: I0319 16:57:36.992934 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-sgwx5" Mar 19 16:57:36 crc kubenswrapper[4918]: I0319 16:57:36.993180 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-rkhtw" podStartSLOduration=4.885042093 podStartE2EDuration="23.993166227s" podCreationTimestamp="2026-03-19 16:57:13 +0000 UTC" firstStartedPulling="2026-03-19 16:57:14.906935817 +0000 UTC m=+1047.029135065" 
lastFinishedPulling="2026-03-19 16:57:34.015059951 +0000 UTC m=+1066.137259199" observedRunningTime="2026-03-19 16:57:36.990259967 +0000 UTC m=+1069.112459215" watchObservedRunningTime="2026-03-19 16:57:36.993166227 +0000 UTC m=+1069.115365475" Mar 19 16:57:37 crc kubenswrapper[4918]: I0319 16:57:37.003023 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-k2f4b" event={"ID":"e8c75c9e-0913-485c-a5fe-9c9bf6e4bc53","Type":"ContainerStarted","Data":"6fb07435eeaa37f9169dd9c00bb6562dac1900b90707da4772de23d20829ba93"} Mar 19 16:57:37 crc kubenswrapper[4918]: I0319 16:57:37.003751 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-k2f4b" Mar 19 16:57:37 crc kubenswrapper[4918]: I0319 16:57:37.015841 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8tqxl" podStartSLOduration=5.421401027 podStartE2EDuration="24.015827086s" podCreationTimestamp="2026-03-19 16:57:13 +0000 UTC" firstStartedPulling="2026-03-19 16:57:14.916490518 +0000 UTC m=+1047.038689766" lastFinishedPulling="2026-03-19 16:57:33.510916577 +0000 UTC m=+1065.633115825" observedRunningTime="2026-03-19 16:57:37.012572817 +0000 UTC m=+1069.134772065" watchObservedRunningTime="2026-03-19 16:57:37.015827086 +0000 UTC m=+1069.138026334" Mar 19 16:57:37 crc kubenswrapper[4918]: I0319 16:57:37.016790 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-c98pr" event={"ID":"d06a3c13-6323-4fef-9aec-101be98e242b","Type":"ContainerStarted","Data":"d6e8bc7e7d65b30a87f24a33a56fef00a5bb109e9941337addb6e51fb6d85432"} Mar 19 16:57:37 crc kubenswrapper[4918]: I0319 16:57:37.038945 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-8b6nq" event={"ID":"b8633109-a56d-4535-b603-f75c257cb093","Type":"ContainerStarted","Data":"419fc644fd716ae2c4673ffafcc8087df2b7482b0ec36901010b9fcaa07a576d"} Mar 19 16:57:37 crc kubenswrapper[4918]: I0319 16:57:37.040484 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-8b6nq" Mar 19 16:57:37 crc kubenswrapper[4918]: I0319 16:57:37.077470 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-nmkqj" podStartSLOduration=5.8783007959999996 podStartE2EDuration="25.077450861s" podCreationTimestamp="2026-03-19 16:57:12 +0000 UTC" firstStartedPulling="2026-03-19 16:57:14.816682378 +0000 UTC m=+1046.938881626" lastFinishedPulling="2026-03-19 16:57:34.015832433 +0000 UTC m=+1066.138031691" observedRunningTime="2026-03-19 16:57:37.040606754 +0000 UTC m=+1069.162806002" watchObservedRunningTime="2026-03-19 16:57:37.077450861 +0000 UTC m=+1069.199650109" Mar 19 16:57:37 crc kubenswrapper[4918]: I0319 16:57:37.079649 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-8b6nq" podStartSLOduration=5.394366608 podStartE2EDuration="24.079640311s" podCreationTimestamp="2026-03-19 16:57:13 +0000 UTC" firstStartedPulling="2026-03-19 16:57:14.826314312 +0000 UTC m=+1046.948513560" lastFinishedPulling="2026-03-19 16:57:33.511588015 +0000 UTC m=+1065.633787263" observedRunningTime="2026-03-19 16:57:37.072327231 +0000 UTC m=+1069.194526479" watchObservedRunningTime="2026-03-19 16:57:37.079640311 +0000 UTC m=+1069.201839559" Mar 19 16:57:37 crc kubenswrapper[4918]: I0319 16:57:37.094405 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-k2f4b" 
podStartSLOduration=4.652589535 podStartE2EDuration="24.094386824s" podCreationTimestamp="2026-03-19 16:57:13 +0000 UTC" firstStartedPulling="2026-03-19 16:57:14.574033024 +0000 UTC m=+1046.696232272" lastFinishedPulling="2026-03-19 16:57:34.015830313 +0000 UTC m=+1066.138029561" observedRunningTime="2026-03-19 16:57:37.091153496 +0000 UTC m=+1069.213352744" watchObservedRunningTime="2026-03-19 16:57:37.094386824 +0000 UTC m=+1069.216586072" Mar 19 16:57:37 crc kubenswrapper[4918]: I0319 16:57:37.128492 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-sgwx5" podStartSLOduration=5.429251912 podStartE2EDuration="24.128477476s" podCreationTimestamp="2026-03-19 16:57:13 +0000 UTC" firstStartedPulling="2026-03-19 16:57:14.811646951 +0000 UTC m=+1046.933846199" lastFinishedPulling="2026-03-19 16:57:33.510872515 +0000 UTC m=+1065.633071763" observedRunningTime="2026-03-19 16:57:37.123476149 +0000 UTC m=+1069.245675397" watchObservedRunningTime="2026-03-19 16:57:37.128477476 +0000 UTC m=+1069.250676724" Mar 19 16:57:38 crc kubenswrapper[4918]: I0319 16:57:38.054980 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5dbck" event={"ID":"c2eafc06-6df4-440d-820f-aad17b6061d7","Type":"ContainerStarted","Data":"78234b69bec1b48f11ab8387d5484cb2f52ba983e7e9ab02ba67657cdcb1a637"} Mar 19 16:57:38 crc kubenswrapper[4918]: I0319 16:57:38.055093 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5dbck" Mar 19 16:57:38 crc kubenswrapper[4918]: I0319 16:57:38.061025 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6jtnt" 
event={"ID":"6d031265-e265-412e-931e-52ca3bb940b6","Type":"ContainerStarted","Data":"c7bf881fa614fbb67a9025f45fade59849ec6503ffa87ad32bf2d570904b55ff"} Mar 19 16:57:38 crc kubenswrapper[4918]: I0319 16:57:38.061346 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6jtnt" Mar 19 16:57:38 crc kubenswrapper[4918]: I0319 16:57:38.069563 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-smt77" event={"ID":"fea29376-0fd1-419c-ae47-68b1c7a355e3","Type":"ContainerStarted","Data":"025cca764e436f1e4860a3612be211e89d9c82d0560f7d46f836261c0f43758d"} Mar 19 16:57:38 crc kubenswrapper[4918]: I0319 16:57:38.074726 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-wg98k" event={"ID":"4ed700e4-0a35-4c6c-b57a-cde49d5f816c","Type":"ContainerStarted","Data":"ab6c0970790e2c7b167495c14844d05c427579d16ff4dbe2fac004d2c937f255"} Mar 19 16:57:38 crc kubenswrapper[4918]: I0319 16:57:38.074988 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-wg98k" Mar 19 16:57:38 crc kubenswrapper[4918]: I0319 16:57:38.081656 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5dbck" podStartSLOduration=5.893950736 podStartE2EDuration="25.081639426s" podCreationTimestamp="2026-03-19 16:57:13 +0000 UTC" firstStartedPulling="2026-03-19 16:57:14.828240775 +0000 UTC m=+1046.950440023" lastFinishedPulling="2026-03-19 16:57:34.015929465 +0000 UTC m=+1066.138128713" observedRunningTime="2026-03-19 16:57:38.073202486 +0000 UTC m=+1070.195401734" watchObservedRunningTime="2026-03-19 16:57:38.081639426 +0000 UTC m=+1070.203838674" Mar 19 16:57:38 crc kubenswrapper[4918]: I0319 16:57:38.086193 4918 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-9w6lf" event={"ID":"35067b62-32eb-4cb2-8fbd-91b82c4a38cb","Type":"ContainerStarted","Data":"02217546b42ef7353c9655bfd19d5bd335531c988dc9fbddfd0fe0ef86e4d666"} Mar 19 16:57:38 crc kubenswrapper[4918]: I0319 16:57:38.086425 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-9w6lf" Mar 19 16:57:38 crc kubenswrapper[4918]: I0319 16:57:38.089469 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-rkhtw" event={"ID":"230477fa-ce49-4e4d-a0a0-5bf2538c5192","Type":"ContainerStarted","Data":"ec02637a844b56ff36fea2ac68a88aac2e608db4eea46646a2f68dcb37640223"} Mar 19 16:57:38 crc kubenswrapper[4918]: I0319 16:57:38.092495 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-26r58" event={"ID":"7be38652-3021-4349-be08-4759ee13141b","Type":"ContainerStarted","Data":"a6c621509abad7511dbc99f6f2e667ef7d61f80006e252bef916294b3ef52469"} Mar 19 16:57:38 crc kubenswrapper[4918]: I0319 16:57:38.092779 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-26r58" Mar 19 16:57:38 crc kubenswrapper[4918]: I0319 16:57:38.093985 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-w78t5" event={"ID":"e9f74e44-e78b-4b23-b409-89af31c2dc82","Type":"ContainerStarted","Data":"0f58bb0705b9e507483965abfed5a71f371e9aaf1474dcf1c3c508a1450e68eb"} Mar 19 16:57:38 crc kubenswrapper[4918]: I0319 16:57:38.094637 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-w78t5" Mar 19 16:57:38 crc kubenswrapper[4918]: 
I0319 16:57:38.106885 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-8sxgk" event={"ID":"f0824526-4903-49f0-bfcb-17298cc84eb6","Type":"ContainerStarted","Data":"c2fe6b6bdc84a0401dbe298987dbaf21af2cdc99b71b28d31ae9467142901a72"} Mar 19 16:57:38 crc kubenswrapper[4918]: I0319 16:57:38.107919 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-8sxgk" Mar 19 16:57:38 crc kubenswrapper[4918]: I0319 16:57:38.133372 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-smt77" podStartSLOduration=3.80717457 podStartE2EDuration="25.13335478s" podCreationTimestamp="2026-03-19 16:57:13 +0000 UTC" firstStartedPulling="2026-03-19 16:57:15.298555603 +0000 UTC m=+1047.420754851" lastFinishedPulling="2026-03-19 16:57:36.624735793 +0000 UTC m=+1068.746935061" observedRunningTime="2026-03-19 16:57:38.12712282 +0000 UTC m=+1070.249322068" watchObservedRunningTime="2026-03-19 16:57:38.13335478 +0000 UTC m=+1070.255554028" Mar 19 16:57:38 crc kubenswrapper[4918]: I0319 16:57:38.152840 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6jtnt" podStartSLOduration=4.081336938 podStartE2EDuration="25.152822543s" podCreationTimestamp="2026-03-19 16:57:13 +0000 UTC" firstStartedPulling="2026-03-19 16:57:15.279280907 +0000 UTC m=+1047.401480155" lastFinishedPulling="2026-03-19 16:57:36.350766512 +0000 UTC m=+1068.472965760" observedRunningTime="2026-03-19 16:57:38.148409792 +0000 UTC m=+1070.270609040" watchObservedRunningTime="2026-03-19 16:57:38.152822543 +0000 UTC m=+1070.275021791" Mar 19 16:57:38 crc kubenswrapper[4918]: I0319 16:57:38.154903 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-fv2wb" event={"ID":"e54e9cfd-b6fe-4b00-a12f-20b153f05710","Type":"ContainerStarted","Data":"afdebce828a74d2e02cc66c04b9fe53e5c2608a809a3607b36af86d48c179fa5"} Mar 19 16:57:38 crc kubenswrapper[4918]: I0319 16:57:38.155333 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-fv2wb" Mar 19 16:57:38 crc kubenswrapper[4918]: I0319 16:57:38.190014 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-w78t5" podStartSLOduration=7.081766271 podStartE2EDuration="26.189993459s" podCreationTimestamp="2026-03-19 16:57:12 +0000 UTC" firstStartedPulling="2026-03-19 16:57:14.907616195 +0000 UTC m=+1047.029815443" lastFinishedPulling="2026-03-19 16:57:34.015843383 +0000 UTC m=+1066.138042631" observedRunningTime="2026-03-19 16:57:38.184731555 +0000 UTC m=+1070.306930823" watchObservedRunningTime="2026-03-19 16:57:38.189993459 +0000 UTC m=+1070.312192707" Mar 19 16:57:38 crc kubenswrapper[4918]: I0319 16:57:38.212153 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-8sxgk" podStartSLOduration=4.137768991 podStartE2EDuration="25.212112644s" podCreationTimestamp="2026-03-19 16:57:13 +0000 UTC" firstStartedPulling="2026-03-19 16:57:15.276599114 +0000 UTC m=+1047.398798362" lastFinishedPulling="2026-03-19 16:57:36.350942757 +0000 UTC m=+1068.473142015" observedRunningTime="2026-03-19 16:57:38.207862198 +0000 UTC m=+1070.330061436" watchObservedRunningTime="2026-03-19 16:57:38.212112644 +0000 UTC m=+1070.334311892" Mar 19 16:57:38 crc kubenswrapper[4918]: I0319 16:57:38.231989 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-26r58" 
podStartSLOduration=4.142462638 podStartE2EDuration="25.231974187s" podCreationTimestamp="2026-03-19 16:57:13 +0000 UTC" firstStartedPulling="2026-03-19 16:57:15.259229908 +0000 UTC m=+1047.381429156" lastFinishedPulling="2026-03-19 16:57:36.348741457 +0000 UTC m=+1068.470940705" observedRunningTime="2026-03-19 16:57:38.227067903 +0000 UTC m=+1070.349267151" watchObservedRunningTime="2026-03-19 16:57:38.231974187 +0000 UTC m=+1070.354173435" Mar 19 16:57:38 crc kubenswrapper[4918]: I0319 16:57:38.244370 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-9w6lf" podStartSLOduration=4.790353593 podStartE2EDuration="25.244354646s" podCreationTimestamp="2026-03-19 16:57:13 +0000 UTC" firstStartedPulling="2026-03-19 16:57:15.246398938 +0000 UTC m=+1047.368598186" lastFinishedPulling="2026-03-19 16:57:35.700399991 +0000 UTC m=+1067.822599239" observedRunningTime="2026-03-19 16:57:38.243846451 +0000 UTC m=+1070.366045699" watchObservedRunningTime="2026-03-19 16:57:38.244354646 +0000 UTC m=+1070.366553894" Mar 19 16:57:38 crc kubenswrapper[4918]: I0319 16:57:38.271363 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-wg98k" podStartSLOduration=7.063006689 podStartE2EDuration="26.271345114s" podCreationTimestamp="2026-03-19 16:57:12 +0000 UTC" firstStartedPulling="2026-03-19 16:57:14.807633631 +0000 UTC m=+1046.929832879" lastFinishedPulling="2026-03-19 16:57:34.015972056 +0000 UTC m=+1066.138171304" observedRunningTime="2026-03-19 16:57:38.265510524 +0000 UTC m=+1070.387709772" watchObservedRunningTime="2026-03-19 16:57:38.271345114 +0000 UTC m=+1070.393544352" Mar 19 16:57:38 crc kubenswrapper[4918]: I0319 16:57:38.289736 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-fv2wb" 
podStartSLOduration=4.824393253 podStartE2EDuration="25.289717746s" podCreationTimestamp="2026-03-19 16:57:13 +0000 UTC" firstStartedPulling="2026-03-19 16:57:15.298331107 +0000 UTC m=+1047.420530355" lastFinishedPulling="2026-03-19 16:57:35.76365556 +0000 UTC m=+1067.885854848" observedRunningTime="2026-03-19 16:57:38.28438692 +0000 UTC m=+1070.406586188" watchObservedRunningTime="2026-03-19 16:57:38.289717746 +0000 UTC m=+1070.411916994" Mar 19 16:57:41 crc kubenswrapper[4918]: I0319 16:57:41.182775 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-c98pr" event={"ID":"d06a3c13-6323-4fef-9aec-101be98e242b","Type":"ContainerStarted","Data":"dd1fb62ac5eeb81ff91f39d3ee8101d1f8ed77e0adc4cbba368d5446c1a7f22e"} Mar 19 16:57:41 crc kubenswrapper[4918]: I0319 16:57:41.183355 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-c98pr" Mar 19 16:57:41 crc kubenswrapper[4918]: I0319 16:57:41.186230 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-msx8p" event={"ID":"3d624150-7673-43db-b503-ec532c7c00ca","Type":"ContainerStarted","Data":"678430e726278c570f549c17deab336331e47b475d3e4bc1a932f4db498c6866"} Mar 19 16:57:41 crc kubenswrapper[4918]: I0319 16:57:41.186494 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-msx8p" Mar 19 16:57:41 crc kubenswrapper[4918]: I0319 16:57:41.233177 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-c98pr" podStartSLOduration=24.459159588 podStartE2EDuration="28.233151463s" podCreationTimestamp="2026-03-19 16:57:13 +0000 UTC" firstStartedPulling="2026-03-19 16:57:36.34627215 +0000 UTC 
m=+1068.468471398" lastFinishedPulling="2026-03-19 16:57:40.120264025 +0000 UTC m=+1072.242463273" observedRunningTime="2026-03-19 16:57:41.218438181 +0000 UTC m=+1073.340637449" watchObservedRunningTime="2026-03-19 16:57:41.233151463 +0000 UTC m=+1073.355350731" Mar 19 16:57:41 crc kubenswrapper[4918]: I0319 16:57:41.249629 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-msx8p" podStartSLOduration=24.479558336 podStartE2EDuration="28.249614564s" podCreationTimestamp="2026-03-19 16:57:13 +0000 UTC" firstStartedPulling="2026-03-19 16:57:36.346037813 +0000 UTC m=+1068.468237061" lastFinishedPulling="2026-03-19 16:57:40.116094041 +0000 UTC m=+1072.238293289" observedRunningTime="2026-03-19 16:57:41.245511891 +0000 UTC m=+1073.367711139" watchObservedRunningTime="2026-03-19 16:57:41.249614564 +0000 UTC m=+1073.371813822" Mar 19 16:57:43 crc kubenswrapper[4918]: I0319 16:57:43.337946 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-ql64p" Mar 19 16:57:43 crc kubenswrapper[4918]: I0319 16:57:43.376021 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-xl6zs" Mar 19 16:57:43 crc kubenswrapper[4918]: I0319 16:57:43.381988 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-k2f4b" Mar 19 16:57:43 crc kubenswrapper[4918]: I0319 16:57:43.545165 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-sgwx5" Mar 19 16:57:43 crc kubenswrapper[4918]: I0319 16:57:43.545862 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-8b6nq" 
Mar 19 16:57:43 crc kubenswrapper[4918]: I0319 16:57:43.576618 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-wg98k" Mar 19 16:57:43 crc kubenswrapper[4918]: I0319 16:57:43.602799 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-w78t5" Mar 19 16:57:43 crc kubenswrapper[4918]: I0319 16:57:43.605060 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5dbck" Mar 19 16:57:43 crc kubenswrapper[4918]: I0319 16:57:43.610214 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-nmkqj" Mar 19 16:57:43 crc kubenswrapper[4918]: I0319 16:57:43.724927 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-rkhtw" Mar 19 16:57:43 crc kubenswrapper[4918]: I0319 16:57:43.777804 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8tqxl" Mar 19 16:57:43 crc kubenswrapper[4918]: I0319 16:57:43.919139 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-26r58" Mar 19 16:57:44 crc kubenswrapper[4918]: I0319 16:57:44.122161 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6jtnt" Mar 19 16:57:44 crc kubenswrapper[4918]: I0319 16:57:44.172200 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-8sxgk" Mar 19 16:57:44 crc kubenswrapper[4918]: I0319 
16:57:44.212461 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-9w6lf" Mar 19 16:57:44 crc kubenswrapper[4918]: I0319 16:57:44.230300 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-fv2wb" Mar 19 16:57:45 crc kubenswrapper[4918]: I0319 16:57:45.224542 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-78877dc965-4vmdw" event={"ID":"c77d9bca-3548-4c60-aa31-ad1a70dac2f1","Type":"ContainerStarted","Data":"a3c1002bdd19815f54913b6f68c16a889134898ca7c136ce64425e801939733d"} Mar 19 16:57:45 crc kubenswrapper[4918]: I0319 16:57:45.225108 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-78877dc965-4vmdw" Mar 19 16:57:45 crc kubenswrapper[4918]: I0319 16:57:45.246689 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-78877dc965-4vmdw" podStartSLOduration=2.58806149 podStartE2EDuration="32.246669739s" podCreationTimestamp="2026-03-19 16:57:13 +0000 UTC" firstStartedPulling="2026-03-19 16:57:15.24209572 +0000 UTC m=+1047.364294968" lastFinishedPulling="2026-03-19 16:57:44.900703929 +0000 UTC m=+1077.022903217" observedRunningTime="2026-03-19 16:57:45.245116416 +0000 UTC m=+1077.367315704" watchObservedRunningTime="2026-03-19 16:57:45.246669739 +0000 UTC m=+1077.368868997" Mar 19 16:57:45 crc kubenswrapper[4918]: I0319 16:57:45.967620 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-webhook-certs\") pod \"openstack-operator-controller-manager-6c7d9f85c5-vtcc8\" (UID: \"2f1da636-ea7f-4828-b896-ec1c81c92623\") " 
pod="openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8" Mar 19 16:57:45 crc kubenswrapper[4918]: I0319 16:57:45.973877 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2f1da636-ea7f-4828-b896-ec1c81c92623-webhook-certs\") pod \"openstack-operator-controller-manager-6c7d9f85c5-vtcc8\" (UID: \"2f1da636-ea7f-4828-b896-ec1c81c92623\") " pod="openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8" Mar 19 16:57:46 crc kubenswrapper[4918]: I0319 16:57:46.093498 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8" Mar 19 16:57:46 crc kubenswrapper[4918]: I0319 16:57:46.245941 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-w7rjd" event={"ID":"d9f56510-e25f-4de5-85b6-3030e989d13d","Type":"ContainerStarted","Data":"4c19165016727fe7bae749dba5a98a4e3584d9e6167dcb6b0620f2762b956528"} Mar 19 16:57:46 crc kubenswrapper[4918]: I0319 16:57:46.246569 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-w7rjd" Mar 19 16:57:46 crc kubenswrapper[4918]: I0319 16:57:46.261943 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-w7rjd" podStartSLOduration=2.677705941 podStartE2EDuration="33.261884486s" podCreationTimestamp="2026-03-19 16:57:13 +0000 UTC" firstStartedPulling="2026-03-19 16:57:14.828030829 +0000 UTC m=+1046.950230077" lastFinishedPulling="2026-03-19 16:57:45.412209354 +0000 UTC m=+1077.534408622" observedRunningTime="2026-03-19 16:57:46.260364144 +0000 UTC m=+1078.382563392" watchObservedRunningTime="2026-03-19 16:57:46.261884486 +0000 UTC m=+1078.384083734" Mar 19 16:57:46 crc kubenswrapper[4918]: I0319 
16:57:46.618972 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8"] Mar 19 16:57:47 crc kubenswrapper[4918]: I0319 16:57:47.254104 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8" event={"ID":"2f1da636-ea7f-4828-b896-ec1c81c92623","Type":"ContainerStarted","Data":"d7b54544665bdef3d232c1da805bfa7152db0beb80e0cb84dc8fbc5a23d7c8b5"} Mar 19 16:57:47 crc kubenswrapper[4918]: I0319 16:57:47.254396 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8" event={"ID":"2f1da636-ea7f-4828-b896-ec1c81c92623","Type":"ContainerStarted","Data":"fc49a231fe46f170df114f8e1b4ba12af9af8a624bf6ca920f9817b467e258cb"} Mar 19 16:57:47 crc kubenswrapper[4918]: I0319 16:57:47.254416 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8" Mar 19 16:57:47 crc kubenswrapper[4918]: I0319 16:57:47.288342 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8" podStartSLOduration=34.28832452 podStartE2EDuration="34.28832452s" podCreationTimestamp="2026-03-19 16:57:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:57:47.282508321 +0000 UTC m=+1079.404707579" watchObservedRunningTime="2026-03-19 16:57:47.28832452 +0000 UTC m=+1079.410523778" Mar 19 16:57:48 crc kubenswrapper[4918]: I0319 16:57:48.263344 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-svspw" 
event={"ID":"85790727-be18-4730-81e1-84022d4cead2","Type":"ContainerStarted","Data":"ca5df3fe85bcfc18c54bfc852264df95163366dadf62e711537310b3d257ed01"} Mar 19 16:57:48 crc kubenswrapper[4918]: I0319 16:57:48.263887 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-svspw" Mar 19 16:57:48 crc kubenswrapper[4918]: I0319 16:57:48.280596 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-svspw" podStartSLOduration=2.485769794 podStartE2EDuration="35.28057733s" podCreationTimestamp="2026-03-19 16:57:13 +0000 UTC" firstStartedPulling="2026-03-19 16:57:15.231640685 +0000 UTC m=+1047.353839933" lastFinishedPulling="2026-03-19 16:57:48.026448211 +0000 UTC m=+1080.148647469" observedRunningTime="2026-03-19 16:57:48.276295373 +0000 UTC m=+1080.398494621" watchObservedRunningTime="2026-03-19 16:57:48.28057733 +0000 UTC m=+1080.402776578" Mar 19 16:57:49 crc kubenswrapper[4918]: I0319 16:57:49.025888 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-msx8p" Mar 19 16:57:49 crc kubenswrapper[4918]: I0319 16:57:49.867185 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-c98pr" Mar 19 16:57:53 crc kubenswrapper[4918]: I0319 16:57:53.847326 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-w7rjd" Mar 19 16:57:54 crc kubenswrapper[4918]: I0319 16:57:54.103988 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-svspw" Mar 19 16:57:54 crc kubenswrapper[4918]: I0319 16:57:54.188209 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/telemetry-operator-controller-manager-78877dc965-4vmdw" Mar 19 16:57:56 crc kubenswrapper[4918]: I0319 16:57:56.103686 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6c7d9f85c5-vtcc8" Mar 19 16:58:00 crc kubenswrapper[4918]: I0319 16:58:00.192209 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565658-jggqm"] Mar 19 16:58:00 crc kubenswrapper[4918]: I0319 16:58:00.193479 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565658-jggqm" Mar 19 16:58:00 crc kubenswrapper[4918]: I0319 16:58:00.196426 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 16:58:00 crc kubenswrapper[4918]: I0319 16:58:00.196677 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 16:58:00 crc kubenswrapper[4918]: I0319 16:58:00.196915 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 16:58:00 crc kubenswrapper[4918]: I0319 16:58:00.212111 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565658-jggqm"] Mar 19 16:58:00 crc kubenswrapper[4918]: I0319 16:58:00.280101 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtz5j\" (UniqueName: \"kubernetes.io/projected/c59e2850-c18c-4082-bb7c-22509bd97ec2-kube-api-access-qtz5j\") pod \"auto-csr-approver-29565658-jggqm\" (UID: \"c59e2850-c18c-4082-bb7c-22509bd97ec2\") " pod="openshift-infra/auto-csr-approver-29565658-jggqm" Mar 19 16:58:00 crc kubenswrapper[4918]: I0319 16:58:00.381684 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtz5j\" 
(UniqueName: \"kubernetes.io/projected/c59e2850-c18c-4082-bb7c-22509bd97ec2-kube-api-access-qtz5j\") pod \"auto-csr-approver-29565658-jggqm\" (UID: \"c59e2850-c18c-4082-bb7c-22509bd97ec2\") " pod="openshift-infra/auto-csr-approver-29565658-jggqm" Mar 19 16:58:00 crc kubenswrapper[4918]: I0319 16:58:00.412013 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtz5j\" (UniqueName: \"kubernetes.io/projected/c59e2850-c18c-4082-bb7c-22509bd97ec2-kube-api-access-qtz5j\") pod \"auto-csr-approver-29565658-jggqm\" (UID: \"c59e2850-c18c-4082-bb7c-22509bd97ec2\") " pod="openshift-infra/auto-csr-approver-29565658-jggqm" Mar 19 16:58:00 crc kubenswrapper[4918]: I0319 16:58:00.510611 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565658-jggqm" Mar 19 16:58:01 crc kubenswrapper[4918]: I0319 16:58:01.024961 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565658-jggqm"] Mar 19 16:58:01 crc kubenswrapper[4918]: W0319 16:58:01.037699 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc59e2850_c18c_4082_bb7c_22509bd97ec2.slice/crio-88dfcf7542fb3d4ffa4ff8463c10959c33f9955bfdcb4e97ba4c39b6f353652d WatchSource:0}: Error finding container 88dfcf7542fb3d4ffa4ff8463c10959c33f9955bfdcb4e97ba4c39b6f353652d: Status 404 returned error can't find the container with id 88dfcf7542fb3d4ffa4ff8463c10959c33f9955bfdcb4e97ba4c39b6f353652d Mar 19 16:58:01 crc kubenswrapper[4918]: I0319 16:58:01.040166 4918 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 16:58:01 crc kubenswrapper[4918]: I0319 16:58:01.392260 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565658-jggqm" 
event={"ID":"c59e2850-c18c-4082-bb7c-22509bd97ec2","Type":"ContainerStarted","Data":"88dfcf7542fb3d4ffa4ff8463c10959c33f9955bfdcb4e97ba4c39b6f353652d"} Mar 19 16:58:06 crc kubenswrapper[4918]: I0319 16:58:06.452348 4918 generic.go:334] "Generic (PLEG): container finished" podID="c59e2850-c18c-4082-bb7c-22509bd97ec2" containerID="c0dcd069a9d5237045e8b71001f7a58ce8822b0ab47acff231f1095eab3c6168" exitCode=0 Mar 19 16:58:06 crc kubenswrapper[4918]: I0319 16:58:06.452439 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565658-jggqm" event={"ID":"c59e2850-c18c-4082-bb7c-22509bd97ec2","Type":"ContainerDied","Data":"c0dcd069a9d5237045e8b71001f7a58ce8822b0ab47acff231f1095eab3c6168"} Mar 19 16:58:07 crc kubenswrapper[4918]: I0319 16:58:07.802564 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565658-jggqm" Mar 19 16:58:07 crc kubenswrapper[4918]: I0319 16:58:07.899168 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtz5j\" (UniqueName: \"kubernetes.io/projected/c59e2850-c18c-4082-bb7c-22509bd97ec2-kube-api-access-qtz5j\") pod \"c59e2850-c18c-4082-bb7c-22509bd97ec2\" (UID: \"c59e2850-c18c-4082-bb7c-22509bd97ec2\") " Mar 19 16:58:07 crc kubenswrapper[4918]: I0319 16:58:07.909205 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c59e2850-c18c-4082-bb7c-22509bd97ec2-kube-api-access-qtz5j" (OuterVolumeSpecName: "kube-api-access-qtz5j") pod "c59e2850-c18c-4082-bb7c-22509bd97ec2" (UID: "c59e2850-c18c-4082-bb7c-22509bd97ec2"). InnerVolumeSpecName "kube-api-access-qtz5j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:58:08 crc kubenswrapper[4918]: I0319 16:58:08.001345 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtz5j\" (UniqueName: \"kubernetes.io/projected/c59e2850-c18c-4082-bb7c-22509bd97ec2-kube-api-access-qtz5j\") on node \"crc\" DevicePath \"\"" Mar 19 16:58:08 crc kubenswrapper[4918]: I0319 16:58:08.473447 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565658-jggqm" event={"ID":"c59e2850-c18c-4082-bb7c-22509bd97ec2","Type":"ContainerDied","Data":"88dfcf7542fb3d4ffa4ff8463c10959c33f9955bfdcb4e97ba4c39b6f353652d"} Mar 19 16:58:08 crc kubenswrapper[4918]: I0319 16:58:08.473496 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88dfcf7542fb3d4ffa4ff8463c10959c33f9955bfdcb4e97ba4c39b6f353652d" Mar 19 16:58:08 crc kubenswrapper[4918]: I0319 16:58:08.473605 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565658-jggqm" Mar 19 16:58:08 crc kubenswrapper[4918]: I0319 16:58:08.874801 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565652-wcs4m"] Mar 19 16:58:08 crc kubenswrapper[4918]: I0319 16:58:08.880183 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565652-wcs4m"] Mar 19 16:58:10 crc kubenswrapper[4918]: I0319 16:58:10.601303 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31ec09cf-288d-4ac1-ab4a-9027d53ae433" path="/var/lib/kubelet/pods/31ec09cf-288d-4ac1-ab4a-9027d53ae433/volumes" Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.335386 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4ngxj"] Mar 19 16:58:13 crc kubenswrapper[4918]: E0319 16:58:13.336103 4918 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c59e2850-c18c-4082-bb7c-22509bd97ec2" containerName="oc" Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.336121 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="c59e2850-c18c-4082-bb7c-22509bd97ec2" containerName="oc" Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.336330 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="c59e2850-c18c-4082-bb7c-22509bd97ec2" containerName="oc" Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.337194 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-4ngxj" Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.341099 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.341415 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.341547 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.343647 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-qcwpf" Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.350434 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4ngxj"] Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.385403 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vzfl\" (UniqueName: \"kubernetes.io/projected/b8c2a8ee-9d90-46c1-907e-78b27718ac68-kube-api-access-4vzfl\") pod \"dnsmasq-dns-675f4bcbfc-4ngxj\" (UID: \"b8c2a8ee-9d90-46c1-907e-78b27718ac68\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4ngxj" Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.385567 4918 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8c2a8ee-9d90-46c1-907e-78b27718ac68-config\") pod \"dnsmasq-dns-675f4bcbfc-4ngxj\" (UID: \"b8c2a8ee-9d90-46c1-907e-78b27718ac68\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4ngxj" Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.402394 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9ktcs"] Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.403741 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-9ktcs" Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.411302 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.413767 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9ktcs"] Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.486991 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8c2a8ee-9d90-46c1-907e-78b27718ac68-config\") pod \"dnsmasq-dns-675f4bcbfc-4ngxj\" (UID: \"b8c2a8ee-9d90-46c1-907e-78b27718ac68\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4ngxj" Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.487075 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-9ktcs\" (UID: \"c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9ktcs" Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.487120 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vzfl\" (UniqueName: \"kubernetes.io/projected/b8c2a8ee-9d90-46c1-907e-78b27718ac68-kube-api-access-4vzfl\") pod 
\"dnsmasq-dns-675f4bcbfc-4ngxj\" (UID: \"b8c2a8ee-9d90-46c1-907e-78b27718ac68\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4ngxj" Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.487142 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92-config\") pod \"dnsmasq-dns-78dd6ddcc-9ktcs\" (UID: \"c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9ktcs" Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.487174 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jbkr\" (UniqueName: \"kubernetes.io/projected/c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92-kube-api-access-4jbkr\") pod \"dnsmasq-dns-78dd6ddcc-9ktcs\" (UID: \"c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9ktcs" Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.488056 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8c2a8ee-9d90-46c1-907e-78b27718ac68-config\") pod \"dnsmasq-dns-675f4bcbfc-4ngxj\" (UID: \"b8c2a8ee-9d90-46c1-907e-78b27718ac68\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4ngxj" Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.507714 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vzfl\" (UniqueName: \"kubernetes.io/projected/b8c2a8ee-9d90-46c1-907e-78b27718ac68-kube-api-access-4vzfl\") pod \"dnsmasq-dns-675f4bcbfc-4ngxj\" (UID: \"b8c2a8ee-9d90-46c1-907e-78b27718ac68\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4ngxj" Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.588261 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jbkr\" (UniqueName: \"kubernetes.io/projected/c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92-kube-api-access-4jbkr\") pod 
\"dnsmasq-dns-78dd6ddcc-9ktcs\" (UID: \"c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9ktcs" Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.588360 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-9ktcs\" (UID: \"c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9ktcs" Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.588394 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92-config\") pod \"dnsmasq-dns-78dd6ddcc-9ktcs\" (UID: \"c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9ktcs" Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.589268 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92-config\") pod \"dnsmasq-dns-78dd6ddcc-9ktcs\" (UID: \"c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9ktcs" Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.589351 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-9ktcs\" (UID: \"c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9ktcs" Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.605943 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jbkr\" (UniqueName: \"kubernetes.io/projected/c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92-kube-api-access-4jbkr\") pod \"dnsmasq-dns-78dd6ddcc-9ktcs\" (UID: \"c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9ktcs" Mar 19 
16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.657632 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-4ngxj" Mar 19 16:58:13 crc kubenswrapper[4918]: I0319 16:58:13.718780 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-9ktcs" Mar 19 16:58:14 crc kubenswrapper[4918]: I0319 16:58:14.092761 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4ngxj"] Mar 19 16:58:14 crc kubenswrapper[4918]: W0319 16:58:14.169259 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc83b87b2_2fa8_4cd5_8496_2e38f6ea9f92.slice/crio-3fc9791dfdf857c76c8f59024b325783d531f3a17ec87f60376e8eecc13bcdaf WatchSource:0}: Error finding container 3fc9791dfdf857c76c8f59024b325783d531f3a17ec87f60376e8eecc13bcdaf: Status 404 returned error can't find the container with id 3fc9791dfdf857c76c8f59024b325783d531f3a17ec87f60376e8eecc13bcdaf Mar 19 16:58:14 crc kubenswrapper[4918]: I0319 16:58:14.170812 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9ktcs"] Mar 19 16:58:14 crc kubenswrapper[4918]: I0319 16:58:14.534356 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-4ngxj" event={"ID":"b8c2a8ee-9d90-46c1-907e-78b27718ac68","Type":"ContainerStarted","Data":"67cfe2c5de2cdc3f9139d5204c5af607138037a9ace6bb2641b5cbe88e413a71"} Mar 19 16:58:14 crc kubenswrapper[4918]: I0319 16:58:14.536645 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-9ktcs" event={"ID":"c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92","Type":"ContainerStarted","Data":"3fc9791dfdf857c76c8f59024b325783d531f3a17ec87f60376e8eecc13bcdaf"} Mar 19 16:58:16 crc kubenswrapper[4918]: I0319 16:58:16.199160 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-675f4bcbfc-4ngxj"] Mar 19 16:58:16 crc kubenswrapper[4918]: I0319 16:58:16.228397 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-gvgwb"] Mar 19 16:58:16 crc kubenswrapper[4918]: I0319 16:58:16.229597 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-gvgwb" Mar 19 16:58:16 crc kubenswrapper[4918]: I0319 16:58:16.244397 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-gvgwb"] Mar 19 16:58:16 crc kubenswrapper[4918]: I0319 16:58:16.330139 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91884a89-5ccc-40aa-953a-f1cef948a1f9-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-gvgwb\" (UID: \"91884a89-5ccc-40aa-953a-f1cef948a1f9\") " pod="openstack/dnsmasq-dns-5ccc8479f9-gvgwb" Mar 19 16:58:16 crc kubenswrapper[4918]: I0319 16:58:16.330222 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91884a89-5ccc-40aa-953a-f1cef948a1f9-config\") pod \"dnsmasq-dns-5ccc8479f9-gvgwb\" (UID: \"91884a89-5ccc-40aa-953a-f1cef948a1f9\") " pod="openstack/dnsmasq-dns-5ccc8479f9-gvgwb" Mar 19 16:58:16 crc kubenswrapper[4918]: I0319 16:58:16.330385 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z84ch\" (UniqueName: \"kubernetes.io/projected/91884a89-5ccc-40aa-953a-f1cef948a1f9-kube-api-access-z84ch\") pod \"dnsmasq-dns-5ccc8479f9-gvgwb\" (UID: \"91884a89-5ccc-40aa-953a-f1cef948a1f9\") " pod="openstack/dnsmasq-dns-5ccc8479f9-gvgwb" Mar 19 16:58:16 crc kubenswrapper[4918]: I0319 16:58:16.434257 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/91884a89-5ccc-40aa-953a-f1cef948a1f9-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-gvgwb\" (UID: \"91884a89-5ccc-40aa-953a-f1cef948a1f9\") " pod="openstack/dnsmasq-dns-5ccc8479f9-gvgwb" Mar 19 16:58:16 crc kubenswrapper[4918]: I0319 16:58:16.434335 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91884a89-5ccc-40aa-953a-f1cef948a1f9-config\") pod \"dnsmasq-dns-5ccc8479f9-gvgwb\" (UID: \"91884a89-5ccc-40aa-953a-f1cef948a1f9\") " pod="openstack/dnsmasq-dns-5ccc8479f9-gvgwb" Mar 19 16:58:16 crc kubenswrapper[4918]: I0319 16:58:16.434376 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z84ch\" (UniqueName: \"kubernetes.io/projected/91884a89-5ccc-40aa-953a-f1cef948a1f9-kube-api-access-z84ch\") pod \"dnsmasq-dns-5ccc8479f9-gvgwb\" (UID: \"91884a89-5ccc-40aa-953a-f1cef948a1f9\") " pod="openstack/dnsmasq-dns-5ccc8479f9-gvgwb" Mar 19 16:58:16 crc kubenswrapper[4918]: I0319 16:58:16.435695 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91884a89-5ccc-40aa-953a-f1cef948a1f9-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-gvgwb\" (UID: \"91884a89-5ccc-40aa-953a-f1cef948a1f9\") " pod="openstack/dnsmasq-dns-5ccc8479f9-gvgwb" Mar 19 16:58:16 crc kubenswrapper[4918]: I0319 16:58:16.436180 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91884a89-5ccc-40aa-953a-f1cef948a1f9-config\") pod \"dnsmasq-dns-5ccc8479f9-gvgwb\" (UID: \"91884a89-5ccc-40aa-953a-f1cef948a1f9\") " pod="openstack/dnsmasq-dns-5ccc8479f9-gvgwb" Mar 19 16:58:16 crc kubenswrapper[4918]: I0319 16:58:16.500352 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z84ch\" (UniqueName: \"kubernetes.io/projected/91884a89-5ccc-40aa-953a-f1cef948a1f9-kube-api-access-z84ch\") pod 
\"dnsmasq-dns-5ccc8479f9-gvgwb\" (UID: \"91884a89-5ccc-40aa-953a-f1cef948a1f9\") " pod="openstack/dnsmasq-dns-5ccc8479f9-gvgwb" Mar 19 16:58:16 crc kubenswrapper[4918]: I0319 16:58:16.567450 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-gvgwb" Mar 19 16:58:16 crc kubenswrapper[4918]: I0319 16:58:16.578485 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9ktcs"] Mar 19 16:58:16 crc kubenswrapper[4918]: I0319 16:58:16.601330 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bvjg9"] Mar 19 16:58:16 crc kubenswrapper[4918]: I0319 16:58:16.602392 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-bvjg9" Mar 19 16:58:16 crc kubenswrapper[4918]: I0319 16:58:16.609631 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bvjg9"] Mar 19 16:58:16 crc kubenswrapper[4918]: I0319 16:58:16.745798 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ee23ae1-641f-43be-a41f-2065671c4534-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-bvjg9\" (UID: \"0ee23ae1-641f-43be-a41f-2065671c4534\") " pod="openstack/dnsmasq-dns-57d769cc4f-bvjg9" Mar 19 16:58:16 crc kubenswrapper[4918]: I0319 16:58:16.745843 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw72j\" (UniqueName: \"kubernetes.io/projected/0ee23ae1-641f-43be-a41f-2065671c4534-kube-api-access-fw72j\") pod \"dnsmasq-dns-57d769cc4f-bvjg9\" (UID: \"0ee23ae1-641f-43be-a41f-2065671c4534\") " pod="openstack/dnsmasq-dns-57d769cc4f-bvjg9" Mar 19 16:58:16 crc kubenswrapper[4918]: I0319 16:58:16.745913 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0ee23ae1-641f-43be-a41f-2065671c4534-config\") pod \"dnsmasq-dns-57d769cc4f-bvjg9\" (UID: \"0ee23ae1-641f-43be-a41f-2065671c4534\") " pod="openstack/dnsmasq-dns-57d769cc4f-bvjg9" Mar 19 16:58:16 crc kubenswrapper[4918]: I0319 16:58:16.847074 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ee23ae1-641f-43be-a41f-2065671c4534-config\") pod \"dnsmasq-dns-57d769cc4f-bvjg9\" (UID: \"0ee23ae1-641f-43be-a41f-2065671c4534\") " pod="openstack/dnsmasq-dns-57d769cc4f-bvjg9" Mar 19 16:58:16 crc kubenswrapper[4918]: I0319 16:58:16.847143 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ee23ae1-641f-43be-a41f-2065671c4534-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-bvjg9\" (UID: \"0ee23ae1-641f-43be-a41f-2065671c4534\") " pod="openstack/dnsmasq-dns-57d769cc4f-bvjg9" Mar 19 16:58:16 crc kubenswrapper[4918]: I0319 16:58:16.847171 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw72j\" (UniqueName: \"kubernetes.io/projected/0ee23ae1-641f-43be-a41f-2065671c4534-kube-api-access-fw72j\") pod \"dnsmasq-dns-57d769cc4f-bvjg9\" (UID: \"0ee23ae1-641f-43be-a41f-2065671c4534\") " pod="openstack/dnsmasq-dns-57d769cc4f-bvjg9" Mar 19 16:58:16 crc kubenswrapper[4918]: I0319 16:58:16.848298 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ee23ae1-641f-43be-a41f-2065671c4534-config\") pod \"dnsmasq-dns-57d769cc4f-bvjg9\" (UID: \"0ee23ae1-641f-43be-a41f-2065671c4534\") " pod="openstack/dnsmasq-dns-57d769cc4f-bvjg9" Mar 19 16:58:16 crc kubenswrapper[4918]: I0319 16:58:16.848840 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ee23ae1-641f-43be-a41f-2065671c4534-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-bvjg9\" (UID: 
\"0ee23ae1-641f-43be-a41f-2065671c4534\") " pod="openstack/dnsmasq-dns-57d769cc4f-bvjg9" Mar 19 16:58:16 crc kubenswrapper[4918]: I0319 16:58:16.867036 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw72j\" (UniqueName: \"kubernetes.io/projected/0ee23ae1-641f-43be-a41f-2065671c4534-kube-api-access-fw72j\") pod \"dnsmasq-dns-57d769cc4f-bvjg9\" (UID: \"0ee23ae1-641f-43be-a41f-2065671c4534\") " pod="openstack/dnsmasq-dns-57d769cc4f-bvjg9" Mar 19 16:58:16 crc kubenswrapper[4918]: I0319 16:58:16.981467 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-bvjg9" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.073848 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-gvgwb"] Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.422239 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.424123 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.426591 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.426846 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.427295 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.427915 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.428287 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.428450 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-sc5lh" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.428666 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.433257 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bvjg9"] Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.442988 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.564663 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/849ee593-de3d-4343-8a63-3ca581fbbaaf-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" 
Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.564729 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/849ee593-de3d-4343-8a63-3ca581fbbaaf-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.564782 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c4d52560-ac12-4c23-8fd0-53b7abe5035a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c4d52560-ac12-4c23-8fd0-53b7abe5035a\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.564813 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/849ee593-de3d-4343-8a63-3ca581fbbaaf-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.564856 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/849ee593-de3d-4343-8a63-3ca581fbbaaf-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.564964 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm6q2\" (UniqueName: \"kubernetes.io/projected/849ee593-de3d-4343-8a63-3ca581fbbaaf-kube-api-access-mm6q2\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.565038 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/849ee593-de3d-4343-8a63-3ca581fbbaaf-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.565088 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/849ee593-de3d-4343-8a63-3ca581fbbaaf-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.565117 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/849ee593-de3d-4343-8a63-3ca581fbbaaf-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.565187 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/849ee593-de3d-4343-8a63-3ca581fbbaaf-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.565272 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/849ee593-de3d-4343-8a63-3ca581fbbaaf-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.577020 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-gvgwb" event={"ID":"91884a89-5ccc-40aa-953a-f1cef948a1f9","Type":"ContainerStarted","Data":"d8068de5ca57c772811db54781406410524709cd8c17fc21912325326be3336d"} Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.666440 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c4d52560-ac12-4c23-8fd0-53b7abe5035a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c4d52560-ac12-4c23-8fd0-53b7abe5035a\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.666514 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/849ee593-de3d-4343-8a63-3ca581fbbaaf-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.666590 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/849ee593-de3d-4343-8a63-3ca581fbbaaf-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.666651 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm6q2\" (UniqueName: \"kubernetes.io/projected/849ee593-de3d-4343-8a63-3ca581fbbaaf-kube-api-access-mm6q2\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc 
kubenswrapper[4918]: I0319 16:58:17.666686 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/849ee593-de3d-4343-8a63-3ca581fbbaaf-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.666717 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/849ee593-de3d-4343-8a63-3ca581fbbaaf-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.666744 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/849ee593-de3d-4343-8a63-3ca581fbbaaf-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.666786 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/849ee593-de3d-4343-8a63-3ca581fbbaaf-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.666811 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/849ee593-de3d-4343-8a63-3ca581fbbaaf-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.666860 4918 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/849ee593-de3d-4343-8a63-3ca581fbbaaf-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.666887 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/849ee593-de3d-4343-8a63-3ca581fbbaaf-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.670141 4918 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.670184 4918 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c4d52560-ac12-4c23-8fd0-53b7abe5035a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c4d52560-ac12-4c23-8fd0-53b7abe5035a\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f0fb5bb69065b19e05ed9740b98b0aaf97ec80d366f84bfe0013a4f3714e457d/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.670995 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/849ee593-de3d-4343-8a63-3ca581fbbaaf-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.671316 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/849ee593-de3d-4343-8a63-3ca581fbbaaf-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.673229 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/849ee593-de3d-4343-8a63-3ca581fbbaaf-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.673926 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/849ee593-de3d-4343-8a63-3ca581fbbaaf-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.674725 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/849ee593-de3d-4343-8a63-3ca581fbbaaf-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.675462 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/849ee593-de3d-4343-8a63-3ca581fbbaaf-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.687293 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/849ee593-de3d-4343-8a63-3ca581fbbaaf-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.688634 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/849ee593-de3d-4343-8a63-3ca581fbbaaf-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.689958 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/849ee593-de3d-4343-8a63-3ca581fbbaaf-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.691615 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm6q2\" (UniqueName: \"kubernetes.io/projected/849ee593-de3d-4343-8a63-3ca581fbbaaf-kube-api-access-mm6q2\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.704043 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c4d52560-ac12-4c23-8fd0-53b7abe5035a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c4d52560-ac12-4c23-8fd0-53b7abe5035a\") pod \"rabbitmq-cell1-server-0\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.751665 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.752465 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.755598 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.758984 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.759190 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-xdp5h" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.768113 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.768584 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.768919 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.769178 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.769263 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.780807 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.871038 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/049bc86c-2172-4f37-b7b4-20e546c273e4-config-data\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 
16:58:17.871094 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e42ce485-e6ce-4799-b932-b106c6280e82\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e42ce485-e6ce-4799-b932-b106c6280e82\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.871179 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/049bc86c-2172-4f37-b7b4-20e546c273e4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.871214 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/049bc86c-2172-4f37-b7b4-20e546c273e4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.871247 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/049bc86c-2172-4f37-b7b4-20e546c273e4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.871283 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/049bc86c-2172-4f37-b7b4-20e546c273e4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.871314 4918 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/049bc86c-2172-4f37-b7b4-20e546c273e4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.871416 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/049bc86c-2172-4f37-b7b4-20e546c273e4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.871661 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs65s\" (UniqueName: \"kubernetes.io/projected/049bc86c-2172-4f37-b7b4-20e546c273e4-kube-api-access-gs65s\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.871759 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/049bc86c-2172-4f37-b7b4-20e546c273e4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.871805 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/049bc86c-2172-4f37-b7b4-20e546c273e4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.978279 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gs65s\" (UniqueName: \"kubernetes.io/projected/049bc86c-2172-4f37-b7b4-20e546c273e4-kube-api-access-gs65s\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.978596 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/049bc86c-2172-4f37-b7b4-20e546c273e4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.978629 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/049bc86c-2172-4f37-b7b4-20e546c273e4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.978657 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/049bc86c-2172-4f37-b7b4-20e546c273e4-config-data\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.978680 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e42ce485-e6ce-4799-b932-b106c6280e82\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e42ce485-e6ce-4799-b932-b106c6280e82\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.978739 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/049bc86c-2172-4f37-b7b4-20e546c273e4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.979404 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/049bc86c-2172-4f37-b7b4-20e546c273e4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.981202 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/049bc86c-2172-4f37-b7b4-20e546c273e4-config-data\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.981320 4918 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.981349 4918 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e42ce485-e6ce-4799-b932-b106c6280e82\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e42ce485-e6ce-4799-b932-b106c6280e82\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a3db0c9d310d3a68815aa4d626186578650c2baeddb5f523145e0eb7b8c277ef/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.981592 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/049bc86c-2172-4f37-b7b4-20e546c273e4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.981639 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/049bc86c-2172-4f37-b7b4-20e546c273e4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.981686 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/049bc86c-2172-4f37-b7b4-20e546c273e4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.981745 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/049bc86c-2172-4f37-b7b4-20e546c273e4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.981756 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/049bc86c-2172-4f37-b7b4-20e546c273e4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.981867 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/049bc86c-2172-4f37-b7b4-20e546c273e4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.982042 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/049bc86c-2172-4f37-b7b4-20e546c273e4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.983222 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/049bc86c-2172-4f37-b7b4-20e546c273e4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.983383 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/049bc86c-2172-4f37-b7b4-20e546c273e4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.984669 4918 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/049bc86c-2172-4f37-b7b4-20e546c273e4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.986435 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/049bc86c-2172-4f37-b7b4-20e546c273e4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.994549 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/049bc86c-2172-4f37-b7b4-20e546c273e4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:17 crc kubenswrapper[4918]: I0319 16:58:17.998606 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs65s\" (UniqueName: \"kubernetes.io/projected/049bc86c-2172-4f37-b7b4-20e546c273e4-kube-api-access-gs65s\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:18 crc kubenswrapper[4918]: I0319 16:58:18.013855 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e42ce485-e6ce-4799-b932-b106c6280e82\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e42ce485-e6ce-4799-b932-b106c6280e82\") pod \"rabbitmq-server-0\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " pod="openstack/rabbitmq-server-0" Mar 19 16:58:18 crc kubenswrapper[4918]: I0319 16:58:18.099563 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 16:58:18 crc kubenswrapper[4918]: I0319 16:58:18.886411 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 19 16:58:18 crc kubenswrapper[4918]: I0319 16:58:18.888601 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 19 16:58:18 crc kubenswrapper[4918]: I0319 16:58:18.890455 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-4jx95" Mar 19 16:58:18 crc kubenswrapper[4918]: I0319 16:58:18.890629 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 19 16:58:18 crc kubenswrapper[4918]: I0319 16:58:18.894317 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 19 16:58:18 crc kubenswrapper[4918]: I0319 16:58:18.895622 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 19 16:58:18 crc kubenswrapper[4918]: I0319 16:58:18.896365 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 19 16:58:18 crc kubenswrapper[4918]: I0319 16:58:18.899978 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 19 16:58:18 crc kubenswrapper[4918]: I0319 16:58:18.995264 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/22f16181-1900-453e-a97a-d3da7960a1cf-config-data-generated\") pod \"openstack-galera-0\" (UID: \"22f16181-1900-453e-a97a-d3da7960a1cf\") " pod="openstack/openstack-galera-0" Mar 19 16:58:18 crc kubenswrapper[4918]: I0319 16:58:18.995306 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5712ae32-c870-4267-a806-5860b9646ac8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5712ae32-c870-4267-a806-5860b9646ac8\") pod \"openstack-galera-0\" (UID: \"22f16181-1900-453e-a97a-d3da7960a1cf\") " pod="openstack/openstack-galera-0" Mar 19 16:58:18 crc kubenswrapper[4918]: I0319 16:58:18.995329 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/22f16181-1900-453e-a97a-d3da7960a1cf-kolla-config\") pod \"openstack-galera-0\" (UID: \"22f16181-1900-453e-a97a-d3da7960a1cf\") " pod="openstack/openstack-galera-0" Mar 19 16:58:18 crc kubenswrapper[4918]: I0319 16:58:18.995384 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22f16181-1900-453e-a97a-d3da7960a1cf-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"22f16181-1900-453e-a97a-d3da7960a1cf\") " pod="openstack/openstack-galera-0" Mar 19 16:58:18 crc kubenswrapper[4918]: I0319 16:58:18.995402 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn4sx\" (UniqueName: \"kubernetes.io/projected/22f16181-1900-453e-a97a-d3da7960a1cf-kube-api-access-nn4sx\") pod \"openstack-galera-0\" (UID: \"22f16181-1900-453e-a97a-d3da7960a1cf\") " pod="openstack/openstack-galera-0" Mar 19 16:58:18 crc kubenswrapper[4918]: I0319 16:58:18.995438 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/22f16181-1900-453e-a97a-d3da7960a1cf-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"22f16181-1900-453e-a97a-d3da7960a1cf\") " pod="openstack/openstack-galera-0" Mar 19 16:58:18 crc kubenswrapper[4918]: I0319 16:58:18.995466 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/22f16181-1900-453e-a97a-d3da7960a1cf-operator-scripts\") pod \"openstack-galera-0\" (UID: \"22f16181-1900-453e-a97a-d3da7960a1cf\") " pod="openstack/openstack-galera-0" Mar 19 16:58:18 crc kubenswrapper[4918]: I0319 16:58:18.995498 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/22f16181-1900-453e-a97a-d3da7960a1cf-config-data-default\") pod \"openstack-galera-0\" (UID: \"22f16181-1900-453e-a97a-d3da7960a1cf\") " pod="openstack/openstack-galera-0" Mar 19 16:58:19 crc kubenswrapper[4918]: I0319 16:58:19.096415 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/22f16181-1900-453e-a97a-d3da7960a1cf-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"22f16181-1900-453e-a97a-d3da7960a1cf\") " pod="openstack/openstack-galera-0" Mar 19 16:58:19 crc kubenswrapper[4918]: I0319 16:58:19.096480 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22f16181-1900-453e-a97a-d3da7960a1cf-operator-scripts\") pod \"openstack-galera-0\" (UID: \"22f16181-1900-453e-a97a-d3da7960a1cf\") " pod="openstack/openstack-galera-0" Mar 19 16:58:19 crc kubenswrapper[4918]: I0319 16:58:19.096594 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/22f16181-1900-453e-a97a-d3da7960a1cf-config-data-default\") pod \"openstack-galera-0\" (UID: \"22f16181-1900-453e-a97a-d3da7960a1cf\") " pod="openstack/openstack-galera-0" Mar 19 16:58:19 crc kubenswrapper[4918]: I0319 16:58:19.096631 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/22f16181-1900-453e-a97a-d3da7960a1cf-config-data-generated\") pod 
\"openstack-galera-0\" (UID: \"22f16181-1900-453e-a97a-d3da7960a1cf\") " pod="openstack/openstack-galera-0" Mar 19 16:58:19 crc kubenswrapper[4918]: I0319 16:58:19.096661 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5712ae32-c870-4267-a806-5860b9646ac8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5712ae32-c870-4267-a806-5860b9646ac8\") pod \"openstack-galera-0\" (UID: \"22f16181-1900-453e-a97a-d3da7960a1cf\") " pod="openstack/openstack-galera-0" Mar 19 16:58:19 crc kubenswrapper[4918]: I0319 16:58:19.096685 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/22f16181-1900-453e-a97a-d3da7960a1cf-kolla-config\") pod \"openstack-galera-0\" (UID: \"22f16181-1900-453e-a97a-d3da7960a1cf\") " pod="openstack/openstack-galera-0" Mar 19 16:58:19 crc kubenswrapper[4918]: I0319 16:58:19.096751 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22f16181-1900-453e-a97a-d3da7960a1cf-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"22f16181-1900-453e-a97a-d3da7960a1cf\") " pod="openstack/openstack-galera-0" Mar 19 16:58:19 crc kubenswrapper[4918]: I0319 16:58:19.096782 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn4sx\" (UniqueName: \"kubernetes.io/projected/22f16181-1900-453e-a97a-d3da7960a1cf-kube-api-access-nn4sx\") pod \"openstack-galera-0\" (UID: \"22f16181-1900-453e-a97a-d3da7960a1cf\") " pod="openstack/openstack-galera-0" Mar 19 16:58:19 crc kubenswrapper[4918]: I0319 16:58:19.097489 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/22f16181-1900-453e-a97a-d3da7960a1cf-config-data-generated\") pod \"openstack-galera-0\" (UID: \"22f16181-1900-453e-a97a-d3da7960a1cf\") " 
pod="openstack/openstack-galera-0" Mar 19 16:58:19 crc kubenswrapper[4918]: I0319 16:58:19.097503 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/22f16181-1900-453e-a97a-d3da7960a1cf-kolla-config\") pod \"openstack-galera-0\" (UID: \"22f16181-1900-453e-a97a-d3da7960a1cf\") " pod="openstack/openstack-galera-0" Mar 19 16:58:19 crc kubenswrapper[4918]: I0319 16:58:19.097834 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/22f16181-1900-453e-a97a-d3da7960a1cf-config-data-default\") pod \"openstack-galera-0\" (UID: \"22f16181-1900-453e-a97a-d3da7960a1cf\") " pod="openstack/openstack-galera-0" Mar 19 16:58:19 crc kubenswrapper[4918]: I0319 16:58:19.098466 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22f16181-1900-453e-a97a-d3da7960a1cf-operator-scripts\") pod \"openstack-galera-0\" (UID: \"22f16181-1900-453e-a97a-d3da7960a1cf\") " pod="openstack/openstack-galera-0" Mar 19 16:58:19 crc kubenswrapper[4918]: I0319 16:58:19.110419 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/22f16181-1900-453e-a97a-d3da7960a1cf-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"22f16181-1900-453e-a97a-d3da7960a1cf\") " pod="openstack/openstack-galera-0" Mar 19 16:58:19 crc kubenswrapper[4918]: I0319 16:58:19.110900 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22f16181-1900-453e-a97a-d3da7960a1cf-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"22f16181-1900-453e-a97a-d3da7960a1cf\") " pod="openstack/openstack-galera-0" Mar 19 16:58:19 crc kubenswrapper[4918]: I0319 16:58:19.111612 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nn4sx\" (UniqueName: \"kubernetes.io/projected/22f16181-1900-453e-a97a-d3da7960a1cf-kube-api-access-nn4sx\") pod \"openstack-galera-0\" (UID: \"22f16181-1900-453e-a97a-d3da7960a1cf\") " pod="openstack/openstack-galera-0" Mar 19 16:58:19 crc kubenswrapper[4918]: I0319 16:58:19.112358 4918 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 16:58:19 crc kubenswrapper[4918]: I0319 16:58:19.112386 4918 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5712ae32-c870-4267-a806-5860b9646ac8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5712ae32-c870-4267-a806-5860b9646ac8\") pod \"openstack-galera-0\" (UID: \"22f16181-1900-453e-a97a-d3da7960a1cf\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b70af1edbdf4203d47ea5d1e7f0480cceaf4bb88639fcdf98d50043d04d2a649/globalmount\"" pod="openstack/openstack-galera-0" Mar 19 16:58:19 crc kubenswrapper[4918]: I0319 16:58:19.159675 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5712ae32-c870-4267-a806-5860b9646ac8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5712ae32-c870-4267-a806-5860b9646ac8\") pod \"openstack-galera-0\" (UID: \"22f16181-1900-453e-a97a-d3da7960a1cf\") " pod="openstack/openstack-galera-0" Mar 19 16:58:19 crc kubenswrapper[4918]: I0319 16:58:19.224368 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.247081 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.249028 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.255367 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-969h8" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.255657 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.255674 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.257347 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.262375 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.318101 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31a9da2c-83a7-408e-bae2-66a7097081ff-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"31a9da2c-83a7-408e-bae2-66a7097081ff\") " pod="openstack/openstack-cell1-galera-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.318271 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b889f3b6-02f4-4416-8345-3918fadf6937\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b889f3b6-02f4-4416-8345-3918fadf6937\") pod \"openstack-cell1-galera-0\" (UID: \"31a9da2c-83a7-408e-bae2-66a7097081ff\") " pod="openstack/openstack-cell1-galera-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.318340 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/31a9da2c-83a7-408e-bae2-66a7097081ff-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"31a9da2c-83a7-408e-bae2-66a7097081ff\") " pod="openstack/openstack-cell1-galera-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.318401 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/31a9da2c-83a7-408e-bae2-66a7097081ff-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"31a9da2c-83a7-408e-bae2-66a7097081ff\") " pod="openstack/openstack-cell1-galera-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.318432 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4jrq\" (UniqueName: \"kubernetes.io/projected/31a9da2c-83a7-408e-bae2-66a7097081ff-kube-api-access-f4jrq\") pod \"openstack-cell1-galera-0\" (UID: \"31a9da2c-83a7-408e-bae2-66a7097081ff\") " pod="openstack/openstack-cell1-galera-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.318537 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/31a9da2c-83a7-408e-bae2-66a7097081ff-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"31a9da2c-83a7-408e-bae2-66a7097081ff\") " pod="openstack/openstack-cell1-galera-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.318593 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/31a9da2c-83a7-408e-bae2-66a7097081ff-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"31a9da2c-83a7-408e-bae2-66a7097081ff\") " pod="openstack/openstack-cell1-galera-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.318625 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a9da2c-83a7-408e-bae2-66a7097081ff-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"31a9da2c-83a7-408e-bae2-66a7097081ff\") " pod="openstack/openstack-cell1-galera-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.420576 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/31a9da2c-83a7-408e-bae2-66a7097081ff-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"31a9da2c-83a7-408e-bae2-66a7097081ff\") " pod="openstack/openstack-cell1-galera-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.420638 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a9da2c-83a7-408e-bae2-66a7097081ff-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"31a9da2c-83a7-408e-bae2-66a7097081ff\") " pod="openstack/openstack-cell1-galera-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.420699 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31a9da2c-83a7-408e-bae2-66a7097081ff-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"31a9da2c-83a7-408e-bae2-66a7097081ff\") " pod="openstack/openstack-cell1-galera-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.420779 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b889f3b6-02f4-4416-8345-3918fadf6937\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b889f3b6-02f4-4416-8345-3918fadf6937\") pod \"openstack-cell1-galera-0\" (UID: \"31a9da2c-83a7-408e-bae2-66a7097081ff\") " pod="openstack/openstack-cell1-galera-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.420822 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" 
(UniqueName: \"kubernetes.io/empty-dir/31a9da2c-83a7-408e-bae2-66a7097081ff-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"31a9da2c-83a7-408e-bae2-66a7097081ff\") " pod="openstack/openstack-cell1-galera-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.420864 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/31a9da2c-83a7-408e-bae2-66a7097081ff-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"31a9da2c-83a7-408e-bae2-66a7097081ff\") " pod="openstack/openstack-cell1-galera-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.420885 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4jrq\" (UniqueName: \"kubernetes.io/projected/31a9da2c-83a7-408e-bae2-66a7097081ff-kube-api-access-f4jrq\") pod \"openstack-cell1-galera-0\" (UID: \"31a9da2c-83a7-408e-bae2-66a7097081ff\") " pod="openstack/openstack-cell1-galera-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.420920 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/31a9da2c-83a7-408e-bae2-66a7097081ff-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"31a9da2c-83a7-408e-bae2-66a7097081ff\") " pod="openstack/openstack-cell1-galera-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.422115 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/31a9da2c-83a7-408e-bae2-66a7097081ff-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"31a9da2c-83a7-408e-bae2-66a7097081ff\") " pod="openstack/openstack-cell1-galera-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.422170 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/31a9da2c-83a7-408e-bae2-66a7097081ff-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"31a9da2c-83a7-408e-bae2-66a7097081ff\") " pod="openstack/openstack-cell1-galera-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.422986 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/31a9da2c-83a7-408e-bae2-66a7097081ff-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"31a9da2c-83a7-408e-bae2-66a7097081ff\") " pod="openstack/openstack-cell1-galera-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.423076 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/31a9da2c-83a7-408e-bae2-66a7097081ff-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"31a9da2c-83a7-408e-bae2-66a7097081ff\") " pod="openstack/openstack-cell1-galera-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.424139 4918 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.424176 4918 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b889f3b6-02f4-4416-8345-3918fadf6937\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b889f3b6-02f4-4416-8345-3918fadf6937\") pod \"openstack-cell1-galera-0\" (UID: \"31a9da2c-83a7-408e-bae2-66a7097081ff\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/30cd41ce09d076152c73ecf0f26ce5a8b47e3a9a1bfd0b63e1126f8ae8ea947d/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.432618 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a9da2c-83a7-408e-bae2-66a7097081ff-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"31a9da2c-83a7-408e-bae2-66a7097081ff\") " pod="openstack/openstack-cell1-galera-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.457292 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/31a9da2c-83a7-408e-bae2-66a7097081ff-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"31a9da2c-83a7-408e-bae2-66a7097081ff\") " pod="openstack/openstack-cell1-galera-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.462244 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4jrq\" (UniqueName: \"kubernetes.io/projected/31a9da2c-83a7-408e-bae2-66a7097081ff-kube-api-access-f4jrq\") pod \"openstack-cell1-galera-0\" (UID: \"31a9da2c-83a7-408e-bae2-66a7097081ff\") " pod="openstack/openstack-cell1-galera-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.479488 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b889f3b6-02f4-4416-8345-3918fadf6937\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b889f3b6-02f4-4416-8345-3918fadf6937\") pod \"openstack-cell1-galera-0\" (UID: \"31a9da2c-83a7-408e-bae2-66a7097081ff\") " pod="openstack/openstack-cell1-galera-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.587494 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.605001 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.606136 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.610585 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-76n6t" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.610697 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.611548 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.622544 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.724786 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0eab88c-d33a-4032-b2f7-f2a355157d81-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d0eab88c-d33a-4032-b2f7-f2a355157d81\") " pod="openstack/memcached-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.724843 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d0eab88c-d33a-4032-b2f7-f2a355157d81-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d0eab88c-d33a-4032-b2f7-f2a355157d81\") " pod="openstack/memcached-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.724886 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d0eab88c-d33a-4032-b2f7-f2a355157d81-kolla-config\") pod \"memcached-0\" (UID: \"d0eab88c-d33a-4032-b2f7-f2a355157d81\") " pod="openstack/memcached-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.724961 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0eab88c-d33a-4032-b2f7-f2a355157d81-config-data\") pod \"memcached-0\" (UID: \"d0eab88c-d33a-4032-b2f7-f2a355157d81\") " pod="openstack/memcached-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.725012 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvmm2\" (UniqueName: \"kubernetes.io/projected/d0eab88c-d33a-4032-b2f7-f2a355157d81-kube-api-access-fvmm2\") pod \"memcached-0\" (UID: \"d0eab88c-d33a-4032-b2f7-f2a355157d81\") " pod="openstack/memcached-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.826420 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0eab88c-d33a-4032-b2f7-f2a355157d81-config-data\") pod \"memcached-0\" (UID: \"d0eab88c-d33a-4032-b2f7-f2a355157d81\") " pod="openstack/memcached-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.826460 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvmm2\" (UniqueName: \"kubernetes.io/projected/d0eab88c-d33a-4032-b2f7-f2a355157d81-kube-api-access-fvmm2\") pod \"memcached-0\" (UID: \"d0eab88c-d33a-4032-b2f7-f2a355157d81\") " 
pod="openstack/memcached-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.826511 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0eab88c-d33a-4032-b2f7-f2a355157d81-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d0eab88c-d33a-4032-b2f7-f2a355157d81\") " pod="openstack/memcached-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.826549 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0eab88c-d33a-4032-b2f7-f2a355157d81-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d0eab88c-d33a-4032-b2f7-f2a355157d81\") " pod="openstack/memcached-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.826575 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d0eab88c-d33a-4032-b2f7-f2a355157d81-kolla-config\") pod \"memcached-0\" (UID: \"d0eab88c-d33a-4032-b2f7-f2a355157d81\") " pod="openstack/memcached-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.827439 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d0eab88c-d33a-4032-b2f7-f2a355157d81-kolla-config\") pod \"memcached-0\" (UID: \"d0eab88c-d33a-4032-b2f7-f2a355157d81\") " pod="openstack/memcached-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.827963 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0eab88c-d33a-4032-b2f7-f2a355157d81-config-data\") pod \"memcached-0\" (UID: \"d0eab88c-d33a-4032-b2f7-f2a355157d81\") " pod="openstack/memcached-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.830498 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d0eab88c-d33a-4032-b2f7-f2a355157d81-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d0eab88c-d33a-4032-b2f7-f2a355157d81\") " pod="openstack/memcached-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.841723 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvmm2\" (UniqueName: \"kubernetes.io/projected/d0eab88c-d33a-4032-b2f7-f2a355157d81-kube-api-access-fvmm2\") pod \"memcached-0\" (UID: \"d0eab88c-d33a-4032-b2f7-f2a355157d81\") " pod="openstack/memcached-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.845514 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0eab88c-d33a-4032-b2f7-f2a355157d81-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d0eab88c-d33a-4032-b2f7-f2a355157d81\") " pod="openstack/memcached-0" Mar 19 16:58:20 crc kubenswrapper[4918]: I0319 16:58:20.933475 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 19 16:58:21 crc kubenswrapper[4918]: W0319 16:58:21.686830 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ee23ae1_641f_43be_a41f_2065671c4534.slice/crio-6e0b98fde94bac296fa8100f01da7b9a82357f7efbb85fb6697f1cfc75c95ecd WatchSource:0}: Error finding container 6e0b98fde94bac296fa8100f01da7b9a82357f7efbb85fb6697f1cfc75c95ecd: Status 404 returned error can't find the container with id 6e0b98fde94bac296fa8100f01da7b9a82357f7efbb85fb6697f1cfc75c95ecd Mar 19 16:58:22 crc kubenswrapper[4918]: I0319 16:58:22.621090 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bvjg9" event={"ID":"0ee23ae1-641f-43be-a41f-2065671c4534","Type":"ContainerStarted","Data":"6e0b98fde94bac296fa8100f01da7b9a82357f7efbb85fb6697f1cfc75c95ecd"} Mar 19 16:58:22 crc kubenswrapper[4918]: I0319 16:58:22.700972 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 16:58:22 crc kubenswrapper[4918]: I0319 16:58:22.701904 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 16:58:22 crc kubenswrapper[4918]: I0319 16:58:22.704454 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-2c8c7" Mar 19 16:58:22 crc kubenswrapper[4918]: I0319 16:58:22.731621 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 16:58:22 crc kubenswrapper[4918]: I0319 16:58:22.762543 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sndc\" (UniqueName: \"kubernetes.io/projected/1818f96e-6152-49a9-b6fc-726d7677112c-kube-api-access-7sndc\") pod \"kube-state-metrics-0\" (UID: \"1818f96e-6152-49a9-b6fc-726d7677112c\") " pod="openstack/kube-state-metrics-0" Mar 19 16:58:22 crc kubenswrapper[4918]: I0319 16:58:22.867305 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sndc\" (UniqueName: \"kubernetes.io/projected/1818f96e-6152-49a9-b6fc-726d7677112c-kube-api-access-7sndc\") pod \"kube-state-metrics-0\" (UID: \"1818f96e-6152-49a9-b6fc-726d7677112c\") " pod="openstack/kube-state-metrics-0" Mar 19 16:58:22 crc kubenswrapper[4918]: I0319 16:58:22.914552 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sndc\" (UniqueName: \"kubernetes.io/projected/1818f96e-6152-49a9-b6fc-726d7677112c-kube-api-access-7sndc\") pod \"kube-state-metrics-0\" (UID: \"1818f96e-6152-49a9-b6fc-726d7677112c\") " pod="openstack/kube-state-metrics-0" Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.019488 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.483141 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.486204 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.488842 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.489063 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.489650 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-6zjfl" Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.489790 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.490814 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.508907 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.577489 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fd357519-ae6b-45ec-a8e1-dfc0c060be13-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"fd357519-ae6b-45ec-a8e1-dfc0c060be13\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.577668 4918 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/fd357519-ae6b-45ec-a8e1-dfc0c060be13-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"fd357519-ae6b-45ec-a8e1-dfc0c060be13\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.577700 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m8jq\" (UniqueName: \"kubernetes.io/projected/fd357519-ae6b-45ec-a8e1-dfc0c060be13-kube-api-access-5m8jq\") pod \"alertmanager-metric-storage-0\" (UID: \"fd357519-ae6b-45ec-a8e1-dfc0c060be13\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.577733 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fd357519-ae6b-45ec-a8e1-dfc0c060be13-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"fd357519-ae6b-45ec-a8e1-dfc0c060be13\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.577754 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fd357519-ae6b-45ec-a8e1-dfc0c060be13-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"fd357519-ae6b-45ec-a8e1-dfc0c060be13\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.577773 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fd357519-ae6b-45ec-a8e1-dfc0c060be13-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"fd357519-ae6b-45ec-a8e1-dfc0c060be13\") " 
pod="openstack/alertmanager-metric-storage-0" Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.577790 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fd357519-ae6b-45ec-a8e1-dfc0c060be13-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"fd357519-ae6b-45ec-a8e1-dfc0c060be13\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.679277 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/fd357519-ae6b-45ec-a8e1-dfc0c060be13-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"fd357519-ae6b-45ec-a8e1-dfc0c060be13\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.679340 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m8jq\" (UniqueName: \"kubernetes.io/projected/fd357519-ae6b-45ec-a8e1-dfc0c060be13-kube-api-access-5m8jq\") pod \"alertmanager-metric-storage-0\" (UID: \"fd357519-ae6b-45ec-a8e1-dfc0c060be13\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.679390 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fd357519-ae6b-45ec-a8e1-dfc0c060be13-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"fd357519-ae6b-45ec-a8e1-dfc0c060be13\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.679421 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fd357519-ae6b-45ec-a8e1-dfc0c060be13-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"fd357519-ae6b-45ec-a8e1-dfc0c060be13\") " 
pod="openstack/alertmanager-metric-storage-0" Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.679448 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fd357519-ae6b-45ec-a8e1-dfc0c060be13-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"fd357519-ae6b-45ec-a8e1-dfc0c060be13\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.679470 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fd357519-ae6b-45ec-a8e1-dfc0c060be13-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"fd357519-ae6b-45ec-a8e1-dfc0c060be13\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.679507 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fd357519-ae6b-45ec-a8e1-dfc0c060be13-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"fd357519-ae6b-45ec-a8e1-dfc0c060be13\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.680967 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/fd357519-ae6b-45ec-a8e1-dfc0c060be13-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"fd357519-ae6b-45ec-a8e1-dfc0c060be13\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.685671 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fd357519-ae6b-45ec-a8e1-dfc0c060be13-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"fd357519-ae6b-45ec-a8e1-dfc0c060be13\") " pod="openstack/alertmanager-metric-storage-0" Mar 
19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.686135 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fd357519-ae6b-45ec-a8e1-dfc0c060be13-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"fd357519-ae6b-45ec-a8e1-dfc0c060be13\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.686830 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fd357519-ae6b-45ec-a8e1-dfc0c060be13-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"fd357519-ae6b-45ec-a8e1-dfc0c060be13\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.687222 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fd357519-ae6b-45ec-a8e1-dfc0c060be13-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"fd357519-ae6b-45ec-a8e1-dfc0c060be13\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.697559 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fd357519-ae6b-45ec-a8e1-dfc0c060be13-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"fd357519-ae6b-45ec-a8e1-dfc0c060be13\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.701897 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m8jq\" (UniqueName: \"kubernetes.io/projected/fd357519-ae6b-45ec-a8e1-dfc0c060be13-kube-api-access-5m8jq\") pod \"alertmanager-metric-storage-0\" (UID: \"fd357519-ae6b-45ec-a8e1-dfc0c060be13\") " pod="openstack/alertmanager-metric-storage-0" Mar 19 16:58:23 crc kubenswrapper[4918]: I0319 16:58:23.813016 4918 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.568560 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.578247 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.580505 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.581378 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.581479 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.581497 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.581542 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.581545 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.581623 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.581706 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-4lmwq"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.582089 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.740373 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/08c86067-0c7f-47a2-a2d4-e29ad43c539f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.740438 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/08c86067-0c7f-47a2-a2d4-e29ad43c539f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.740490 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/08c86067-0c7f-47a2-a2d4-e29ad43c539f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.740505 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhmjl\" (UniqueName: \"kubernetes.io/projected/08c86067-0c7f-47a2-a2d4-e29ad43c539f-kube-api-access-hhmjl\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.741070 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/08c86067-0c7f-47a2-a2d4-e29ad43c539f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.741096 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/08c86067-0c7f-47a2-a2d4-e29ad43c539f-config\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.741134 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/08c86067-0c7f-47a2-a2d4-e29ad43c539f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.741162 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/08c86067-0c7f-47a2-a2d4-e29ad43c539f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.741192 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7119ee90-7cbd-4991-aa6c-6a80c4e5a1a3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7119ee90-7cbd-4991-aa6c-6a80c4e5a1a3\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.741221 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/08c86067-0c7f-47a2-a2d4-e29ad43c539f-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.844635 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/08c86067-0c7f-47a2-a2d4-e29ad43c539f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.844991 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/08c86067-0c7f-47a2-a2d4-e29ad43c539f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.845036 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhmjl\" (UniqueName: \"kubernetes.io/projected/08c86067-0c7f-47a2-a2d4-e29ad43c539f-kube-api-access-hhmjl\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.845054 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/08c86067-0c7f-47a2-a2d4-e29ad43c539f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.845073 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/08c86067-0c7f-47a2-a2d4-e29ad43c539f-config\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.845115 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/08c86067-0c7f-47a2-a2d4-e29ad43c539f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.845138 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/08c86067-0c7f-47a2-a2d4-e29ad43c539f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.845193 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7119ee90-7cbd-4991-aa6c-6a80c4e5a1a3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7119ee90-7cbd-4991-aa6c-6a80c4e5a1a3\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.845220 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/08c86067-0c7f-47a2-a2d4-e29ad43c539f-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.845301 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/08c86067-0c7f-47a2-a2d4-e29ad43c539f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.846357 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/08c86067-0c7f-47a2-a2d4-e29ad43c539f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.846990 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/08c86067-0c7f-47a2-a2d4-e29ad43c539f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.848343 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/08c86067-0c7f-47a2-a2d4-e29ad43c539f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.848538 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/08c86067-0c7f-47a2-a2d4-e29ad43c539f-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.850980 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/08c86067-0c7f-47a2-a2d4-e29ad43c539f-config\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.851284 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/08c86067-0c7f-47a2-a2d4-e29ad43c539f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.851474 4918 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.851505 4918 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7119ee90-7cbd-4991-aa6c-6a80c4e5a1a3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7119ee90-7cbd-4991-aa6c-6a80c4e5a1a3\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/532e7b4e9da3b88b0a1999a686ac8b131a30bc0c7a14e430eb07e3f1ae4f0fac/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.853254 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/08c86067-0c7f-47a2-a2d4-e29ad43c539f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.853792 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/08c86067-0c7f-47a2-a2d4-e29ad43c539f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.862743 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhmjl\" (UniqueName: \"kubernetes.io/projected/08c86067-0c7f-47a2-a2d4-e29ad43c539f-kube-api-access-hhmjl\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.887509 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7119ee90-7cbd-4991-aa6c-6a80c4e5a1a3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7119ee90-7cbd-4991-aa6c-6a80c4e5a1a3\") pod \"prometheus-metric-storage-0\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:24 crc kubenswrapper[4918]: I0319 16:58:24.905507 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.537501 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.539159 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.541174 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.541290 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.541400 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.542391 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-xwgk9"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.546957 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.563541 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.573350 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-4g569"]
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.574616 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4g569"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.577322 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-f66c8"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.577742 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.577873 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.630058 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4g569"]
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.630088 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-kt2zs"]
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.631566 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-kt2zs"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.651601 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-kt2zs"]
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.676412 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e1232f6-b41e-443e-b96e-e38929f077d4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"5e1232f6-b41e-443e-b96e-e38929f077d4\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.676477 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfnpr\" (UniqueName: \"kubernetes.io/projected/ddfeeb53-dd69-430f-9460-fa20627d4d26-kube-api-access-kfnpr\") pod \"ovn-controller-4g569\" (UID: \"ddfeeb53-dd69-430f-9460-fa20627d4d26\") " pod="openstack/ovn-controller-4g569"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.676506 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-36c51053-c4d5-4f97-9672-566b48887280\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-36c51053-c4d5-4f97-9672-566b48887280\") pod \"ovsdbserver-nb-0\" (UID: \"5e1232f6-b41e-443e-b96e-e38929f077d4\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.676562 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddfeeb53-dd69-430f-9460-fa20627d4d26-ovn-controller-tls-certs\") pod \"ovn-controller-4g569\" (UID: \"ddfeeb53-dd69-430f-9460-fa20627d4d26\") " pod="openstack/ovn-controller-4g569"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.676603 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ddfeeb53-dd69-430f-9460-fa20627d4d26-var-log-ovn\") pod \"ovn-controller-4g569\" (UID: \"ddfeeb53-dd69-430f-9460-fa20627d4d26\") " pod="openstack/ovn-controller-4g569"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.676637 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e1232f6-b41e-443e-b96e-e38929f077d4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"5e1232f6-b41e-443e-b96e-e38929f077d4\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.676668 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e1232f6-b41e-443e-b96e-e38929f077d4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5e1232f6-b41e-443e-b96e-e38929f077d4\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.676711 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5e1232f6-b41e-443e-b96e-e38929f077d4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"5e1232f6-b41e-443e-b96e-e38929f077d4\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.676731 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ddfeeb53-dd69-430f-9460-fa20627d4d26-var-run\") pod \"ovn-controller-4g569\" (UID: \"ddfeeb53-dd69-430f-9460-fa20627d4d26\") " pod="openstack/ovn-controller-4g569"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.676765 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhkr6\" (UniqueName: \"kubernetes.io/projected/5e1232f6-b41e-443e-b96e-e38929f077d4-kube-api-access-qhkr6\") pod \"ovsdbserver-nb-0\" (UID: \"5e1232f6-b41e-443e-b96e-e38929f077d4\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.677050 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ddfeeb53-dd69-430f-9460-fa20627d4d26-var-run-ovn\") pod \"ovn-controller-4g569\" (UID: \"ddfeeb53-dd69-430f-9460-fa20627d4d26\") " pod="openstack/ovn-controller-4g569"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.677135 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e1232f6-b41e-443e-b96e-e38929f077d4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5e1232f6-b41e-443e-b96e-e38929f077d4\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.677203 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddfeeb53-dd69-430f-9460-fa20627d4d26-combined-ca-bundle\") pod \"ovn-controller-4g569\" (UID: \"ddfeeb53-dd69-430f-9460-fa20627d4d26\") " pod="openstack/ovn-controller-4g569"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.677232 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddfeeb53-dd69-430f-9460-fa20627d4d26-scripts\") pod \"ovn-controller-4g569\" (UID: \"ddfeeb53-dd69-430f-9460-fa20627d4d26\") " pod="openstack/ovn-controller-4g569"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.677289 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e1232f6-b41e-443e-b96e-e38929f077d4-config\") pod \"ovsdbserver-nb-0\" (UID: \"5e1232f6-b41e-443e-b96e-e38929f077d4\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.779273 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfnpr\" (UniqueName: \"kubernetes.io/projected/ddfeeb53-dd69-430f-9460-fa20627d4d26-kube-api-access-kfnpr\") pod \"ovn-controller-4g569\" (UID: \"ddfeeb53-dd69-430f-9460-fa20627d4d26\") " pod="openstack/ovn-controller-4g569"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.779316 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-36c51053-c4d5-4f97-9672-566b48887280\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-36c51053-c4d5-4f97-9672-566b48887280\") pod \"ovsdbserver-nb-0\" (UID: \"5e1232f6-b41e-443e-b96e-e38929f077d4\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.779339 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddfeeb53-dd69-430f-9460-fa20627d4d26-ovn-controller-tls-certs\") pod \"ovn-controller-4g569\" (UID: \"ddfeeb53-dd69-430f-9460-fa20627d4d26\") " pod="openstack/ovn-controller-4g569"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.779372 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ddfeeb53-dd69-430f-9460-fa20627d4d26-var-log-ovn\") pod \"ovn-controller-4g569\" (UID: \"ddfeeb53-dd69-430f-9460-fa20627d4d26\") " pod="openstack/ovn-controller-4g569"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.779401 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stgv6\" (UniqueName: \"kubernetes.io/projected/13525212-7d91-453f-a80d-2e6a8febb21e-kube-api-access-stgv6\") pod \"ovn-controller-ovs-kt2zs\" (UID: \"13525212-7d91-453f-a80d-2e6a8febb21e\") " pod="openstack/ovn-controller-ovs-kt2zs"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.779429 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e1232f6-b41e-443e-b96e-e38929f077d4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"5e1232f6-b41e-443e-b96e-e38929f077d4\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.779453 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e1232f6-b41e-443e-b96e-e38929f077d4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5e1232f6-b41e-443e-b96e-e38929f077d4\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.779495 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5e1232f6-b41e-443e-b96e-e38929f077d4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"5e1232f6-b41e-443e-b96e-e38929f077d4\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.779517 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ddfeeb53-dd69-430f-9460-fa20627d4d26-var-run\") pod \"ovn-controller-4g569\" (UID: \"ddfeeb53-dd69-430f-9460-fa20627d4d26\") " pod="openstack/ovn-controller-4g569"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.779594 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/13525212-7d91-453f-a80d-2e6a8febb21e-etc-ovs\") pod \"ovn-controller-ovs-kt2zs\" (UID: \"13525212-7d91-453f-a80d-2e6a8febb21e\") " pod="openstack/ovn-controller-ovs-kt2zs"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.779623 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhkr6\" (UniqueName: \"kubernetes.io/projected/5e1232f6-b41e-443e-b96e-e38929f077d4-kube-api-access-qhkr6\") pod \"ovsdbserver-nb-0\" (UID: \"5e1232f6-b41e-443e-b96e-e38929f077d4\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.779654 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ddfeeb53-dd69-430f-9460-fa20627d4d26-var-run-ovn\") pod \"ovn-controller-4g569\" (UID: \"ddfeeb53-dd69-430f-9460-fa20627d4d26\") " pod="openstack/ovn-controller-4g569"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.779690 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e1232f6-b41e-443e-b96e-e38929f077d4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5e1232f6-b41e-443e-b96e-e38929f077d4\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.779746 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddfeeb53-dd69-430f-9460-fa20627d4d26-combined-ca-bundle\") pod \"ovn-controller-4g569\" (UID: \"ddfeeb53-dd69-430f-9460-fa20627d4d26\") " pod="openstack/ovn-controller-4g569"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.779776 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddfeeb53-dd69-430f-9460-fa20627d4d26-scripts\") pod \"ovn-controller-4g569\" (UID: \"ddfeeb53-dd69-430f-9460-fa20627d4d26\") " pod="openstack/ovn-controller-4g569"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.779809 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13525212-7d91-453f-a80d-2e6a8febb21e-scripts\") pod \"ovn-controller-ovs-kt2zs\" (UID: \"13525212-7d91-453f-a80d-2e6a8febb21e\") " pod="openstack/ovn-controller-ovs-kt2zs"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.779862 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e1232f6-b41e-443e-b96e-e38929f077d4-config\") pod \"ovsdbserver-nb-0\" (UID: \"5e1232f6-b41e-443e-b96e-e38929f077d4\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.779884 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/13525212-7d91-453f-a80d-2e6a8febb21e-var-log\") pod \"ovn-controller-ovs-kt2zs\" (UID: \"13525212-7d91-453f-a80d-2e6a8febb21e\") " pod="openstack/ovn-controller-ovs-kt2zs"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.779914 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/13525212-7d91-453f-a80d-2e6a8febb21e-var-run\") pod \"ovn-controller-ovs-kt2zs\" (UID: \"13525212-7d91-453f-a80d-2e6a8febb21e\") " pod="openstack/ovn-controller-ovs-kt2zs"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.779932 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e1232f6-b41e-443e-b96e-e38929f077d4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"5e1232f6-b41e-443e-b96e-e38929f077d4\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.779949 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/13525212-7d91-453f-a80d-2e6a8febb21e-var-lib\") pod \"ovn-controller-ovs-kt2zs\" (UID: \"13525212-7d91-453f-a80d-2e6a8febb21e\") " pod="openstack/ovn-controller-ovs-kt2zs"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.780649 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5e1232f6-b41e-443e-b96e-e38929f077d4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"5e1232f6-b41e-443e-b96e-e38929f077d4\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.780659 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ddfeeb53-dd69-430f-9460-fa20627d4d26-var-log-ovn\") pod \"ovn-controller-4g569\" (UID: \"ddfeeb53-dd69-430f-9460-fa20627d4d26\") " pod="openstack/ovn-controller-4g569"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.780714 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ddfeeb53-dd69-430f-9460-fa20627d4d26-var-run\") pod \"ovn-controller-4g569\" (UID: \"ddfeeb53-dd69-430f-9460-fa20627d4d26\") " pod="openstack/ovn-controller-4g569"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.780773 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ddfeeb53-dd69-430f-9460-fa20627d4d26-var-run-ovn\") pod \"ovn-controller-4g569\" (UID: \"ddfeeb53-dd69-430f-9460-fa20627d4d26\") " pod="openstack/ovn-controller-4g569"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.781217 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e1232f6-b41e-443e-b96e-e38929f077d4-config\") pod \"ovsdbserver-nb-0\" (UID: \"5e1232f6-b41e-443e-b96e-e38929f077d4\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.781669 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e1232f6-b41e-443e-b96e-e38929f077d4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"5e1232f6-b41e-443e-b96e-e38929f077d4\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.782636 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddfeeb53-dd69-430f-9460-fa20627d4d26-scripts\") pod \"ovn-controller-4g569\" (UID: \"ddfeeb53-dd69-430f-9460-fa20627d4d26\") " pod="openstack/ovn-controller-4g569"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.784228 4918 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.784260 4918 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-36c51053-c4d5-4f97-9672-566b48887280\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-36c51053-c4d5-4f97-9672-566b48887280\") pod \"ovsdbserver-nb-0\" (UID: \"5e1232f6-b41e-443e-b96e-e38929f077d4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b5cb4eb5b50ea4f80b76061539e592ca8ce6d733a995ddae237890d6cf5b2887/globalmount\"" pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.785978 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddfeeb53-dd69-430f-9460-fa20627d4d26-ovn-controller-tls-certs\") pod \"ovn-controller-4g569\" (UID: \"ddfeeb53-dd69-430f-9460-fa20627d4d26\") " pod="openstack/ovn-controller-4g569"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.787026 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e1232f6-b41e-443e-b96e-e38929f077d4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5e1232f6-b41e-443e-b96e-e38929f077d4\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.788669 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e1232f6-b41e-443e-b96e-e38929f077d4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"5e1232f6-b41e-443e-b96e-e38929f077d4\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.788976 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddfeeb53-dd69-430f-9460-fa20627d4d26-combined-ca-bundle\") pod \"ovn-controller-4g569\" (UID: \"ddfeeb53-dd69-430f-9460-fa20627d4d26\") " pod="openstack/ovn-controller-4g569"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.792215 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e1232f6-b41e-443e-b96e-e38929f077d4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5e1232f6-b41e-443e-b96e-e38929f077d4\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.800237 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhkr6\" (UniqueName: \"kubernetes.io/projected/5e1232f6-b41e-443e-b96e-e38929f077d4-kube-api-access-qhkr6\") pod \"ovsdbserver-nb-0\" (UID: \"5e1232f6-b41e-443e-b96e-e38929f077d4\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.802258 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfnpr\" (UniqueName: \"kubernetes.io/projected/ddfeeb53-dd69-430f-9460-fa20627d4d26-kube-api-access-kfnpr\") pod \"ovn-controller-4g569\" (UID: \"ddfeeb53-dd69-430f-9460-fa20627d4d26\") " pod="openstack/ovn-controller-4g569"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.819432 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-36c51053-c4d5-4f97-9672-566b48887280\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-36c51053-c4d5-4f97-9672-566b48887280\") pod \"ovsdbserver-nb-0\" (UID: \"5e1232f6-b41e-443e-b96e-e38929f077d4\") " pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.865167 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.883354 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/13525212-7d91-453f-a80d-2e6a8febb21e-var-log\") pod \"ovn-controller-ovs-kt2zs\" (UID: \"13525212-7d91-453f-a80d-2e6a8febb21e\") " pod="openstack/ovn-controller-ovs-kt2zs"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.883408 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/13525212-7d91-453f-a80d-2e6a8febb21e-var-run\") pod \"ovn-controller-ovs-kt2zs\" (UID: \"13525212-7d91-453f-a80d-2e6a8febb21e\") " pod="openstack/ovn-controller-ovs-kt2zs"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.883429 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/13525212-7d91-453f-a80d-2e6a8febb21e-var-lib\") pod \"ovn-controller-ovs-kt2zs\" (UID: \"13525212-7d91-453f-a80d-2e6a8febb21e\") " pod="openstack/ovn-controller-ovs-kt2zs"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.883478 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stgv6\" (UniqueName: \"kubernetes.io/projected/13525212-7d91-453f-a80d-2e6a8febb21e-kube-api-access-stgv6\") pod \"ovn-controller-ovs-kt2zs\" (UID: \"13525212-7d91-453f-a80d-2e6a8febb21e\") " pod="openstack/ovn-controller-ovs-kt2zs"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.883548 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/13525212-7d91-453f-a80d-2e6a8febb21e-etc-ovs\") pod \"ovn-controller-ovs-kt2zs\" (UID: \"13525212-7d91-453f-a80d-2e6a8febb21e\") " pod="openstack/ovn-controller-ovs-kt2zs"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.883615 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13525212-7d91-453f-a80d-2e6a8febb21e-scripts\") pod \"ovn-controller-ovs-kt2zs\" (UID: \"13525212-7d91-453f-a80d-2e6a8febb21e\") " pod="openstack/ovn-controller-ovs-kt2zs"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.883679 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/13525212-7d91-453f-a80d-2e6a8febb21e-var-log\") pod \"ovn-controller-ovs-kt2zs\" (UID: \"13525212-7d91-453f-a80d-2e6a8febb21e\") " pod="openstack/ovn-controller-ovs-kt2zs"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.883715 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/13525212-7d91-453f-a80d-2e6a8febb21e-var-lib\") pod \"ovn-controller-ovs-kt2zs\" (UID: \"13525212-7d91-453f-a80d-2e6a8febb21e\") " pod="openstack/ovn-controller-ovs-kt2zs"
Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.883791 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/13525212-7d91-453f-a80d-2e6a8febb21e-etc-ovs\") pod \"ovn-controller-ovs-kt2zs\" (UID: \"13525212-7d91-453f-a80d-2e6a8febb21e\") "
pod="openstack/ovn-controller-ovs-kt2zs" Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.883812 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/13525212-7d91-453f-a80d-2e6a8febb21e-var-run\") pod \"ovn-controller-ovs-kt2zs\" (UID: \"13525212-7d91-453f-a80d-2e6a8febb21e\") " pod="openstack/ovn-controller-ovs-kt2zs" Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.886503 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13525212-7d91-453f-a80d-2e6a8febb21e-scripts\") pod \"ovn-controller-ovs-kt2zs\" (UID: \"13525212-7d91-453f-a80d-2e6a8febb21e\") " pod="openstack/ovn-controller-ovs-kt2zs" Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.901025 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4g569" Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.907178 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stgv6\" (UniqueName: \"kubernetes.io/projected/13525212-7d91-453f-a80d-2e6a8febb21e-kube-api-access-stgv6\") pod \"ovn-controller-ovs-kt2zs\" (UID: \"13525212-7d91-453f-a80d-2e6a8febb21e\") " pod="openstack/ovn-controller-ovs-kt2zs" Mar 19 16:58:26 crc kubenswrapper[4918]: I0319 16:58:26.951029 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-kt2zs" Mar 19 16:58:29 crc kubenswrapper[4918]: I0319 16:58:29.999386 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4"] Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.000740 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.004016 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca-bundle" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.004027 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-dockercfg-8xdck" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.004099 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-http" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.004258 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-config" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.005832 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-grpc" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.021662 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4"] Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.138937 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/6c3e0b77-c556-4efa-91ba-b27926b39aa8-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4\" (UID: \"6c3e0b77-c556-4efa-91ba-b27926b39aa8\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.139010 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c3e0b77-c556-4efa-91ba-b27926b39aa8-cloudkitty-lokistack-ca-bundle\") pod 
\"cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4\" (UID: \"6c3e0b77-c556-4efa-91ba-b27926b39aa8\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.139043 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmqf8\" (UniqueName: \"kubernetes.io/projected/6c3e0b77-c556-4efa-91ba-b27926b39aa8-kube-api-access-nmqf8\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4\" (UID: \"6c3e0b77-c556-4efa-91ba-b27926b39aa8\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.139065 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c3e0b77-c556-4efa-91ba-b27926b39aa8-config\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4\" (UID: \"6c3e0b77-c556-4efa-91ba-b27926b39aa8\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.139093 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/6c3e0b77-c556-4efa-91ba-b27926b39aa8-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4\" (UID: \"6c3e0b77-c556-4efa-91ba-b27926b39aa8\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.182540 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn"] Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.183874 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.186015 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-loki-s3" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.192644 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-http" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.192831 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-grpc" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.203411 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn"] Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.245319 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/6c3e0b77-c556-4efa-91ba-b27926b39aa8-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4\" (UID: \"6c3e0b77-c556-4efa-91ba-b27926b39aa8\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.245449 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/6c3e0b77-c556-4efa-91ba-b27926b39aa8-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4\" (UID: \"6c3e0b77-c556-4efa-91ba-b27926b39aa8\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.245484 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6c3e0b77-c556-4efa-91ba-b27926b39aa8-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4\" (UID: \"6c3e0b77-c556-4efa-91ba-b27926b39aa8\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.245502 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmqf8\" (UniqueName: \"kubernetes.io/projected/6c3e0b77-c556-4efa-91ba-b27926b39aa8-kube-api-access-nmqf8\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4\" (UID: \"6c3e0b77-c556-4efa-91ba-b27926b39aa8\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.245537 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c3e0b77-c556-4efa-91ba-b27926b39aa8-config\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4\" (UID: \"6c3e0b77-c556-4efa-91ba-b27926b39aa8\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.246424 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c3e0b77-c556-4efa-91ba-b27926b39aa8-config\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4\" (UID: \"6c3e0b77-c556-4efa-91ba-b27926b39aa8\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.248136 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c3e0b77-c556-4efa-91ba-b27926b39aa8-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4\" (UID: \"6c3e0b77-c556-4efa-91ba-b27926b39aa8\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4" Mar 19 
16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.265177 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/6c3e0b77-c556-4efa-91ba-b27926b39aa8-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4\" (UID: \"6c3e0b77-c556-4efa-91ba-b27926b39aa8\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.266481 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/6c3e0b77-c556-4efa-91ba-b27926b39aa8-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4\" (UID: \"6c3e0b77-c556-4efa-91ba-b27926b39aa8\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.273912 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmqf8\" (UniqueName: \"kubernetes.io/projected/6c3e0b77-c556-4efa-91ba-b27926b39aa8-kube-api-access-nmqf8\") pod \"cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4\" (UID: \"6c3e0b77-c556-4efa-91ba-b27926b39aa8\") " pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.275575 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-6f54889599-ljlbj"] Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.277304 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-ljlbj" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.283110 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-http" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.283366 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-grpc" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.306250 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-6f54889599-ljlbj"] Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.324381 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.349547 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/35defcbc-2979-46e0-8f03-e1cc89f7fd86-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-ljlbj\" (UID: \"35defcbc-2979-46e0-8f03-e1cc89f7fd86\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-ljlbj" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.350895 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlbld\" (UniqueName: \"kubernetes.io/projected/2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089-kube-api-access-qlbld\") pod \"cloudkitty-lokistack-querier-668f98fdd7-xzqfn\" (UID: \"2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.350941 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35defcbc-2979-46e0-8f03-e1cc89f7fd86-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-ljlbj\" (UID: \"35defcbc-2979-46e0-8f03-e1cc89f7fd86\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-ljlbj" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.350964 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089-config\") pod \"cloudkitty-lokistack-querier-668f98fdd7-xzqfn\" (UID: \"2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.350983 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-668f98fdd7-xzqfn\" (UID: \"2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.351015 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-668f98fdd7-xzqfn\" (UID: \"2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.351059 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: 
\"kubernetes.io/secret/2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-668f98fdd7-xzqfn\" (UID: \"2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.351091 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njm62\" (UniqueName: \"kubernetes.io/projected/35defcbc-2979-46e0-8f03-e1cc89f7fd86-kube-api-access-njm62\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-ljlbj\" (UID: \"35defcbc-2979-46e0-8f03-e1cc89f7fd86\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-ljlbj" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.351109 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-668f98fdd7-xzqfn\" (UID: \"2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.351131 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/35defcbc-2979-46e0-8f03-e1cc89f7fd86-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-ljlbj\" (UID: \"35defcbc-2979-46e0-8f03-e1cc89f7fd86\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-ljlbj" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.351150 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35defcbc-2979-46e0-8f03-e1cc89f7fd86-config\") pod 
\"cloudkitty-lokistack-query-frontend-6f54889599-ljlbj\" (UID: \"35defcbc-2979-46e0-8f03-e1cc89f7fd86\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-ljlbj" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.443006 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d"] Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.444027 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.452043 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-http" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.452314 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.452422 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway-ca-bundle" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.452629 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.452642 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.452756 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-client-http" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.454249 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njm62\" (UniqueName: \"kubernetes.io/projected/35defcbc-2979-46e0-8f03-e1cc89f7fd86-kube-api-access-njm62\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-ljlbj\" (UID: 
\"35defcbc-2979-46e0-8f03-e1cc89f7fd86\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-ljlbj" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.454282 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-668f98fdd7-xzqfn\" (UID: \"2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.454312 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/35defcbc-2979-46e0-8f03-e1cc89f7fd86-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-ljlbj\" (UID: \"35defcbc-2979-46e0-8f03-e1cc89f7fd86\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-ljlbj" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.454341 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35defcbc-2979-46e0-8f03-e1cc89f7fd86-config\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-ljlbj\" (UID: \"35defcbc-2979-46e0-8f03-e1cc89f7fd86\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-ljlbj" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.454393 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/35defcbc-2979-46e0-8f03-e1cc89f7fd86-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-ljlbj\" (UID: \"35defcbc-2979-46e0-8f03-e1cc89f7fd86\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-ljlbj" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.454420 
4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlbld\" (UniqueName: \"kubernetes.io/projected/2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089-kube-api-access-qlbld\") pod \"cloudkitty-lokistack-querier-668f98fdd7-xzqfn\" (UID: \"2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.454459 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35defcbc-2979-46e0-8f03-e1cc89f7fd86-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-ljlbj\" (UID: \"35defcbc-2979-46e0-8f03-e1cc89f7fd86\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-ljlbj" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.454481 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089-config\") pod \"cloudkitty-lokistack-querier-668f98fdd7-xzqfn\" (UID: \"2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.454496 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-668f98fdd7-xzqfn\" (UID: \"2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.454551 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089-cloudkitty-lokistack-querier-grpc\") 
pod \"cloudkitty-lokistack-querier-668f98fdd7-xzqfn\" (UID: \"2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.454601 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-668f98fdd7-xzqfn\" (UID: \"2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.456350 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089-config\") pod \"cloudkitty-lokistack-querier-668f98fdd7-xzqfn\" (UID: \"2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.456406 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35defcbc-2979-46e0-8f03-e1cc89f7fd86-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-ljlbj\" (UID: \"35defcbc-2979-46e0-8f03-e1cc89f7fd86\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-ljlbj" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.456694 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-668f98fdd7-xzqfn\" (UID: \"2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.456689 4918 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35defcbc-2979-46e0-8f03-e1cc89f7fd86-config\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-ljlbj\" (UID: \"35defcbc-2979-46e0-8f03-e1cc89f7fd86\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-ljlbj" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.463022 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-668f98fdd7-xzqfn\" (UID: \"2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.466689 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48"] Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.468315 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.477231 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-668f98fdd7-xzqfn\" (UID: \"2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.483285 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-668f98fdd7-xzqfn\" (UID: \"2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.484267 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/35defcbc-2979-46e0-8f03-e1cc89f7fd86-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-ljlbj\" (UID: \"35defcbc-2979-46e0-8f03-e1cc89f7fd86\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-ljlbj" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.484746 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-dockercfg-vrl8k" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.488648 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d"] Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.489840 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njm62\" (UniqueName: 
\"kubernetes.io/projected/35defcbc-2979-46e0-8f03-e1cc89f7fd86-kube-api-access-njm62\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-ljlbj\" (UID: \"35defcbc-2979-46e0-8f03-e1cc89f7fd86\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-ljlbj" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.490143 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlbld\" (UniqueName: \"kubernetes.io/projected/2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089-kube-api-access-qlbld\") pod \"cloudkitty-lokistack-querier-668f98fdd7-xzqfn\" (UID: \"2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089\") " pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.491310 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/35defcbc-2979-46e0-8f03-e1cc89f7fd86-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-6f54889599-ljlbj\" (UID: \"35defcbc-2979-46e0-8f03-e1cc89f7fd86\") " pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-ljlbj" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.510154 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.532503 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48"] Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.555340 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdeccb80-0736-4fb2-b8e9-17a7317865cb-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-55g7d\" (UID: \"fdeccb80-0736-4fb2-b8e9-17a7317865cb\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.555388 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e4b521b-2c5e-466f-8c30-881de9b09a1b-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-cmv48\" (UID: \"6e4b521b-2c5e-466f-8c30-881de9b09a1b\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.555437 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/6e4b521b-2c5e-466f-8c30-881de9b09a1b-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-cmv48\" (UID: \"6e4b521b-2c5e-466f-8c30-881de9b09a1b\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.555465 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/fdeccb80-0736-4fb2-b8e9-17a7317865cb-cloudkitty-lokistack-gateway-client-http\") pod 
\"cloudkitty-lokistack-gateway-6b884dc4b5-55g7d\" (UID: \"fdeccb80-0736-4fb2-b8e9-17a7317865cb\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.555486 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e4b521b-2c5e-466f-8c30-881de9b09a1b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-cmv48\" (UID: \"6e4b521b-2c5e-466f-8c30-881de9b09a1b\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.555504 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/6e4b521b-2c5e-466f-8c30-881de9b09a1b-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-cmv48\" (UID: \"6e4b521b-2c5e-466f-8c30-881de9b09a1b\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.555536 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/fdeccb80-0736-4fb2-b8e9-17a7317865cb-tenants\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-55g7d\" (UID: \"fdeccb80-0736-4fb2-b8e9-17a7317865cb\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.555572 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/6e4b521b-2c5e-466f-8c30-881de9b09a1b-tenants\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-cmv48\" (UID: \"6e4b521b-2c5e-466f-8c30-881de9b09a1b\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:30 crc 
kubenswrapper[4918]: I0319 16:58:30.555650 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/6e4b521b-2c5e-466f-8c30-881de9b09a1b-rbac\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-cmv48\" (UID: \"6e4b521b-2c5e-466f-8c30-881de9b09a1b\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.555680 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/fdeccb80-0736-4fb2-b8e9-17a7317865cb-tls-secret\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-55g7d\" (UID: \"fdeccb80-0736-4fb2-b8e9-17a7317865cb\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.555708 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e4b521b-2c5e-466f-8c30-881de9b09a1b-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-cmv48\" (UID: \"6e4b521b-2c5e-466f-8c30-881de9b09a1b\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.555735 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/fdeccb80-0736-4fb2-b8e9-17a7317865cb-rbac\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-55g7d\" (UID: \"fdeccb80-0736-4fb2-b8e9-17a7317865cb\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.555815 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/6e4b521b-2c5e-466f-8c30-881de9b09a1b-tls-secret\") pod 
\"cloudkitty-lokistack-gateway-6b884dc4b5-cmv48\" (UID: \"6e4b521b-2c5e-466f-8c30-881de9b09a1b\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.555852 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdeccb80-0736-4fb2-b8e9-17a7317865cb-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-55g7d\" (UID: \"fdeccb80-0736-4fb2-b8e9-17a7317865cb\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.555877 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqvx6\" (UniqueName: \"kubernetes.io/projected/6e4b521b-2c5e-466f-8c30-881de9b09a1b-kube-api-access-hqvx6\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-cmv48\" (UID: \"6e4b521b-2c5e-466f-8c30-881de9b09a1b\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.555901 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdeccb80-0736-4fb2-b8e9-17a7317865cb-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-55g7d\" (UID: \"fdeccb80-0736-4fb2-b8e9-17a7317865cb\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.555928 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stxhv\" (UniqueName: \"kubernetes.io/projected/fdeccb80-0736-4fb2-b8e9-17a7317865cb-kube-api-access-stxhv\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-55g7d\" (UID: \"fdeccb80-0736-4fb2-b8e9-17a7317865cb\") " 
pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.555965 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/fdeccb80-0736-4fb2-b8e9-17a7317865cb-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-55g7d\" (UID: \"fdeccb80-0736-4fb2-b8e9-17a7317865cb\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.657156 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/6e4b521b-2c5e-466f-8c30-881de9b09a1b-tls-secret\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-cmv48\" (UID: \"6e4b521b-2c5e-466f-8c30-881de9b09a1b\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.657235 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdeccb80-0736-4fb2-b8e9-17a7317865cb-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-55g7d\" (UID: \"fdeccb80-0736-4fb2-b8e9-17a7317865cb\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.657263 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqvx6\" (UniqueName: \"kubernetes.io/projected/6e4b521b-2c5e-466f-8c30-881de9b09a1b-kube-api-access-hqvx6\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-cmv48\" (UID: \"6e4b521b-2c5e-466f-8c30-881de9b09a1b\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.657293 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdeccb80-0736-4fb2-b8e9-17a7317865cb-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-55g7d\" (UID: \"fdeccb80-0736-4fb2-b8e9-17a7317865cb\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.657318 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stxhv\" (UniqueName: \"kubernetes.io/projected/fdeccb80-0736-4fb2-b8e9-17a7317865cb-kube-api-access-stxhv\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-55g7d\" (UID: \"fdeccb80-0736-4fb2-b8e9-17a7317865cb\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:30 crc kubenswrapper[4918]: E0319 16:58:30.657349 4918 secret.go:188] Couldn't get secret openstack/cloudkitty-lokistack-gateway-http: secret "cloudkitty-lokistack-gateway-http" not found Mar 19 16:58:30 crc kubenswrapper[4918]: E0319 16:58:30.657429 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e4b521b-2c5e-466f-8c30-881de9b09a1b-tls-secret podName:6e4b521b-2c5e-466f-8c30-881de9b09a1b nodeName:}" failed. No retries permitted until 2026-03-19 16:58:31.157409097 +0000 UTC m=+1123.279608345 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/6e4b521b-2c5e-466f-8c30-881de9b09a1b-tls-secret") pod "cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" (UID: "6e4b521b-2c5e-466f-8c30-881de9b09a1b") : secret "cloudkitty-lokistack-gateway-http" not found Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.657359 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/fdeccb80-0736-4fb2-b8e9-17a7317865cb-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-55g7d\" (UID: \"fdeccb80-0736-4fb2-b8e9-17a7317865cb\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.657690 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdeccb80-0736-4fb2-b8e9-17a7317865cb-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-55g7d\" (UID: \"fdeccb80-0736-4fb2-b8e9-17a7317865cb\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.657716 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e4b521b-2c5e-466f-8c30-881de9b09a1b-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-cmv48\" (UID: \"6e4b521b-2c5e-466f-8c30-881de9b09a1b\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.657744 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/6e4b521b-2c5e-466f-8c30-881de9b09a1b-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-cmv48\" (UID: \"6e4b521b-2c5e-466f-8c30-881de9b09a1b\") " 
pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.657772 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/fdeccb80-0736-4fb2-b8e9-17a7317865cb-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-55g7d\" (UID: \"fdeccb80-0736-4fb2-b8e9-17a7317865cb\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.657800 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e4b521b-2c5e-466f-8c30-881de9b09a1b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-cmv48\" (UID: \"6e4b521b-2c5e-466f-8c30-881de9b09a1b\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.657823 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/6e4b521b-2c5e-466f-8c30-881de9b09a1b-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-cmv48\" (UID: \"6e4b521b-2c5e-466f-8c30-881de9b09a1b\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.657849 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/fdeccb80-0736-4fb2-b8e9-17a7317865cb-tenants\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-55g7d\" (UID: \"fdeccb80-0736-4fb2-b8e9-17a7317865cb\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.657934 4918 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/6e4b521b-2c5e-466f-8c30-881de9b09a1b-tenants\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-cmv48\" (UID: \"6e4b521b-2c5e-466f-8c30-881de9b09a1b\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.658001 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/6e4b521b-2c5e-466f-8c30-881de9b09a1b-rbac\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-cmv48\" (UID: \"6e4b521b-2c5e-466f-8c30-881de9b09a1b\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.658032 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/fdeccb80-0736-4fb2-b8e9-17a7317865cb-tls-secret\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-55g7d\" (UID: \"fdeccb80-0736-4fb2-b8e9-17a7317865cb\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.658067 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e4b521b-2c5e-466f-8c30-881de9b09a1b-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-cmv48\" (UID: \"6e4b521b-2c5e-466f-8c30-881de9b09a1b\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.658107 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/fdeccb80-0736-4fb2-b8e9-17a7317865cb-rbac\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-55g7d\" (UID: \"fdeccb80-0736-4fb2-b8e9-17a7317865cb\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.658325 4918 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/fdeccb80-0736-4fb2-b8e9-17a7317865cb-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-55g7d\" (UID: \"fdeccb80-0736-4fb2-b8e9-17a7317865cb\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:30 crc kubenswrapper[4918]: E0319 16:58:30.658499 4918 secret.go:188] Couldn't get secret openstack/cloudkitty-lokistack-gateway-http: secret "cloudkitty-lokistack-gateway-http" not found Mar 19 16:58:30 crc kubenswrapper[4918]: E0319 16:58:30.658598 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fdeccb80-0736-4fb2-b8e9-17a7317865cb-tls-secret podName:fdeccb80-0736-4fb2-b8e9-17a7317865cb nodeName:}" failed. No retries permitted until 2026-03-19 16:58:31.15857803 +0000 UTC m=+1123.280777278 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/fdeccb80-0736-4fb2-b8e9-17a7317865cb-tls-secret") pod "cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" (UID: "fdeccb80-0736-4fb2-b8e9-17a7317865cb") : secret "cloudkitty-lokistack-gateway-http" not found Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.659084 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e4b521b-2c5e-466f-8c30-881de9b09a1b-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-cmv48\" (UID: \"6e4b521b-2c5e-466f-8c30-881de9b09a1b\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.659257 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/6e4b521b-2c5e-466f-8c30-881de9b09a1b-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-cmv48\" (UID: 
\"6e4b521b-2c5e-466f-8c30-881de9b09a1b\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.659883 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdeccb80-0736-4fb2-b8e9-17a7317865cb-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-55g7d\" (UID: \"fdeccb80-0736-4fb2-b8e9-17a7317865cb\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.660264 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/fdeccb80-0736-4fb2-b8e9-17a7317865cb-rbac\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-55g7d\" (UID: \"fdeccb80-0736-4fb2-b8e9-17a7317865cb\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.660487 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdeccb80-0736-4fb2-b8e9-17a7317865cb-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-55g7d\" (UID: \"fdeccb80-0736-4fb2-b8e9-17a7317865cb\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.661229 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdeccb80-0736-4fb2-b8e9-17a7317865cb-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-55g7d\" (UID: \"fdeccb80-0736-4fb2-b8e9-17a7317865cb\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.661921 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/fdeccb80-0736-4fb2-b8e9-17a7317865cb-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-55g7d\" (UID: \"fdeccb80-0736-4fb2-b8e9-17a7317865cb\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.663026 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e4b521b-2c5e-466f-8c30-881de9b09a1b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-cmv48\" (UID: \"6e4b521b-2c5e-466f-8c30-881de9b09a1b\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.663167 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/6e4b521b-2c5e-466f-8c30-881de9b09a1b-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-cmv48\" (UID: \"6e4b521b-2c5e-466f-8c30-881de9b09a1b\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.663212 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/fdeccb80-0736-4fb2-b8e9-17a7317865cb-tenants\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-55g7d\" (UID: \"fdeccb80-0736-4fb2-b8e9-17a7317865cb\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.663210 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e4b521b-2c5e-466f-8c30-881de9b09a1b-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-cmv48\" (UID: \"6e4b521b-2c5e-466f-8c30-881de9b09a1b\") 
" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.664882 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-ljlbj" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.670052 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/6e4b521b-2c5e-466f-8c30-881de9b09a1b-tenants\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-cmv48\" (UID: \"6e4b521b-2c5e-466f-8c30-881de9b09a1b\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.670460 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/6e4b521b-2c5e-466f-8c30-881de9b09a1b-rbac\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-cmv48\" (UID: \"6e4b521b-2c5e-466f-8c30-881de9b09a1b\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.686173 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stxhv\" (UniqueName: \"kubernetes.io/projected/fdeccb80-0736-4fb2-b8e9-17a7317865cb-kube-api-access-stxhv\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-55g7d\" (UID: \"fdeccb80-0736-4fb2-b8e9-17a7317865cb\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:30 crc kubenswrapper[4918]: I0319 16:58:30.687921 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqvx6\" (UniqueName: \"kubernetes.io/projected/6e4b521b-2c5e-466f-8c30-881de9b09a1b-kube-api-access-hqvx6\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-cmv48\" (UID: \"6e4b521b-2c5e-466f-8c30-881de9b09a1b\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.164127 4918 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.165558 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/fdeccb80-0736-4fb2-b8e9-17a7317865cb-tls-secret\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-55g7d\" (UID: \"fdeccb80-0736-4fb2-b8e9-17a7317865cb\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.165654 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/6e4b521b-2c5e-466f-8c30-881de9b09a1b-tls-secret\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-cmv48\" (UID: \"6e4b521b-2c5e-466f-8c30-881de9b09a1b\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.165664 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.168281 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-grpc" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.168688 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-http" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.172724 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/6e4b521b-2c5e-466f-8c30-881de9b09a1b-tls-secret\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-cmv48\" (UID: \"6e4b521b-2c5e-466f-8c30-881de9b09a1b\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.172928 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/fdeccb80-0736-4fb2-b8e9-17a7317865cb-tls-secret\") pod \"cloudkitty-lokistack-gateway-6b884dc4b5-55g7d\" (UID: \"fdeccb80-0736-4fb2-b8e9-17a7317865cb\") " pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.181586 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.248693 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.250107 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.253322 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-http" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.253589 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-grpc" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.255093 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.269389 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/142e9778-542e-491b-95f2-8a63e76c4271-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"142e9778-542e-491b-95f2-8a63e76c4271\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.269440 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/142e9778-542e-491b-95f2-8a63e76c4271-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"142e9778-542e-491b-95f2-8a63e76c4271\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.269491 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"142e9778-542e-491b-95f2-8a63e76c4271\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.269510 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"142e9778-542e-491b-95f2-8a63e76c4271\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.269580 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/142e9778-542e-491b-95f2-8a63e76c4271-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"142e9778-542e-491b-95f2-8a63e76c4271\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.269610 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f2r5\" (UniqueName: \"kubernetes.io/projected/142e9778-542e-491b-95f2-8a63e76c4271-kube-api-access-2f2r5\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"142e9778-542e-491b-95f2-8a63e76c4271\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.269627 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/142e9778-542e-491b-95f2-8a63e76c4271-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"142e9778-542e-491b-95f2-8a63e76c4271\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.269702 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/142e9778-542e-491b-95f2-8a63e76c4271-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"142e9778-542e-491b-95f2-8a63e76c4271\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 16:58:31 crc 
kubenswrapper[4918]: I0319 16:58:31.355732 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.357210 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.359670 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-http" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.359785 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-grpc" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.363944 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.370853 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plr42\" (UniqueName: \"kubernetes.io/projected/70a17e2e-15ff-4992-882c-b626fc8b94b6-kube-api-access-plr42\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"70a17e2e-15ff-4992-882c-b626fc8b94b6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.370901 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"142e9778-542e-491b-95f2-8a63e76c4271\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.370920 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: 
\"142e9778-542e-491b-95f2-8a63e76c4271\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.370955 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/70a17e2e-15ff-4992-882c-b626fc8b94b6-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"70a17e2e-15ff-4992-882c-b626fc8b94b6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.370975 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/142e9778-542e-491b-95f2-8a63e76c4271-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"142e9778-542e-491b-95f2-8a63e76c4271\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.371003 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f2r5\" (UniqueName: \"kubernetes.io/projected/142e9778-542e-491b-95f2-8a63e76c4271-kube-api-access-2f2r5\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"142e9778-542e-491b-95f2-8a63e76c4271\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.371020 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/142e9778-542e-491b-95f2-8a63e76c4271-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"142e9778-542e-491b-95f2-8a63e76c4271\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.371058 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"70a17e2e-15ff-4992-882c-b626fc8b94b6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.373803 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/70a17e2e-15ff-4992-882c-b626fc8b94b6-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"70a17e2e-15ff-4992-882c-b626fc8b94b6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.373811 4918 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"142e9778-542e-491b-95f2-8a63e76c4271\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.374014 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70a17e2e-15ff-4992-882c-b626fc8b94b6-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"70a17e2e-15ff-4992-882c-b626fc8b94b6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.374455 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/142e9778-542e-491b-95f2-8a63e76c4271-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"142e9778-542e-491b-95f2-8a63e76c4271\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 16:58:31 crc 
kubenswrapper[4918]: I0319 16:58:31.374639 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/142e9778-542e-491b-95f2-8a63e76c4271-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"142e9778-542e-491b-95f2-8a63e76c4271\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.374752 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/142e9778-542e-491b-95f2-8a63e76c4271-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"142e9778-542e-491b-95f2-8a63e76c4271\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.374854 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70a17e2e-15ff-4992-882c-b626fc8b94b6-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"70a17e2e-15ff-4992-882c-b626fc8b94b6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.374950 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/70a17e2e-15ff-4992-882c-b626fc8b94b6-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"70a17e2e-15ff-4992-882c-b626fc8b94b6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.374659 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/142e9778-542e-491b-95f2-8a63e76c4271-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"142e9778-542e-491b-95f2-8a63e76c4271\") " pod="openstack/cloudkitty-lokistack-ingester-0" 
Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.375300 4918 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"142e9778-542e-491b-95f2-8a63e76c4271\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.375959 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/142e9778-542e-491b-95f2-8a63e76c4271-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"142e9778-542e-491b-95f2-8a63e76c4271\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.385750 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/142e9778-542e-491b-95f2-8a63e76c4271-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"142e9778-542e-491b-95f2-8a63e76c4271\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.386618 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/142e9778-542e-491b-95f2-8a63e76c4271-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"142e9778-542e-491b-95f2-8a63e76c4271\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.393221 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/142e9778-542e-491b-95f2-8a63e76c4271-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"142e9778-542e-491b-95f2-8a63e76c4271\") " 
pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.400709 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f2r5\" (UniqueName: \"kubernetes.io/projected/142e9778-542e-491b-95f2-8a63e76c4271-kube-api-access-2f2r5\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"142e9778-542e-491b-95f2-8a63e76c4271\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.413324 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"142e9778-542e-491b-95f2-8a63e76c4271\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.424308 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"142e9778-542e-491b-95f2-8a63e76c4271\") " pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.455063 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.465786 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.476942 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plr42\" (UniqueName: \"kubernetes.io/projected/70a17e2e-15ff-4992-882c-b626fc8b94b6-kube-api-access-plr42\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"70a17e2e-15ff-4992-882c-b626fc8b94b6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.476997 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d2b0346-1ed3-4754-9788-e4f469a558e9-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2d2b0346-1ed3-4754-9788-e4f469a558e9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.477016 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d2b0346-1ed3-4754-9788-e4f469a558e9-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2d2b0346-1ed3-4754-9788-e4f469a558e9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.477054 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/70a17e2e-15ff-4992-882c-b626fc8b94b6-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"70a17e2e-15ff-4992-882c-b626fc8b94b6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.477166 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: 
\"kubernetes.io/secret/2d2b0346-1ed3-4754-9788-e4f469a558e9-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2d2b0346-1ed3-4754-9788-e4f469a558e9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.477314 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh9fr\" (UniqueName: \"kubernetes.io/projected/2d2b0346-1ed3-4754-9788-e4f469a558e9-kube-api-access-dh9fr\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2d2b0346-1ed3-4754-9788-e4f469a558e9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.477392 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"70a17e2e-15ff-4992-882c-b626fc8b94b6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.477501 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/70a17e2e-15ff-4992-882c-b626fc8b94b6-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"70a17e2e-15ff-4992-882c-b626fc8b94b6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.477573 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2d2b0346-1ed3-4754-9788-e4f469a558e9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.477625 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70a17e2e-15ff-4992-882c-b626fc8b94b6-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"70a17e2e-15ff-4992-882c-b626fc8b94b6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.477685 4918 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"70a17e2e-15ff-4992-882c-b626fc8b94b6\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.477712 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/2d2b0346-1ed3-4754-9788-e4f469a558e9-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2d2b0346-1ed3-4754-9788-e4f469a558e9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.477760 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/2d2b0346-1ed3-4754-9788-e4f469a558e9-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2d2b0346-1ed3-4754-9788-e4f469a558e9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.477818 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70a17e2e-15ff-4992-882c-b626fc8b94b6-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: 
\"70a17e2e-15ff-4992-882c-b626fc8b94b6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.477839 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/70a17e2e-15ff-4992-882c-b626fc8b94b6-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"70a17e2e-15ff-4992-882c-b626fc8b94b6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.479918 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70a17e2e-15ff-4992-882c-b626fc8b94b6-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"70a17e2e-15ff-4992-882c-b626fc8b94b6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.480104 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70a17e2e-15ff-4992-882c-b626fc8b94b6-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"70a17e2e-15ff-4992-882c-b626fc8b94b6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.483591 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/70a17e2e-15ff-4992-882c-b626fc8b94b6-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"70a17e2e-15ff-4992-882c-b626fc8b94b6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.483953 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/70a17e2e-15ff-4992-882c-b626fc8b94b6-cloudkitty-loki-s3\") pod 
\"cloudkitty-lokistack-compactor-0\" (UID: \"70a17e2e-15ff-4992-882c-b626fc8b94b6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.484893 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/70a17e2e-15ff-4992-882c-b626fc8b94b6-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"70a17e2e-15ff-4992-882c-b626fc8b94b6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.488786 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.504226 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plr42\" (UniqueName: \"kubernetes.io/projected/70a17e2e-15ff-4992-882c-b626fc8b94b6-kube-api-access-plr42\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"70a17e2e-15ff-4992-882c-b626fc8b94b6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.525681 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"70a17e2e-15ff-4992-882c-b626fc8b94b6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.579876 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/2d2b0346-1ed3-4754-9788-e4f469a558e9-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2d2b0346-1ed3-4754-9788-e4f469a558e9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 16:58:31 crc 
kubenswrapper[4918]: I0319 16:58:31.580229 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/2d2b0346-1ed3-4754-9788-e4f469a558e9-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2d2b0346-1ed3-4754-9788-e4f469a558e9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.580314 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d2b0346-1ed3-4754-9788-e4f469a558e9-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2d2b0346-1ed3-4754-9788-e4f469a558e9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.580344 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d2b0346-1ed3-4754-9788-e4f469a558e9-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2d2b0346-1ed3-4754-9788-e4f469a558e9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.580397 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/2d2b0346-1ed3-4754-9788-e4f469a558e9-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2d2b0346-1ed3-4754-9788-e4f469a558e9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.580452 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh9fr\" (UniqueName: \"kubernetes.io/projected/2d2b0346-1ed3-4754-9788-e4f469a558e9-kube-api-access-dh9fr\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: 
\"2d2b0346-1ed3-4754-9788-e4f469a558e9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.580545 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2d2b0346-1ed3-4754-9788-e4f469a558e9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.580704 4918 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2d2b0346-1ed3-4754-9788-e4f469a558e9\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.587234 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/2d2b0346-1ed3-4754-9788-e4f469a558e9-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2d2b0346-1ed3-4754-9788-e4f469a558e9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.587457 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d2b0346-1ed3-4754-9788-e4f469a558e9-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2d2b0346-1ed3-4754-9788-e4f469a558e9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.587827 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: 
\"kubernetes.io/secret/2d2b0346-1ed3-4754-9788-e4f469a558e9-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2d2b0346-1ed3-4754-9788-e4f469a558e9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.588969 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/2d2b0346-1ed3-4754-9788-e4f469a558e9-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2d2b0346-1ed3-4754-9788-e4f469a558e9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.595682 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d2b0346-1ed3-4754-9788-e4f469a558e9-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2d2b0346-1ed3-4754-9788-e4f469a558e9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.607572 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh9fr\" (UniqueName: \"kubernetes.io/projected/2d2b0346-1ed3-4754-9788-e4f469a558e9-kube-api-access-dh9fr\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2d2b0346-1ed3-4754-9788-e4f469a558e9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.609007 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2d2b0346-1ed3-4754-9788-e4f469a558e9\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.660620 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.691661 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.693618 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.696693 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.696948 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-vkfpl" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.697056 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.697158 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.706222 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.773237 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.783677 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6b3398-7f5f-4485-9826-fbb92f8f26e2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fc6b3398-7f5f-4485-9826-fbb92f8f26e2\") " pod="openstack/ovsdbserver-sb-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.783748 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc6b3398-7f5f-4485-9826-fbb92f8f26e2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"fc6b3398-7f5f-4485-9826-fbb92f8f26e2\") " pod="openstack/ovsdbserver-sb-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.783779 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-cb764fe3-ce1d-45f7-b854-755d567339dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cb764fe3-ce1d-45f7-b854-755d567339dc\") pod \"ovsdbserver-sb-0\" (UID: \"fc6b3398-7f5f-4485-9826-fbb92f8f26e2\") " pod="openstack/ovsdbserver-sb-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.783906 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc6b3398-7f5f-4485-9826-fbb92f8f26e2-config\") pod \"ovsdbserver-sb-0\" (UID: \"fc6b3398-7f5f-4485-9826-fbb92f8f26e2\") " pod="openstack/ovsdbserver-sb-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.783954 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6b3398-7f5f-4485-9826-fbb92f8f26e2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"fc6b3398-7f5f-4485-9826-fbb92f8f26e2\") " pod="openstack/ovsdbserver-sb-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.783974 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fc6b3398-7f5f-4485-9826-fbb92f8f26e2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"fc6b3398-7f5f-4485-9826-fbb92f8f26e2\") " pod="openstack/ovsdbserver-sb-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.784318 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc6b3398-7f5f-4485-9826-fbb92f8f26e2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"fc6b3398-7f5f-4485-9826-fbb92f8f26e2\") " pod="openstack/ovsdbserver-sb-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.784382 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s55tt\" (UniqueName: \"kubernetes.io/projected/fc6b3398-7f5f-4485-9826-fbb92f8f26e2-kube-api-access-s55tt\") pod \"ovsdbserver-sb-0\" (UID: \"fc6b3398-7f5f-4485-9826-fbb92f8f26e2\") " pod="openstack/ovsdbserver-sb-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.886261 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc6b3398-7f5f-4485-9826-fbb92f8f26e2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"fc6b3398-7f5f-4485-9826-fbb92f8f26e2\") " pod="openstack/ovsdbserver-sb-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.886327 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-cb764fe3-ce1d-45f7-b854-755d567339dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cb764fe3-ce1d-45f7-b854-755d567339dc\") pod \"ovsdbserver-sb-0\" (UID: \"fc6b3398-7f5f-4485-9826-fbb92f8f26e2\") " 
pod="openstack/ovsdbserver-sb-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.886371 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc6b3398-7f5f-4485-9826-fbb92f8f26e2-config\") pod \"ovsdbserver-sb-0\" (UID: \"fc6b3398-7f5f-4485-9826-fbb92f8f26e2\") " pod="openstack/ovsdbserver-sb-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.886436 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6b3398-7f5f-4485-9826-fbb92f8f26e2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fc6b3398-7f5f-4485-9826-fbb92f8f26e2\") " pod="openstack/ovsdbserver-sb-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.886463 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fc6b3398-7f5f-4485-9826-fbb92f8f26e2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"fc6b3398-7f5f-4485-9826-fbb92f8f26e2\") " pod="openstack/ovsdbserver-sb-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.886533 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc6b3398-7f5f-4485-9826-fbb92f8f26e2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"fc6b3398-7f5f-4485-9826-fbb92f8f26e2\") " pod="openstack/ovsdbserver-sb-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.886563 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s55tt\" (UniqueName: \"kubernetes.io/projected/fc6b3398-7f5f-4485-9826-fbb92f8f26e2-kube-api-access-s55tt\") pod \"ovsdbserver-sb-0\" (UID: \"fc6b3398-7f5f-4485-9826-fbb92f8f26e2\") " pod="openstack/ovsdbserver-sb-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.886622 4918 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6b3398-7f5f-4485-9826-fbb92f8f26e2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fc6b3398-7f5f-4485-9826-fbb92f8f26e2\") " pod="openstack/ovsdbserver-sb-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.887478 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc6b3398-7f5f-4485-9826-fbb92f8f26e2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"fc6b3398-7f5f-4485-9826-fbb92f8f26e2\") " pod="openstack/ovsdbserver-sb-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.887852 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fc6b3398-7f5f-4485-9826-fbb92f8f26e2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"fc6b3398-7f5f-4485-9826-fbb92f8f26e2\") " pod="openstack/ovsdbserver-sb-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.887965 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc6b3398-7f5f-4485-9826-fbb92f8f26e2-config\") pod \"ovsdbserver-sb-0\" (UID: \"fc6b3398-7f5f-4485-9826-fbb92f8f26e2\") " pod="openstack/ovsdbserver-sb-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.891770 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6b3398-7f5f-4485-9826-fbb92f8f26e2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fc6b3398-7f5f-4485-9826-fbb92f8f26e2\") " pod="openstack/ovsdbserver-sb-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.892386 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc6b3398-7f5f-4485-9826-fbb92f8f26e2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"fc6b3398-7f5f-4485-9826-fbb92f8f26e2\") " 
pod="openstack/ovsdbserver-sb-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.892442 4918 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.892438 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6b3398-7f5f-4485-9826-fbb92f8f26e2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fc6b3398-7f5f-4485-9826-fbb92f8f26e2\") " pod="openstack/ovsdbserver-sb-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.892467 4918 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-cb764fe3-ce1d-45f7-b854-755d567339dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cb764fe3-ce1d-45f7-b854-755d567339dc\") pod \"ovsdbserver-sb-0\" (UID: \"fc6b3398-7f5f-4485-9826-fbb92f8f26e2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9863dcb05424116be1df37b0acf24c9b2a3fd476b27b3636fa10da1d1a294191/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.913204 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s55tt\" (UniqueName: \"kubernetes.io/projected/fc6b3398-7f5f-4485-9826-fbb92f8f26e2-kube-api-access-s55tt\") pod \"ovsdbserver-sb-0\" (UID: \"fc6b3398-7f5f-4485-9826-fbb92f8f26e2\") " pod="openstack/ovsdbserver-sb-0" Mar 19 16:58:31 crc kubenswrapper[4918]: I0319 16:58:31.935705 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-cb764fe3-ce1d-45f7-b854-755d567339dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cb764fe3-ce1d-45f7-b854-755d567339dc\") pod \"ovsdbserver-sb-0\" (UID: \"fc6b3398-7f5f-4485-9826-fbb92f8f26e2\") " pod="openstack/ovsdbserver-sb-0" Mar 19 16:58:32 crc kubenswrapper[4918]: I0319 
16:58:32.025372 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 19 16:58:32 crc kubenswrapper[4918]: E0319 16:58:32.403153 4918 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 19 16:58:32 crc kubenswrapper[4918]: E0319 16:58:32.403320 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4vzfl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinux
Options:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-4ngxj_openstack(b8c2a8ee-9d90-46c1-907e-78b27718ac68): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 16:58:32 crc kubenswrapper[4918]: E0319 16:58:32.404597 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-4ngxj" podUID="b8c2a8ee-9d90-46c1-907e-78b27718ac68" Mar 19 16:58:32 crc kubenswrapper[4918]: E0319 16:58:32.605136 4918 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 19 16:58:32 crc kubenswrapper[4918]: E0319 16:58:32.605434 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4jbkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-9ktcs_openstack(c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 16:58:32 crc kubenswrapper[4918]: E0319 16:58:32.606846 4918 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-9ktcs" podUID="c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92" Mar 19 16:58:32 crc kubenswrapper[4918]: I0319 16:58:32.932978 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 19 16:58:32 crc kubenswrapper[4918]: I0319 16:58:32.985720 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 16:58:32 crc kubenswrapper[4918]: W0319 16:58:32.989447 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22f16181_1900_453e_a97a_d3da7960a1cf.slice/crio-2a9ae6b5ef63c29f8740c883ed5d4146d90904cfdd7ef7d38dfb2de4f899bb2a WatchSource:0}: Error finding container 2a9ae6b5ef63c29f8740c883ed5d4146d90904cfdd7ef7d38dfb2de4f899bb2a: Status 404 returned error can't find the container with id 2a9ae6b5ef63c29f8740c883ed5d4146d90904cfdd7ef7d38dfb2de4f899bb2a Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.017213 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-4ngxj" Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.156768 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8c2a8ee-9d90-46c1-907e-78b27718ac68-config\") pod \"b8c2a8ee-9d90-46c1-907e-78b27718ac68\" (UID: \"b8c2a8ee-9d90-46c1-907e-78b27718ac68\") " Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.157111 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vzfl\" (UniqueName: \"kubernetes.io/projected/b8c2a8ee-9d90-46c1-907e-78b27718ac68-kube-api-access-4vzfl\") pod \"b8c2a8ee-9d90-46c1-907e-78b27718ac68\" (UID: \"b8c2a8ee-9d90-46c1-907e-78b27718ac68\") " Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.157444 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8c2a8ee-9d90-46c1-907e-78b27718ac68-config" (OuterVolumeSpecName: "config") pod "b8c2a8ee-9d90-46c1-907e-78b27718ac68" (UID: "b8c2a8ee-9d90-46c1-907e-78b27718ac68"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.157663 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8c2a8ee-9d90-46c1-907e-78b27718ac68-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.170414 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8c2a8ee-9d90-46c1-907e-78b27718ac68-kube-api-access-4vzfl" (OuterVolumeSpecName: "kube-api-access-4vzfl") pod "b8c2a8ee-9d90-46c1-907e-78b27718ac68" (UID: "b8c2a8ee-9d90-46c1-907e-78b27718ac68"). InnerVolumeSpecName "kube-api-access-4vzfl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.172540 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.259758 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vzfl\" (UniqueName: \"kubernetes.io/projected/b8c2a8ee-9d90-46c1-907e-78b27718ac68-kube-api-access-4vzfl\") on node \"crc\" DevicePath \"\"" Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.490153 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-9ktcs" Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.566975 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92-dns-svc\") pod \"c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92\" (UID: \"c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92\") " Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.567021 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jbkr\" (UniqueName: \"kubernetes.io/projected/c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92-kube-api-access-4jbkr\") pod \"c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92\" (UID: \"c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92\") " Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.567181 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92-config\") pod \"c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92\" (UID: \"c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92\") " Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.568161 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92-config" (OuterVolumeSpecName: "config") pod 
"c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92" (UID: "c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.569028 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92" (UID: "c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.575177 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92-kube-api-access-4jbkr" (OuterVolumeSpecName: "kube-api-access-4jbkr") pod "c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92" (UID: "c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92"). InnerVolumeSpecName "kube-api-access-4jbkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.669721 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.669980 4918 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.669994 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jbkr\" (UniqueName: \"kubernetes.io/projected/c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92-kube-api-access-4jbkr\") on node \"crc\" DevicePath \"\"" Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.741764 4918 generic.go:334] "Generic (PLEG): container finished" 
podID="0ee23ae1-641f-43be-a41f-2065671c4534" containerID="ef48521b8e1c732fc9a73a10fbde0aee0c7690c02fd3dafc004151470c84103d" exitCode=0 Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.741840 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bvjg9" event={"ID":"0ee23ae1-641f-43be-a41f-2065671c4534","Type":"ContainerDied","Data":"ef48521b8e1c732fc9a73a10fbde0aee0c7690c02fd3dafc004151470c84103d"} Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.745302 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.756568 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.764287 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4g569"] Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.774962 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.785287 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 16:58:33 crc kubenswrapper[4918]: W0319 16:58:33.796167 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd357519_ae6b_45ec_a8e1_dfc0c060be13.slice/crio-de6303734e230f36571382530579f99533db6f74a478496038a60443560be90c WatchSource:0}: Error finding container de6303734e230f36571382530579f99533db6f74a478496038a60443560be90c: Status 404 returned error can't find the container with id de6303734e230f36571382530579f99533db6f74a478496038a60443560be90c Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.799580 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"849ee593-de3d-4343-8a63-3ca581fbbaaf","Type":"ContainerStarted","Data":"01912381b0f1752f47c92a24234abee24d01d24cce9470553e9a3a83ea942d22"} Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.806400 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-9ktcs" event={"ID":"c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92","Type":"ContainerDied","Data":"3fc9791dfdf857c76c8f59024b325783d531f3a17ec87f60376e8eecc13bcdaf"} Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.806478 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-9ktcs" Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.808774 4918 generic.go:334] "Generic (PLEG): container finished" podID="91884a89-5ccc-40aa-953a-f1cef948a1f9" containerID="843bc0131bbd2decd984e20061830bbf9e39a023abc020a294645a5d8f12d166" exitCode=0 Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.808824 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-gvgwb" event={"ID":"91884a89-5ccc-40aa-953a-f1cef948a1f9","Type":"ContainerDied","Data":"843bc0131bbd2decd984e20061830bbf9e39a023abc020a294645a5d8f12d166"} Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.811707 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-4ngxj" event={"ID":"b8c2a8ee-9d90-46c1-907e-78b27718ac68","Type":"ContainerDied","Data":"67cfe2c5de2cdc3f9139d5204c5af607138037a9ace6bb2641b5cbe88e413a71"} Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.811788 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-4ngxj" Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.821101 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"22f16181-1900-453e-a97a-d3da7960a1cf","Type":"ContainerStarted","Data":"2a9ae6b5ef63c29f8740c883ed5d4146d90904cfdd7ef7d38dfb2de4f899bb2a"} Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.822879 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"049bc86c-2172-4f37-b7b4-20e546c273e4","Type":"ContainerStarted","Data":"fba516ba3b8734de1a75f21837a3b329a1318b670440ca5f7bff899f1e78de3c"} Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.891261 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4ngxj"] Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.909189 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4ngxj"] Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.927597 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9ktcs"] Mar 19 16:58:33 crc kubenswrapper[4918]: I0319 16:58:33.931855 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9ktcs"] Mar 19 16:58:34 crc kubenswrapper[4918]: E0319 16:58:34.023489 4918 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 19 16:58:34 crc kubenswrapper[4918]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/91884a89-5ccc-40aa-953a-f1cef948a1f9/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 19 16:58:34 crc kubenswrapper[4918]: > podSandboxID="d8068de5ca57c772811db54781406410524709cd8c17fc21912325326be3336d" Mar 19 16:58:34 crc kubenswrapper[4918]: E0319 16:58:34.023642 4918 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 16:58:34 
crc kubenswrapper[4918]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z84ch,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-gvgwb_openstack(91884a89-5ccc-40aa-953a-f1cef948a1f9): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/91884a89-5ccc-40aa-953a-f1cef948a1f9/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 19 16:58:34 crc kubenswrapper[4918]: > logger="UnhandledError" Mar 19 16:58:34 crc kubenswrapper[4918]: E0319 16:58:34.024690 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/91884a89-5ccc-40aa-953a-f1cef948a1f9/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5ccc8479f9-gvgwb" podUID="91884a89-5ccc-40aa-953a-f1cef948a1f9" Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.100721 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4"] Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.108148 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cloudkitty-lokistack-compactor-0"] Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.115769 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48"] Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.123344 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Mar 19 16:58:34 crc kubenswrapper[4918]: W0319 16:58:34.124130 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c3e0b77_c556_4efa_91ba_b27926b39aa8.slice/crio-34c41772bd51cd472d49e9c771b9d58341d0ca4698bf9f55ee186c13d649144d WatchSource:0}: Error finding container 34c41772bd51cd472d49e9c771b9d58341d0ca4698bf9f55ee186c13d649144d: Status 404 returned error can't find the container with id 34c41772bd51cd472d49e9c771b9d58341d0ca4698bf9f55ee186c13d649144d Mar 19 16:58:34 crc kubenswrapper[4918]: W0319 16:58:34.124581 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e4b521b_2c5e_466f_8c30_881de9b09a1b.slice/crio-b8649622640fbadd0e038d81ace92e9d521ca6e0109064c712724d8a21354f59 WatchSource:0}: Error finding container b8649622640fbadd0e038d81ace92e9d521ca6e0109064c712724d8a21354f59: Status 404 returned error can't find the container with id b8649622640fbadd0e038d81ace92e9d521ca6e0109064c712724d8a21354f59 Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.147303 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d"] Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.159194 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-6f54889599-ljlbj"] Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.278158 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.292986 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn"] Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.306181 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 16:58:34 crc kubenswrapper[4918]: W0319 16:58:34.308329 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d2b0346_1ed3_4754_9788_e4f469a558e9.slice/crio-d359b48e2aece3318143a1326f8bafb0610b46183f249b317bf73d6b23fec1fc WatchSource:0}: Error finding container d359b48e2aece3318143a1326f8bafb0610b46183f249b317bf73d6b23fec1fc: Status 404 returned error can't find the container with id d359b48e2aece3318143a1326f8bafb0610b46183f249b317bf73d6b23fec1fc Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.326993 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 19 16:58:34 crc kubenswrapper[4918]: E0319 16:58:34.340642 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:loki-querier,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30,Command:[],Args:[-target=querier -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml 
-config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:gossip-ring,HostPort:0,ContainerPort:7946,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:AWS_ACCESS_KEY_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_id,Optional:nil,},},},EnvVar{Name:AWS_ACCESS_KEY_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_secret,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-querier-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-loki-s3,ReadOnly:false,MountPath:/etc/storage/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-querier-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qlbld,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Liven
essProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-querier-668f98fdd7-xzqfn_openstack(2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 16:58:34 crc kubenswrapper[4918]: E0319 16:58:34.342470 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-querier\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn" podUID="2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089" Mar 19 16:58:34 crc kubenswrapper[4918]: W0319 16:58:34.350380 4918 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e1232f6_b41e_443e_b96e_e38929f077d4.slice/crio-01265fea588dc49bb9edb56e30c1fba11bf66cb091cc102b3a4169a211ab6c6b WatchSource:0}: Error finding container 01265fea588dc49bb9edb56e30c1fba11bf66cb091cc102b3a4169a211ab6c6b: Status 404 returned error can't find the container with id 01265fea588dc49bb9edb56e30c1fba11bf66cb091cc102b3a4169a211ab6c6b Mar 19 16:58:34 crc kubenswrapper[4918]: E0319 16:58:34.358171 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovsdbserver-nb,Image:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n646h565h5hdfhd4h65bh58bh5c6h564hc5h95h699h658h8bhbdh76hb4h56h5c5h5b9h5c7hc6h585h5bdh5fh5c8h5f8h5dh87h678h679hfcq,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-nb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs
,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qhkr6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof 
ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-nb-0_openstack(5e1232f6-b41e-443e-b96e-e38929f077d4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 16:58:34 crc kubenswrapper[4918]: E0319 16:58:34.361280 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openstack-network-exporter,Image:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,Command:[/app/openstack-network-exporter],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPENSTACK_NETWORK_EXPORTER_YAML,Value:/etc/config/openstack-network-exporter.yaml,ValueFrom:nil,},EnvVar{Name:CONFIG_HASH,Value:n646h565h5hdfhd4h65bh58bh5c6h564hc5h95h699h658h8bhbdh76hb4h56h5c5h5b9h5c7hc6h585h5bdh5fh5c8h5f8h5dh87h678h679hfcq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovnmetrics.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovnmetrics.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qhkr6,ReadOnly:true,MountPath:/var/run/secrets/kuberne
tes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-nb-0_openstack(5e1232f6-b41e-443e-b96e-e38929f077d4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 16:58:34 crc kubenswrapper[4918]: E0319 16:58:34.362387 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ovsdbserver-nb\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"openstack-network-exporter\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack/ovsdbserver-nb-0" podUID="5e1232f6-b41e-443e-b96e-e38929f077d4" Mar 19 16:58:34 crc kubenswrapper[4918]: W0319 16:58:34.363478 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1818f96e_6152_49a9_b6fc_726d7677112c.slice/crio-deb37330da6ea86c05534aee9ad5c8a356cebcef93fed0788365e7a6f2d30a5a WatchSource:0}: Error finding container deb37330da6ea86c05534aee9ad5c8a356cebcef93fed0788365e7a6f2d30a5a: Status 404 returned error can't find the container with id deb37330da6ea86c05534aee9ad5c8a356cebcef93fed0788365e7a6f2d30a5a Mar 19 16:58:34 crc kubenswrapper[4918]: E0319 16:58:34.366377 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7sndc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
kube-state-metrics-0_openstack(1818f96e-6152-49a9-b6fc-726d7677112c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 16:58:34 crc kubenswrapper[4918]: E0319 16:58:34.367445 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack/kube-state-metrics-0" podUID="1818f96e-6152-49a9-b6fc-726d7677112c" Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.606133 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8c2a8ee-9d90-46c1-907e-78b27718ac68" path="/var/lib/kubelet/pods/b8c2a8ee-9d90-46c1-907e-78b27718ac68/volumes" Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.606998 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92" path="/var/lib/kubelet/pods/c83b87b2-2fa8-4cd5-8496-2e38f6ea9f92/volumes" Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.842691 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4g569" event={"ID":"ddfeeb53-dd69-430f-9460-fa20627d4d26","Type":"ContainerStarted","Data":"811d3f99be24cd699e65942d429a9eb4228556f5eb41eb0262f5cf5356e9448b"} Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.844360 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" event={"ID":"6e4b521b-2c5e-466f-8c30-881de9b09a1b","Type":"ContainerStarted","Data":"b8649622640fbadd0e038d81ace92e9d521ca6e0109064c712724d8a21354f59"} Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.846628 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bvjg9" event={"ID":"0ee23ae1-641f-43be-a41f-2065671c4534","Type":"ContainerStarted","Data":"47ef84d4d6bb4ffb616f92bc828e1f72b86e6c18644fe2971ffde73668300bfb"} Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.846795 4918 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-bvjg9" Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.847944 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"142e9778-542e-491b-95f2-8a63e76c4271","Type":"ContainerStarted","Data":"1516b23347040cf10424026c61db8a1651db87d77e6bae8625ce024624270678"} Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.849697 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"70a17e2e-15ff-4992-882c-b626fc8b94b6","Type":"ContainerStarted","Data":"90d1d49166702a59c5bc504067504b36d7fa56e30e8cb9b607e7a43f715f88b6"} Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.852131 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" event={"ID":"fdeccb80-0736-4fb2-b8e9-17a7317865cb","Type":"ContainerStarted","Data":"e9577a1649c4e56d41928b8a638305be8545f6b9795a256ceb30977510b75a06"} Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.853914 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4" event={"ID":"6c3e0b77-c556-4efa-91ba-b27926b39aa8","Type":"ContainerStarted","Data":"34c41772bd51cd472d49e9c771b9d58341d0ca4698bf9f55ee186c13d649144d"} Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.854913 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn" event={"ID":"2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089","Type":"ContainerStarted","Data":"1a9dfac11b405a6ca1dcf92914267572e696cec14bbe1ca544d4c269ba1ffa46"} Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.857048 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-ljlbj" 
event={"ID":"35defcbc-2979-46e0-8f03-e1cc89f7fd86","Type":"ContainerStarted","Data":"233e0b600c360a1adb752dc690109fd3f651c1dceb2c17cde6a16697110f02e7"} Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.858161 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"2d2b0346-1ed3-4754-9788-e4f469a558e9","Type":"ContainerStarted","Data":"d359b48e2aece3318143a1326f8bafb0610b46183f249b317bf73d6b23fec1fc"} Mar 19 16:58:34 crc kubenswrapper[4918]: E0319 16:58:34.858763 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-querier\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30\\\"\"" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn" podUID="2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089" Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.861656 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1818f96e-6152-49a9-b6fc-726d7677112c","Type":"ContainerStarted","Data":"deb37330da6ea86c05534aee9ad5c8a356cebcef93fed0788365e7a6f2d30a5a"} Mar 19 16:58:34 crc kubenswrapper[4918]: E0319 16:58:34.863804 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="1818f96e-6152-49a9-b6fc-726d7677112c" Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.864377 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5e1232f6-b41e-443e-b96e-e38929f077d4","Type":"ContainerStarted","Data":"01265fea588dc49bb9edb56e30c1fba11bf66cb091cc102b3a4169a211ab6c6b"} Mar 19 16:58:34 crc kubenswrapper[4918]: E0319 
16:58:34.865970 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified\\\"\", failed to \"StartContainer\" for \"openstack-network-exporter\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified\\\"\"]" pod="openstack/ovsdbserver-nb-0" podUID="5e1232f6-b41e-443e-b96e-e38929f077d4" Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.866362 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-bvjg9" podStartSLOduration=7.6420043589999995 podStartE2EDuration="18.866325377s" podCreationTimestamp="2026-03-19 16:58:16 +0000 UTC" firstStartedPulling="2026-03-19 16:58:21.690016521 +0000 UTC m=+1113.812215779" lastFinishedPulling="2026-03-19 16:58:32.914337549 +0000 UTC m=+1125.036536797" observedRunningTime="2026-03-19 16:58:34.864560247 +0000 UTC m=+1126.986759495" watchObservedRunningTime="2026-03-19 16:58:34.866325377 +0000 UTC m=+1126.988524625" Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.868141 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"fd357519-ae6b-45ec-a8e1-dfc0c060be13","Type":"ContainerStarted","Data":"de6303734e230f36571382530579f99533db6f74a478496038a60443560be90c"} Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.869358 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d0eab88c-d33a-4032-b2f7-f2a355157d81","Type":"ContainerStarted","Data":"fa24f2ccd871bd03436875f633f41445b8046907f4f6c14c1a9b4848a9d5b7d3"} Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.871294 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"08c86067-0c7f-47a2-a2d4-e29ad43c539f","Type":"ContainerStarted","Data":"c10d05b33949e842790ec76aaadb87bcce31a152a1e46cf78cbd45604476deaa"} Mar 19 16:58:34 crc kubenswrapper[4918]: I0319 16:58:34.873468 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"31a9da2c-83a7-408e-bae2-66a7097081ff","Type":"ContainerStarted","Data":"fe68561d31bd0225cc9b4ae2a7a8be73b02cfb5d64c0268d10c17123a46ba873"} Mar 19 16:58:35 crc kubenswrapper[4918]: I0319 16:58:35.075479 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 19 16:58:35 crc kubenswrapper[4918]: I0319 16:58:35.380386 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-kt2zs"] Mar 19 16:58:35 crc kubenswrapper[4918]: E0319 16:58:35.883580 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-querier\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:40a6ef5d62dd6bcd82f3a965d0e00bb5f500b88724f9bc3b06103f1402543b30\\\"\"" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn" podUID="2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089" Mar 19 16:58:35 crc kubenswrapper[4918]: E0319 16:58:35.884088 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="1818f96e-6152-49a9-b6fc-726d7677112c" Mar 19 16:58:35 crc kubenswrapper[4918]: E0319 16:58:35.884634 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified\\\"\", failed to \"StartContainer\" for 
\"openstack-network-exporter\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified\\\"\"]" pod="openstack/ovsdbserver-nb-0" podUID="5e1232f6-b41e-443e-b96e-e38929f077d4" Mar 19 16:58:36 crc kubenswrapper[4918]: I0319 16:58:36.924820 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-gvgwb" event={"ID":"91884a89-5ccc-40aa-953a-f1cef948a1f9","Type":"ContainerStarted","Data":"b7ea86a9db80002084ec6b0d52798ca7a34b3beaaf8804c5766aaa1417ac0a95"} Mar 19 16:58:36 crc kubenswrapper[4918]: I0319 16:58:36.926291 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-gvgwb" Mar 19 16:58:36 crc kubenswrapper[4918]: I0319 16:58:36.942539 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-gvgwb" podStartSLOduration=5.214086643 podStartE2EDuration="20.942507043s" podCreationTimestamp="2026-03-19 16:58:16 +0000 UTC" firstStartedPulling="2026-03-19 16:58:17.084375013 +0000 UTC m=+1109.206574261" lastFinishedPulling="2026-03-19 16:58:32.812795413 +0000 UTC m=+1124.934994661" observedRunningTime="2026-03-19 16:58:36.94098759 +0000 UTC m=+1129.063186838" watchObservedRunningTime="2026-03-19 16:58:36.942507043 +0000 UTC m=+1129.064706291" Mar 19 16:58:39 crc kubenswrapper[4918]: I0319 16:58:39.953835 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fc6b3398-7f5f-4485-9826-fbb92f8f26e2","Type":"ContainerStarted","Data":"6e1da711011513befec3656b92a2e4b07061e2a97d73bfe77a6714c4471530c0"} Mar 19 16:58:41 crc kubenswrapper[4918]: W0319 16:58:41.373316 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13525212_7d91_453f_a80d_2e6a8febb21e.slice/crio-de2463c4748f3a4709f1fd15d60884fe2135af3c24bed89b9e6e83f231344262 WatchSource:0}: Error 
finding container de2463c4748f3a4709f1fd15d60884fe2135af3c24bed89b9e6e83f231344262: Status 404 returned error can't find the container with id de2463c4748f3a4709f1fd15d60884fe2135af3c24bed89b9e6e83f231344262 Mar 19 16:58:41 crc kubenswrapper[4918]: I0319 16:58:41.572543 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc8479f9-gvgwb" Mar 19 16:58:41 crc kubenswrapper[4918]: I0319 16:58:41.972512 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kt2zs" event={"ID":"13525212-7d91-453f-a80d-2e6a8febb21e","Type":"ContainerStarted","Data":"de2463c4748f3a4709f1fd15d60884fe2135af3c24bed89b9e6e83f231344262"} Mar 19 16:58:41 crc kubenswrapper[4918]: I0319 16:58:41.983718 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-bvjg9" Mar 19 16:58:42 crc kubenswrapper[4918]: I0319 16:58:42.036875 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-gvgwb"] Mar 19 16:58:42 crc kubenswrapper[4918]: I0319 16:58:42.037073 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-gvgwb" podUID="91884a89-5ccc-40aa-953a-f1cef948a1f9" containerName="dnsmasq-dns" containerID="cri-o://b7ea86a9db80002084ec6b0d52798ca7a34b3beaaf8804c5766aaa1417ac0a95" gracePeriod=10 Mar 19 16:58:42 crc kubenswrapper[4918]: I0319 16:58:42.980364 4918 generic.go:334] "Generic (PLEG): container finished" podID="91884a89-5ccc-40aa-953a-f1cef948a1f9" containerID="b7ea86a9db80002084ec6b0d52798ca7a34b3beaaf8804c5766aaa1417ac0a95" exitCode=0 Mar 19 16:58:42 crc kubenswrapper[4918]: I0319 16:58:42.980416 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-gvgwb" event={"ID":"91884a89-5ccc-40aa-953a-f1cef948a1f9","Type":"ContainerDied","Data":"b7ea86a9db80002084ec6b0d52798ca7a34b3beaaf8804c5766aaa1417ac0a95"} Mar 19 16:58:46 crc 
kubenswrapper[4918]: I0319 16:58:46.570412 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5ccc8479f9-gvgwb" podUID="91884a89-5ccc-40aa-953a-f1cef948a1f9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.108:5353: connect: connection refused" Mar 19 16:58:48 crc kubenswrapper[4918]: I0319 16:58:48.531195 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-gvgwb" Mar 19 16:58:48 crc kubenswrapper[4918]: I0319 16:58:48.667403 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z84ch\" (UniqueName: \"kubernetes.io/projected/91884a89-5ccc-40aa-953a-f1cef948a1f9-kube-api-access-z84ch\") pod \"91884a89-5ccc-40aa-953a-f1cef948a1f9\" (UID: \"91884a89-5ccc-40aa-953a-f1cef948a1f9\") " Mar 19 16:58:48 crc kubenswrapper[4918]: I0319 16:58:48.667455 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91884a89-5ccc-40aa-953a-f1cef948a1f9-dns-svc\") pod \"91884a89-5ccc-40aa-953a-f1cef948a1f9\" (UID: \"91884a89-5ccc-40aa-953a-f1cef948a1f9\") " Mar 19 16:58:48 crc kubenswrapper[4918]: I0319 16:58:48.667492 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91884a89-5ccc-40aa-953a-f1cef948a1f9-config\") pod \"91884a89-5ccc-40aa-953a-f1cef948a1f9\" (UID: \"91884a89-5ccc-40aa-953a-f1cef948a1f9\") " Mar 19 16:58:48 crc kubenswrapper[4918]: I0319 16:58:48.672298 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91884a89-5ccc-40aa-953a-f1cef948a1f9-kube-api-access-z84ch" (OuterVolumeSpecName: "kube-api-access-z84ch") pod "91884a89-5ccc-40aa-953a-f1cef948a1f9" (UID: "91884a89-5ccc-40aa-953a-f1cef948a1f9"). InnerVolumeSpecName "kube-api-access-z84ch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:58:48 crc kubenswrapper[4918]: I0319 16:58:48.701289 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91884a89-5ccc-40aa-953a-f1cef948a1f9-config" (OuterVolumeSpecName: "config") pod "91884a89-5ccc-40aa-953a-f1cef948a1f9" (UID: "91884a89-5ccc-40aa-953a-f1cef948a1f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:58:48 crc kubenswrapper[4918]: I0319 16:58:48.702948 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91884a89-5ccc-40aa-953a-f1cef948a1f9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "91884a89-5ccc-40aa-953a-f1cef948a1f9" (UID: "91884a89-5ccc-40aa-953a-f1cef948a1f9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:58:48 crc kubenswrapper[4918]: I0319 16:58:48.770803 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z84ch\" (UniqueName: \"kubernetes.io/projected/91884a89-5ccc-40aa-953a-f1cef948a1f9-kube-api-access-z84ch\") on node \"crc\" DevicePath \"\"" Mar 19 16:58:48 crc kubenswrapper[4918]: I0319 16:58:48.771225 4918 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91884a89-5ccc-40aa-953a-f1cef948a1f9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 16:58:48 crc kubenswrapper[4918]: I0319 16:58:48.771255 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91884a89-5ccc-40aa-953a-f1cef948a1f9-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:58:49 crc kubenswrapper[4918]: I0319 16:58:49.056425 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-gvgwb" event={"ID":"91884a89-5ccc-40aa-953a-f1cef948a1f9","Type":"ContainerDied","Data":"d8068de5ca57c772811db54781406410524709cd8c17fc21912325326be3336d"} Mar 
19 16:58:49 crc kubenswrapper[4918]: I0319 16:58:49.056560 4918 scope.go:117] "RemoveContainer" containerID="b7ea86a9db80002084ec6b0d52798ca7a34b3beaaf8804c5766aaa1417ac0a95"
Mar 19 16:58:49 crc kubenswrapper[4918]: I0319 16:58:49.056941 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-gvgwb"
Mar 19 16:58:49 crc kubenswrapper[4918]: I0319 16:58:49.087777 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-gvgwb"]
Mar 19 16:58:49 crc kubenswrapper[4918]: I0319 16:58:49.093272 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-gvgwb"]
Mar 19 16:58:49 crc kubenswrapper[4918]: I0319 16:58:49.285098 4918 scope.go:117] "RemoveContainer" containerID="843bc0131bbd2decd984e20061830bbf9e39a023abc020a294645a5d8f12d166"
Mar 19 16:58:50 crc kubenswrapper[4918]: I0319 16:58:50.065596 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"22f16181-1900-453e-a97a-d3da7960a1cf","Type":"ContainerStarted","Data":"cf7c3ba42f63f4fe602cb18c525732e5cc1382335080009c548cf1363434e56b"}
Mar 19 16:58:50 crc kubenswrapper[4918]: I0319 16:58:50.067399 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"70a17e2e-15ff-4992-882c-b626fc8b94b6","Type":"ContainerStarted","Data":"222975b3f30aae50e82475481cc80759438131d60ff2cae05878777f973d1570"}
Mar 19 16:58:50 crc kubenswrapper[4918]: I0319 16:58:50.067537 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-compactor-0"
Mar 19 16:58:50 crc kubenswrapper[4918]: I0319 16:58:50.070242 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d0eab88c-d33a-4032-b2f7-f2a355157d81","Type":"ContainerStarted","Data":"8d061a3ef6173911dd3124e4a30e830b1f94cc10e18275e87bbe7fb1019e4aeb"}
Mar 19 16:58:50 crc kubenswrapper[4918]: I0319 16:58:50.070361 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Mar 19 16:58:50 crc kubenswrapper[4918]: I0319 16:58:50.105237 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-compactor-0" podStartSLOduration=5.445252311 podStartE2EDuration="20.105218813s" podCreationTimestamp="2026-03-19 16:58:30 +0000 UTC" firstStartedPulling="2026-03-19 16:58:34.148229781 +0000 UTC m=+1126.270429019" lastFinishedPulling="2026-03-19 16:58:48.808196273 +0000 UTC m=+1140.930395521" observedRunningTime="2026-03-19 16:58:50.102343253 +0000 UTC m=+1142.224542521" watchObservedRunningTime="2026-03-19 16:58:50.105218813 +0000 UTC m=+1142.227418061"
Mar 19 16:58:50 crc kubenswrapper[4918]: I0319 16:58:50.124551 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=15.811314626 podStartE2EDuration="30.124532944s" podCreationTimestamp="2026-03-19 16:58:20 +0000 UTC" firstStartedPulling="2026-03-19 16:58:33.789289791 +0000 UTC m=+1125.911489039" lastFinishedPulling="2026-03-19 16:58:48.102508109 +0000 UTC m=+1140.224707357" observedRunningTime="2026-03-19 16:58:50.121176722 +0000 UTC m=+1142.243375990" watchObservedRunningTime="2026-03-19 16:58:50.124532944 +0000 UTC m=+1142.246732192"
Mar 19 16:58:50 crc kubenswrapper[4918]: I0319 16:58:50.599483 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91884a89-5ccc-40aa-953a-f1cef948a1f9" path="/var/lib/kubelet/pods/91884a89-5ccc-40aa-953a-f1cef948a1f9/volumes"
Mar 19 16:58:50 crc kubenswrapper[4918]: I0319 16:58:50.811406 4918 scope.go:117] "RemoveContainer" containerID="88237b91668b89f0a6d53af7b3d7c0223f45c777225b06d07a111ac2339f6426"
Mar 19 16:58:51 crc kubenswrapper[4918]: I0319 16:58:51.087571 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4g569" event={"ID":"ddfeeb53-dd69-430f-9460-fa20627d4d26","Type":"ContainerStarted","Data":"47588a54bf351746bacd0669491262aeea71f2915348945f1df0eb91d6616f1b"}
Mar 19 16:58:51 crc kubenswrapper[4918]: I0319 16:58:51.087954 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-4g569"
Mar 19 16:58:51 crc kubenswrapper[4918]: I0319 16:58:51.090688 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"2d2b0346-1ed3-4754-9788-e4f469a558e9","Type":"ContainerStarted","Data":"8490453375e6e6c5a28d79cc38de8ca28af52b57509f8e4a4ddd06aab0782ac2"}
Mar 19 16:58:51 crc kubenswrapper[4918]: I0319 16:58:51.090898 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-index-gateway-0"
Mar 19 16:58:51 crc kubenswrapper[4918]: I0319 16:58:51.108221 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-4g569" podStartSLOduration=10.470355916 podStartE2EDuration="25.10820601s" podCreationTimestamp="2026-03-19 16:58:26 +0000 UTC" firstStartedPulling="2026-03-19 16:58:33.801508307 +0000 UTC m=+1125.923707555" lastFinishedPulling="2026-03-19 16:58:48.439358411 +0000 UTC m=+1140.561557649" observedRunningTime="2026-03-19 16:58:51.107981974 +0000 UTC m=+1143.230181232" watchObservedRunningTime="2026-03-19 16:58:51.10820601 +0000 UTC m=+1143.230405258"
Mar 19 16:58:51 crc kubenswrapper[4918]: I0319 16:58:51.114917 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" event={"ID":"fdeccb80-0736-4fb2-b8e9-17a7317865cb","Type":"ContainerStarted","Data":"5ce0b6181daabefce3ddf4f6d503554198f861912f6e245b7b4e3178f3a2785e"}
Mar 19 16:58:51 crc kubenswrapper[4918]: I0319 16:58:51.117362 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d"
Mar 19 16:58:51 crc kubenswrapper[4918]: I0319 16:58:51.120815 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"142e9778-542e-491b-95f2-8a63e76c4271","Type":"ContainerStarted","Data":"afd287178426a13a2870cca43ff0882b2937b13882363029b35d9028cf9b9778"}
Mar 19 16:58:51 crc kubenswrapper[4918]: I0319 16:58:51.121561 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-ingester-0"
Mar 19 16:58:51 crc kubenswrapper[4918]: I0319 16:58:51.126040 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-ljlbj" event={"ID":"35defcbc-2979-46e0-8f03-e1cc89f7fd86","Type":"ContainerStarted","Data":"492cc30dc8a038ee3bc9694a31333fefdfcc8174a1747c472ea98f3bbbc83d67"}
Mar 19 16:58:51 crc kubenswrapper[4918]: I0319 16:58:51.126200 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-ljlbj"
Mar 19 16:58:51 crc kubenswrapper[4918]: I0319 16:58:51.131726 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-index-gateway-0" podStartSLOduration=6.947874831 podStartE2EDuration="21.131707417s" podCreationTimestamp="2026-03-19 16:58:30 +0000 UTC" firstStartedPulling="2026-03-19 16:58:34.312444301 +0000 UTC m=+1126.434643549" lastFinishedPulling="2026-03-19 16:58:48.496276877 +0000 UTC m=+1140.618476135" observedRunningTime="2026-03-19 16:58:51.129856465 +0000 UTC m=+1143.252055713" watchObservedRunningTime="2026-03-19 16:58:51.131707417 +0000 UTC m=+1143.253906665"
Mar 19 16:58:51 crc kubenswrapper[4918]: I0319 16:58:51.143989 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"31a9da2c-83a7-408e-bae2-66a7097081ff","Type":"ContainerStarted","Data":"f69a576d43a003f077a373b93182d4866b7fe77c45cc0f98f71e8c237fcd6f59"}
Mar 19 16:58:51 crc kubenswrapper[4918]: I0319 16:58:51.147383 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d"
Mar 19 16:58:51 crc kubenswrapper[4918]: I0319 16:58:51.149504 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4" event={"ID":"6c3e0b77-c556-4efa-91ba-b27926b39aa8","Type":"ContainerStarted","Data":"554abf89c41937cc2b21bd0edcc6bcafa543a7aa9e925f6c2c181ef580d07d92"}
Mar 19 16:58:51 crc kubenswrapper[4918]: I0319 16:58:51.149731 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4"
Mar 19 16:58:51 crc kubenswrapper[4918]: I0319 16:58:51.167140 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"849ee593-de3d-4343-8a63-3ca581fbbaaf","Type":"ContainerStarted","Data":"8dcfdc5ec03c9e57cd90b1af5c63e724835e77f75667c64ea36e6d4de0de6025"}
Mar 19 16:58:51 crc kubenswrapper[4918]: I0319 16:58:51.170912 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" event={"ID":"6e4b521b-2c5e-466f-8c30-881de9b09a1b","Type":"ContainerStarted","Data":"184ed48a9e6df858da394802520f8e0ac94f4c4cb882c84bf093ae448e673a03"}
Mar 19 16:58:51 crc kubenswrapper[4918]: I0319 16:58:51.171600 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48"
Mar 19 16:58:51 crc kubenswrapper[4918]: I0319 16:58:51.173831 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-ingester-0" podStartSLOduration=6.544665831 podStartE2EDuration="21.173817816s" podCreationTimestamp="2026-03-19 16:58:30 +0000 UTC" firstStartedPulling="2026-03-19 16:58:34.172862318 +0000 UTC m=+1126.295061566" lastFinishedPulling="2026-03-19 16:58:48.802014313 +0000 UTC m=+1140.924213551" observedRunningTime="2026-03-19 16:58:51.149589918 +0000 UTC m=+1143.271789166" watchObservedRunningTime="2026-03-19 16:58:51.173817816 +0000 UTC m=+1143.296017064"
Mar 19 16:58:51 crc kubenswrapper[4918]: I0319 16:58:51.174938 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"049bc86c-2172-4f37-b7b4-20e546c273e4","Type":"ContainerStarted","Data":"3def5dec798f2b559b2334d375c417119107e526427d467805fc3b41126d7aca"}
Mar 19 16:58:51 crc kubenswrapper[4918]: I0319 16:58:51.177383 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kt2zs" event={"ID":"13525212-7d91-453f-a80d-2e6a8febb21e","Type":"ContainerStarted","Data":"55d0251ffd0ff38fcc9ff04d936a493caced314a8dc2053831edf260a31836ec"}
Mar 19 16:58:51 crc kubenswrapper[4918]: I0319 16:58:51.183863 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-ljlbj" podStartSLOduration=6.581755933 podStartE2EDuration="21.183838122s" podCreationTimestamp="2026-03-19 16:58:30 +0000 UTC" firstStartedPulling="2026-03-19 16:58:34.169302511 +0000 UTC m=+1126.291501759" lastFinishedPulling="2026-03-19 16:58:48.77138469 +0000 UTC m=+1140.893583948" observedRunningTime="2026-03-19 16:58:51.17943392 +0000 UTC m=+1143.301633168" watchObservedRunningTime="2026-03-19 16:58:51.183838122 +0000 UTC m=+1143.306037370"
Mar 19 16:58:51 crc kubenswrapper[4918]: I0319 16:58:51.216656 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-55g7d" podStartSLOduration=6.585383193 podStartE2EDuration="21.216636574s" podCreationTimestamp="2026-03-19 16:58:30 +0000 UTC" firstStartedPulling="2026-03-19 16:58:34.15399432 +0000 UTC m=+1126.276193568" lastFinishedPulling="2026-03-19 16:58:48.785247701 +0000 UTC m=+1140.907446949" observedRunningTime="2026-03-19 16:58:51.214034692 +0000 UTC m=+1143.336233940" watchObservedRunningTime="2026-03-19 16:58:51.216636574 +0000 UTC m=+1143.338835822"
Mar 19 16:58:51 crc kubenswrapper[4918]: I0319 16:58:51.223594 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48"
Mar 19 16:58:51 crc kubenswrapper[4918]: I0319 16:58:51.265315 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4" podStartSLOduration=7.612400516 podStartE2EDuration="22.265296593s" podCreationTimestamp="2026-03-19 16:58:29 +0000 UTC" firstStartedPulling="2026-03-19 16:58:34.131153651 +0000 UTC m=+1126.253352899" lastFinishedPulling="2026-03-19 16:58:48.784049708 +0000 UTC m=+1140.906248976" observedRunningTime="2026-03-19 16:58:51.23791033 +0000 UTC m=+1143.360109598" watchObservedRunningTime="2026-03-19 16:58:51.265296593 +0000 UTC m=+1143.387495841"
Mar 19 16:58:51 crc kubenswrapper[4918]: I0319 16:58:51.409033 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-6b884dc4b5-cmv48" podStartSLOduration=6.997007974 podStartE2EDuration="21.40901431s" podCreationTimestamp="2026-03-19 16:58:30 +0000 UTC" firstStartedPulling="2026-03-19 16:58:34.148633372 +0000 UTC m=+1126.270832620" lastFinishedPulling="2026-03-19 16:58:48.560639718 +0000 UTC m=+1140.682838956" observedRunningTime="2026-03-19 16:58:51.387177008 +0000 UTC m=+1143.509376256" watchObservedRunningTime="2026-03-19 16:58:51.40901431 +0000 UTC m=+1143.531213558"
Mar 19 16:58:52 crc kubenswrapper[4918]: I0319 16:58:52.197330 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"08c86067-0c7f-47a2-a2d4-e29ad43c539f","Type":"ContainerStarted","Data":"c370355f645a1c2288afd62e6eb7c991dca804e2751677e640a89aed254fedea"}
Mar 19 16:58:52 crc kubenswrapper[4918]: I0319 16:58:52.201386 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fc6b3398-7f5f-4485-9826-fbb92f8f26e2","Type":"ContainerStarted","Data":"017dd0848b2df9f07ed374e13faf917e49260cec8d3f463c2ce64405efda6896"}
Mar 19 16:58:52 crc kubenswrapper[4918]: I0319 16:58:52.206907 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5e1232f6-b41e-443e-b96e-e38929f077d4","Type":"ContainerStarted","Data":"de1b0ec4f6041a3bece53d0a8d81d520682f3c5ab6ae7d79ffbc677e0a011916"}
Mar 19 16:58:52 crc kubenswrapper[4918]: I0319 16:58:52.208530 4918 generic.go:334] "Generic (PLEG): container finished" podID="13525212-7d91-453f-a80d-2e6a8febb21e" containerID="55d0251ffd0ff38fcc9ff04d936a493caced314a8dc2053831edf260a31836ec" exitCode=0
Mar 19 16:58:52 crc kubenswrapper[4918]: I0319 16:58:52.208600 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kt2zs" event={"ID":"13525212-7d91-453f-a80d-2e6a8febb21e","Type":"ContainerDied","Data":"55d0251ffd0ff38fcc9ff04d936a493caced314a8dc2053831edf260a31836ec"}
Mar 19 16:58:52 crc kubenswrapper[4918]: I0319 16:58:52.214420 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn" event={"ID":"2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089","Type":"ContainerStarted","Data":"2b581c4904ccdebe992f20d916eab7f57f04ac764fce0eba232075111a7fc7c1"}
Mar 19 16:58:52 crc kubenswrapper[4918]: I0319 16:58:52.214905 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn"
Mar 19 16:58:52 crc kubenswrapper[4918]: I0319 16:58:52.273504 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn" podStartSLOduration=-9223372014.581287 podStartE2EDuration="22.273487603s" podCreationTimestamp="2026-03-19 16:58:30 +0000 UTC" firstStartedPulling="2026-03-19 16:58:34.340483902 +0000 UTC m=+1126.462683150" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:58:52.268593979 +0000 UTC m=+1144.390793227" watchObservedRunningTime="2026-03-19 16:58:52.273487603 +0000 UTC m=+1144.395686851"
Mar 19 16:58:53 crc kubenswrapper[4918]: I0319 16:58:53.228240 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"fd357519-ae6b-45ec-a8e1-dfc0c060be13","Type":"ContainerStarted","Data":"e32b66410d5e1405edf7b55f469a1fa2ce8d427db89bf245dfb33b9b25d7ca90"}
Mar 19 16:58:55 crc kubenswrapper[4918]: I0319 16:58:55.248675 4918 generic.go:334] "Generic (PLEG): container finished" podID="22f16181-1900-453e-a97a-d3da7960a1cf" containerID="cf7c3ba42f63f4fe602cb18c525732e5cc1382335080009c548cf1363434e56b" exitCode=0
Mar 19 16:58:55 crc kubenswrapper[4918]: I0319 16:58:55.248806 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"22f16181-1900-453e-a97a-d3da7960a1cf","Type":"ContainerDied","Data":"cf7c3ba42f63f4fe602cb18c525732e5cc1382335080009c548cf1363434e56b"}
Mar 19 16:58:55 crc kubenswrapper[4918]: I0319 16:58:55.934682 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Mar 19 16:58:56 crc kubenswrapper[4918]: I0319 16:58:56.256793 4918 generic.go:334] "Generic (PLEG): container finished" podID="31a9da2c-83a7-408e-bae2-66a7097081ff" containerID="f69a576d43a003f077a373b93182d4866b7fe77c45cc0f98f71e8c237fcd6f59" exitCode=0
Mar 19 16:58:56 crc kubenswrapper[4918]: I0319 16:58:56.256939 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"31a9da2c-83a7-408e-bae2-66a7097081ff","Type":"ContainerDied","Data":"f69a576d43a003f077a373b93182d4866b7fe77c45cc0f98f71e8c237fcd6f59"}
Mar 19 16:58:57 crc kubenswrapper[4918]: I0319 16:58:57.287705 4918 generic.go:334] "Generic (PLEG): container finished" podID="08c86067-0c7f-47a2-a2d4-e29ad43c539f" containerID="c370355f645a1c2288afd62e6eb7c991dca804e2751677e640a89aed254fedea" exitCode=0
Mar 19 16:58:57 crc kubenswrapper[4918]: I0319 16:58:57.287821 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"08c86067-0c7f-47a2-a2d4-e29ad43c539f","Type":"ContainerDied","Data":"c370355f645a1c2288afd62e6eb7c991dca804e2751677e640a89aed254fedea"}
Mar 19 16:58:57 crc kubenswrapper[4918]: I0319 16:58:57.290135 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"31a9da2c-83a7-408e-bae2-66a7097081ff","Type":"ContainerStarted","Data":"e7957bde6074c7b0a6ff7321b3dee4f73c82c412e6e403888082ccbc85b869c6"}
Mar 19 16:58:57 crc kubenswrapper[4918]: I0319 16:58:57.294133 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"22f16181-1900-453e-a97a-d3da7960a1cf","Type":"ContainerStarted","Data":"d64fb8434b41257ac7ea4f9f326cdcdf5766d6a1248b46df9843e46a50507532"}
Mar 19 16:58:57 crc kubenswrapper[4918]: I0319 16:58:57.301325 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fc6b3398-7f5f-4485-9826-fbb92f8f26e2","Type":"ContainerStarted","Data":"4852553593ba4c6ab51c39f39dfd17ffd7ec9734cebaf6456dfe45068749b987"}
Mar 19 16:58:57 crc kubenswrapper[4918]: I0319 16:58:57.303622 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1818f96e-6152-49a9-b6fc-726d7677112c","Type":"ContainerStarted","Data":"383cee5d641523407392efe99ff20a4a5996ffdb2fa9e7b2a15c5f4fa7771f84"}
Mar 19 16:58:57 crc kubenswrapper[4918]: I0319 16:58:57.303851 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 19 16:58:57 crc kubenswrapper[4918]: I0319 16:58:57.306010 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5e1232f6-b41e-443e-b96e-e38929f077d4","Type":"ContainerStarted","Data":"4ad0f178cb0573e502509b24143122d6533c0b128e3416f8d7517364a9612abb"}
Mar 19 16:58:57 crc kubenswrapper[4918]: I0319 16:58:57.308623 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kt2zs" event={"ID":"13525212-7d91-453f-a80d-2e6a8febb21e","Type":"ContainerStarted","Data":"427769ecdff7c758060cd7af82d5e25ceb3a7aafcaf2ca440f73887711787b2c"}
Mar 19 16:58:57 crc kubenswrapper[4918]: I0319 16:58:57.308671 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kt2zs" event={"ID":"13525212-7d91-453f-a80d-2e6a8febb21e","Type":"ContainerStarted","Data":"baf09b59fb16614f319c6f6d998a021456f81224d437ea6daf1921921d7530dd"}
Mar 19 16:58:57 crc kubenswrapper[4918]: I0319 16:58:57.309478 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-kt2zs"
Mar 19 16:58:57 crc kubenswrapper[4918]: I0319 16:58:57.309511 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-kt2zs"
Mar 19 16:58:57 crc kubenswrapper[4918]: I0319 16:58:57.367706 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=10.624777592000001 podStartE2EDuration="32.367681139s" podCreationTimestamp="2026-03-19 16:58:25 +0000 UTC" firstStartedPulling="2026-03-19 16:58:34.358060787 +0000 UTC m=+1126.480260035" lastFinishedPulling="2026-03-19 16:58:56.100964324 +0000 UTC m=+1148.223163582" observedRunningTime="2026-03-19 16:58:57.349987003 +0000 UTC m=+1149.472186281" watchObservedRunningTime="2026-03-19 16:58:57.367681139 +0000 UTC m=+1149.489880407"
Mar 19 16:58:57 crc kubenswrapper[4918]: I0319 16:58:57.382569 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-kt2zs" podStartSLOduration=24.219237951 podStartE2EDuration="31.382547229s" podCreationTimestamp="2026-03-19 16:58:26 +0000 UTC" firstStartedPulling="2026-03-19 16:58:41.397293029 +0000 UTC m=+1133.519492277" lastFinishedPulling="2026-03-19 16:58:48.560602307 +0000 UTC m=+1140.682801555" observedRunningTime="2026-03-19 16:58:57.379566707 +0000 UTC m=+1149.501765975" watchObservedRunningTime="2026-03-19 16:58:57.382547229 +0000 UTC m=+1149.504746487"
Mar 19 16:58:57 crc kubenswrapper[4918]: I0319 16:58:57.407009 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=24.960889142 podStartE2EDuration="40.406989452s" podCreationTimestamp="2026-03-19 16:58:17 +0000 UTC" firstStartedPulling="2026-03-19 16:58:32.993343313 +0000 UTC m=+1125.115542561" lastFinishedPulling="2026-03-19 16:58:48.439443623 +0000 UTC m=+1140.561642871" observedRunningTime="2026-03-19 16:58:57.398482127 +0000 UTC m=+1149.520681385" watchObservedRunningTime="2026-03-19 16:58:57.406989452 +0000 UTC m=+1149.529188700"
Mar 19 16:58:57 crc kubenswrapper[4918]: I0319 16:58:57.427209 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=10.411575148 podStartE2EDuration="27.427194108s" podCreationTimestamp="2026-03-19 16:58:30 +0000 UTC" firstStartedPulling="2026-03-19 16:58:38.972339103 +0000 UTC m=+1131.094538391" lastFinishedPulling="2026-03-19 16:58:55.987958093 +0000 UTC m=+1148.110157351" observedRunningTime="2026-03-19 16:58:57.422368766 +0000 UTC m=+1149.544568014" watchObservedRunningTime="2026-03-19 16:58:57.427194108 +0000 UTC m=+1149.549393356"
Mar 19 16:58:57 crc kubenswrapper[4918]: I0319 16:58:57.446653 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=13.819377079 podStartE2EDuration="35.446632443s" podCreationTimestamp="2026-03-19 16:58:22 +0000 UTC" firstStartedPulling="2026-03-19 16:58:34.366287453 +0000 UTC m=+1126.488486701" lastFinishedPulling="2026-03-19 16:58:55.993542807 +0000 UTC m=+1148.115742065" observedRunningTime="2026-03-19 16:58:57.44581587 +0000 UTC m=+1149.568015118" watchObservedRunningTime="2026-03-19 16:58:57.446632443 +0000 UTC m=+1149.568831681"
Mar 19 16:58:57 crc kubenswrapper[4918]: I0319 16:58:57.469065 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=23.508262519 podStartE2EDuration="38.46904715s" podCreationTimestamp="2026-03-19 16:58:19 +0000 UTC" firstStartedPulling="2026-03-19 16:58:33.801333203 +0000 UTC m=+1125.923532471" lastFinishedPulling="2026-03-19 16:58:48.762117854 +0000 UTC m=+1140.884317102" observedRunningTime="2026-03-19 16:58:57.463318972 +0000 UTC m=+1149.585518260" watchObservedRunningTime="2026-03-19 16:58:57.46904715 +0000 UTC m=+1149.591246398"
Mar 19 16:58:58 crc kubenswrapper[4918]: I0319 16:58:58.212092 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 16:58:58 crc kubenswrapper[4918]: I0319 16:58:58.212174 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.026719 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.069113 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.224665 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.224716 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.325935 4918 generic.go:334] "Generic (PLEG): container finished" podID="fd357519-ae6b-45ec-a8e1-dfc0c060be13" containerID="e32b66410d5e1405edf7b55f469a1fa2ce8d427db89bf245dfb33b9b25d7ca90" exitCode=0
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.326960 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"fd357519-ae6b-45ec-a8e1-dfc0c060be13","Type":"ContainerDied","Data":"e32b66410d5e1405edf7b55f469a1fa2ce8d427db89bf245dfb33b9b25d7ca90"}
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.327004 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.470247 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.771214 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-djfdr"]
Mar 19 16:58:59 crc kubenswrapper[4918]: E0319 16:58:59.771666 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91884a89-5ccc-40aa-953a-f1cef948a1f9" containerName="init"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.771697 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="91884a89-5ccc-40aa-953a-f1cef948a1f9" containerName="init"
Mar 19 16:58:59 crc kubenswrapper[4918]: E0319 16:58:59.771723 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91884a89-5ccc-40aa-953a-f1cef948a1f9" containerName="dnsmasq-dns"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.771731 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="91884a89-5ccc-40aa-953a-f1cef948a1f9" containerName="dnsmasq-dns"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.771997 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="91884a89-5ccc-40aa-953a-f1cef948a1f9" containerName="dnsmasq-dns"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.773181 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-djfdr"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.777963 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-djfdr"]
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.780880 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.807891 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-q7kqm"]
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.809171 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-q7kqm"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.811368 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.863584 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-q7kqm"]
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.866967 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.881756 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c0e4202-80f0-4e80-bf3b-8d78ec655622-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-djfdr\" (UID: \"7c0e4202-80f0-4e80-bf3b-8d78ec655622\") " pod="openstack/dnsmasq-dns-7f896c8c65-djfdr"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.881805 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c0e4202-80f0-4e80-bf3b-8d78ec655622-config\") pod \"dnsmasq-dns-7f896c8c65-djfdr\" (UID: \"7c0e4202-80f0-4e80-bf3b-8d78ec655622\") " pod="openstack/dnsmasq-dns-7f896c8c65-djfdr"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.881893 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9fa74868-5691-4d60-8d10-3e8dc1ddc776-ovn-rundir\") pod \"ovn-controller-metrics-q7kqm\" (UID: \"9fa74868-5691-4d60-8d10-3e8dc1ddc776\") " pod="openstack/ovn-controller-metrics-q7kqm"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.882032 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c0e4202-80f0-4e80-bf3b-8d78ec655622-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-djfdr\" (UID: \"7c0e4202-80f0-4e80-bf3b-8d78ec655622\") " pod="openstack/dnsmasq-dns-7f896c8c65-djfdr"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.882089 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fa74868-5691-4d60-8d10-3e8dc1ddc776-config\") pod \"ovn-controller-metrics-q7kqm\" (UID: \"9fa74868-5691-4d60-8d10-3e8dc1ddc776\") " pod="openstack/ovn-controller-metrics-q7kqm"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.882113 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rbdw\" (UniqueName: \"kubernetes.io/projected/9fa74868-5691-4d60-8d10-3e8dc1ddc776-kube-api-access-9rbdw\") pod \"ovn-controller-metrics-q7kqm\" (UID: \"9fa74868-5691-4d60-8d10-3e8dc1ddc776\") " pod="openstack/ovn-controller-metrics-q7kqm"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.882197 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrrgt\" (UniqueName: \"kubernetes.io/projected/7c0e4202-80f0-4e80-bf3b-8d78ec655622-kube-api-access-rrrgt\") pod \"dnsmasq-dns-7f896c8c65-djfdr\" (UID: \"7c0e4202-80f0-4e80-bf3b-8d78ec655622\") " pod="openstack/dnsmasq-dns-7f896c8c65-djfdr"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.882232 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9fa74868-5691-4d60-8d10-3e8dc1ddc776-ovs-rundir\") pod \"ovn-controller-metrics-q7kqm\" (UID: \"9fa74868-5691-4d60-8d10-3e8dc1ddc776\") " pod="openstack/ovn-controller-metrics-q7kqm"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.882274 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fa74868-5691-4d60-8d10-3e8dc1ddc776-combined-ca-bundle\") pod \"ovn-controller-metrics-q7kqm\" (UID: \"9fa74868-5691-4d60-8d10-3e8dc1ddc776\") " pod="openstack/ovn-controller-metrics-q7kqm"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.882390 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fa74868-5691-4d60-8d10-3e8dc1ddc776-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-q7kqm\" (UID: \"9fa74868-5691-4d60-8d10-3e8dc1ddc776\") " pod="openstack/ovn-controller-metrics-q7kqm"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.924691 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.985025 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9fa74868-5691-4d60-8d10-3e8dc1ddc776-ovs-rundir\") pod \"ovn-controller-metrics-q7kqm\" (UID: \"9fa74868-5691-4d60-8d10-3e8dc1ddc776\") " pod="openstack/ovn-controller-metrics-q7kqm"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.985249 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9fa74868-5691-4d60-8d10-3e8dc1ddc776-ovs-rundir\") pod \"ovn-controller-metrics-q7kqm\" (UID: \"9fa74868-5691-4d60-8d10-3e8dc1ddc776\") " pod="openstack/ovn-controller-metrics-q7kqm"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.985379 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fa74868-5691-4d60-8d10-3e8dc1ddc776-combined-ca-bundle\") pod \"ovn-controller-metrics-q7kqm\" (UID: \"9fa74868-5691-4d60-8d10-3e8dc1ddc776\") " pod="openstack/ovn-controller-metrics-q7kqm"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.985495 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fa74868-5691-4d60-8d10-3e8dc1ddc776-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-q7kqm\" (UID: \"9fa74868-5691-4d60-8d10-3e8dc1ddc776\") " pod="openstack/ovn-controller-metrics-q7kqm"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.985612 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c0e4202-80f0-4e80-bf3b-8d78ec655622-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-djfdr\" (UID: \"7c0e4202-80f0-4e80-bf3b-8d78ec655622\") " pod="openstack/dnsmasq-dns-7f896c8c65-djfdr"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.986161 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c0e4202-80f0-4e80-bf3b-8d78ec655622-config\") pod \"dnsmasq-dns-7f896c8c65-djfdr\" (UID: \"7c0e4202-80f0-4e80-bf3b-8d78ec655622\") " pod="openstack/dnsmasq-dns-7f896c8c65-djfdr"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.986265 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9fa74868-5691-4d60-8d10-3e8dc1ddc776-ovn-rundir\") pod \"ovn-controller-metrics-q7kqm\" (UID: \"9fa74868-5691-4d60-8d10-3e8dc1ddc776\") " pod="openstack/ovn-controller-metrics-q7kqm"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.986312 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c0e4202-80f0-4e80-bf3b-8d78ec655622-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-djfdr\" (UID: \"7c0e4202-80f0-4e80-bf3b-8d78ec655622\") " pod="openstack/dnsmasq-dns-7f896c8c65-djfdr"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.986343 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fa74868-5691-4d60-8d10-3e8dc1ddc776-config\") pod \"ovn-controller-metrics-q7kqm\" (UID: \"9fa74868-5691-4d60-8d10-3e8dc1ddc776\") " pod="openstack/ovn-controller-metrics-q7kqm"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.986363 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rbdw\" (UniqueName: \"kubernetes.io/projected/9fa74868-5691-4d60-8d10-3e8dc1ddc776-kube-api-access-9rbdw\") pod \"ovn-controller-metrics-q7kqm\" (UID: \"9fa74868-5691-4d60-8d10-3e8dc1ddc776\") " pod="openstack/ovn-controller-metrics-q7kqm"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.986449 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrrgt\" (UniqueName: \"kubernetes.io/projected/7c0e4202-80f0-4e80-bf3b-8d78ec655622-kube-api-access-rrrgt\") pod \"dnsmasq-dns-7f896c8c65-djfdr\" (UID: \"7c0e4202-80f0-4e80-bf3b-8d78ec655622\") " pod="openstack/dnsmasq-dns-7f896c8c65-djfdr"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.986718 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c0e4202-80f0-4e80-bf3b-8d78ec655622-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-djfdr\" (UID: \"7c0e4202-80f0-4e80-bf3b-8d78ec655622\") " pod="openstack/dnsmasq-dns-7f896c8c65-djfdr"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.987197 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fa74868-5691-4d60-8d10-3e8dc1ddc776-config\") pod \"ovn-controller-metrics-q7kqm\" (UID: \"9fa74868-5691-4d60-8d10-3e8dc1ddc776\") " pod="openstack/ovn-controller-metrics-q7kqm"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.987260 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c0e4202-80f0-4e80-bf3b-8d78ec655622-config\") pod \"dnsmasq-dns-7f896c8c65-djfdr\" (UID: \"7c0e4202-80f0-4e80-bf3b-8d78ec655622\") " pod="openstack/dnsmasq-dns-7f896c8c65-djfdr"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.987422 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9fa74868-5691-4d60-8d10-3e8dc1ddc776-ovn-rundir\") pod \"ovn-controller-metrics-q7kqm\" (UID: \"9fa74868-5691-4d60-8d10-3e8dc1ddc776\") " pod="openstack/ovn-controller-metrics-q7kqm"
Mar 19 16:58:59 crc kubenswrapper[4918]: I0319 16:58:59.987956 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c0e4202-80f0-4e80-bf3b-8d78ec655622-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-djfdr\" (UID: \"7c0e4202-80f0-4e80-bf3b-8d78ec655622\") " pod="openstack/dnsmasq-dns-7f896c8c65-djfdr"
Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.004665 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fa74868-5691-4d60-8d10-3e8dc1ddc776-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-q7kqm\" (UID: \"9fa74868-5691-4d60-8d10-3e8dc1ddc776\") " pod="openstack/ovn-controller-metrics-q7kqm"
Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.007887 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrrgt\" (UniqueName: \"kubernetes.io/projected/7c0e4202-80f0-4e80-bf3b-8d78ec655622-kube-api-access-rrrgt\") pod \"dnsmasq-dns-7f896c8c65-djfdr\" (UID: \"7c0e4202-80f0-4e80-bf3b-8d78ec655622\") " pod="openstack/dnsmasq-dns-7f896c8c65-djfdr"
Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.009181 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rbdw\" (UniqueName:
\"kubernetes.io/projected/9fa74868-5691-4d60-8d10-3e8dc1ddc776-kube-api-access-9rbdw\") pod \"ovn-controller-metrics-q7kqm\" (UID: \"9fa74868-5691-4d60-8d10-3e8dc1ddc776\") " pod="openstack/ovn-controller-metrics-q7kqm" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.013174 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fa74868-5691-4d60-8d10-3e8dc1ddc776-combined-ca-bundle\") pod \"ovn-controller-metrics-q7kqm\" (UID: \"9fa74868-5691-4d60-8d10-3e8dc1ddc776\") " pod="openstack/ovn-controller-metrics-q7kqm" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.052906 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-djfdr"] Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.053681 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-djfdr" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.095266 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-5rmtw"] Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.096766 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-5rmtw" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.103265 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.111911 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-5rmtw"] Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.134306 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-q7kqm" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.188926 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b15051ce-1edd-4e37-b3d7-687cbc60f529-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-5rmtw\" (UID: \"b15051ce-1edd-4e37-b3d7-687cbc60f529\") " pod="openstack/dnsmasq-dns-86db49b7ff-5rmtw" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.188994 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b15051ce-1edd-4e37-b3d7-687cbc60f529-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-5rmtw\" (UID: \"b15051ce-1edd-4e37-b3d7-687cbc60f529\") " pod="openstack/dnsmasq-dns-86db49b7ff-5rmtw" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.189341 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smmxt\" (UniqueName: \"kubernetes.io/projected/b15051ce-1edd-4e37-b3d7-687cbc60f529-kube-api-access-smmxt\") pod \"dnsmasq-dns-86db49b7ff-5rmtw\" (UID: \"b15051ce-1edd-4e37-b3d7-687cbc60f529\") " pod="openstack/dnsmasq-dns-86db49b7ff-5rmtw" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.189409 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b15051ce-1edd-4e37-b3d7-687cbc60f529-config\") pod \"dnsmasq-dns-86db49b7ff-5rmtw\" (UID: \"b15051ce-1edd-4e37-b3d7-687cbc60f529\") " pod="openstack/dnsmasq-dns-86db49b7ff-5rmtw" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.189441 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b15051ce-1edd-4e37-b3d7-687cbc60f529-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-5rmtw\" 
(UID: \"b15051ce-1edd-4e37-b3d7-687cbc60f529\") " pod="openstack/dnsmasq-dns-86db49b7ff-5rmtw" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.292156 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smmxt\" (UniqueName: \"kubernetes.io/projected/b15051ce-1edd-4e37-b3d7-687cbc60f529-kube-api-access-smmxt\") pod \"dnsmasq-dns-86db49b7ff-5rmtw\" (UID: \"b15051ce-1edd-4e37-b3d7-687cbc60f529\") " pod="openstack/dnsmasq-dns-86db49b7ff-5rmtw" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.292495 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b15051ce-1edd-4e37-b3d7-687cbc60f529-config\") pod \"dnsmasq-dns-86db49b7ff-5rmtw\" (UID: \"b15051ce-1edd-4e37-b3d7-687cbc60f529\") " pod="openstack/dnsmasq-dns-86db49b7ff-5rmtw" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.292532 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b15051ce-1edd-4e37-b3d7-687cbc60f529-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-5rmtw\" (UID: \"b15051ce-1edd-4e37-b3d7-687cbc60f529\") " pod="openstack/dnsmasq-dns-86db49b7ff-5rmtw" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.292578 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b15051ce-1edd-4e37-b3d7-687cbc60f529-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-5rmtw\" (UID: \"b15051ce-1edd-4e37-b3d7-687cbc60f529\") " pod="openstack/dnsmasq-dns-86db49b7ff-5rmtw" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.292619 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b15051ce-1edd-4e37-b3d7-687cbc60f529-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-5rmtw\" (UID: \"b15051ce-1edd-4e37-b3d7-687cbc60f529\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-5rmtw" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.293341 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b15051ce-1edd-4e37-b3d7-687cbc60f529-config\") pod \"dnsmasq-dns-86db49b7ff-5rmtw\" (UID: \"b15051ce-1edd-4e37-b3d7-687cbc60f529\") " pod="openstack/dnsmasq-dns-86db49b7ff-5rmtw" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.301223 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b15051ce-1edd-4e37-b3d7-687cbc60f529-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-5rmtw\" (UID: \"b15051ce-1edd-4e37-b3d7-687cbc60f529\") " pod="openstack/dnsmasq-dns-86db49b7ff-5rmtw" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.301787 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b15051ce-1edd-4e37-b3d7-687cbc60f529-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-5rmtw\" (UID: \"b15051ce-1edd-4e37-b3d7-687cbc60f529\") " pod="openstack/dnsmasq-dns-86db49b7ff-5rmtw" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.302175 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b15051ce-1edd-4e37-b3d7-687cbc60f529-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-5rmtw\" (UID: \"b15051ce-1edd-4e37-b3d7-687cbc60f529\") " pod="openstack/dnsmasq-dns-86db49b7ff-5rmtw" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.312439 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smmxt\" (UniqueName: \"kubernetes.io/projected/b15051ce-1edd-4e37-b3d7-687cbc60f529-kube-api-access-smmxt\") pod \"dnsmasq-dns-86db49b7ff-5rmtw\" (UID: \"b15051ce-1edd-4e37-b3d7-687cbc60f529\") " pod="openstack/dnsmasq-dns-86db49b7ff-5rmtw" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.336804 
4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.382315 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.496724 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-5rmtw" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.533197 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-q7kqm"] Mar 19 16:59:00 crc kubenswrapper[4918]: W0319 16:59:00.571041 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fa74868_5691_4d60_8d10_3e8dc1ddc776.slice/crio-adae4ba02c923087d4c63b479594e01d47ae4c4389619702d89aec25902f73ec WatchSource:0}: Error finding container adae4ba02c923087d4c63b479594e01d47ae4c4389619702d89aec25902f73ec: Status 404 returned error can't find the container with id adae4ba02c923087d4c63b479594e01d47ae4c4389619702d89aec25902f73ec Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.605858 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.605901 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.605915 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-djfdr"] Mar 19 16:59:00 crc kubenswrapper[4918]: W0319 16:59:00.614802 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c0e4202_80f0_4e80_bf3b_8d78ec655622.slice/crio-b8bda6bd7040abf40b59b57157bb48856b9400605d80ae87babcfe0048a5b579 
WatchSource:0}: Error finding container b8bda6bd7040abf40b59b57157bb48856b9400605d80ae87babcfe0048a5b579: Status 404 returned error can't find the container with id b8bda6bd7040abf40b59b57157bb48856b9400605d80ae87babcfe0048a5b579 Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.664097 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.665932 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.671290 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.671308 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-hx4f7" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.671573 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.672122 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.713763 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.816606 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2b5f782-92ae-4f60-8d61-198e3008e01c-scripts\") pod \"ovn-northd-0\" (UID: \"e2b5f782-92ae-4f60-8d61-198e3008e01c\") " pod="openstack/ovn-northd-0" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.816645 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2b5f782-92ae-4f60-8d61-198e3008e01c-config\") pod 
\"ovn-northd-0\" (UID: \"e2b5f782-92ae-4f60-8d61-198e3008e01c\") " pod="openstack/ovn-northd-0" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.816815 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2b5f782-92ae-4f60-8d61-198e3008e01c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e2b5f782-92ae-4f60-8d61-198e3008e01c\") " pod="openstack/ovn-northd-0" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.817674 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e2b5f782-92ae-4f60-8d61-198e3008e01c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e2b5f782-92ae-4f60-8d61-198e3008e01c\") " pod="openstack/ovn-northd-0" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.817739 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b5f782-92ae-4f60-8d61-198e3008e01c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e2b5f782-92ae-4f60-8d61-198e3008e01c\") " pod="openstack/ovn-northd-0" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.817785 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fb8g\" (UniqueName: \"kubernetes.io/projected/e2b5f782-92ae-4f60-8d61-198e3008e01c-kube-api-access-4fb8g\") pod \"ovn-northd-0\" (UID: \"e2b5f782-92ae-4f60-8d61-198e3008e01c\") " pod="openstack/ovn-northd-0" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.818200 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2b5f782-92ae-4f60-8d61-198e3008e01c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e2b5f782-92ae-4f60-8d61-198e3008e01c\") " 
pod="openstack/ovn-northd-0" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.919770 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2b5f782-92ae-4f60-8d61-198e3008e01c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e2b5f782-92ae-4f60-8d61-198e3008e01c\") " pod="openstack/ovn-northd-0" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.919861 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2b5f782-92ae-4f60-8d61-198e3008e01c-scripts\") pod \"ovn-northd-0\" (UID: \"e2b5f782-92ae-4f60-8d61-198e3008e01c\") " pod="openstack/ovn-northd-0" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.919887 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2b5f782-92ae-4f60-8d61-198e3008e01c-config\") pod \"ovn-northd-0\" (UID: \"e2b5f782-92ae-4f60-8d61-198e3008e01c\") " pod="openstack/ovn-northd-0" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.919925 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2b5f782-92ae-4f60-8d61-198e3008e01c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e2b5f782-92ae-4f60-8d61-198e3008e01c\") " pod="openstack/ovn-northd-0" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.919943 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e2b5f782-92ae-4f60-8d61-198e3008e01c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e2b5f782-92ae-4f60-8d61-198e3008e01c\") " pod="openstack/ovn-northd-0" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.919973 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e2b5f782-92ae-4f60-8d61-198e3008e01c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e2b5f782-92ae-4f60-8d61-198e3008e01c\") " pod="openstack/ovn-northd-0" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.919995 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fb8g\" (UniqueName: \"kubernetes.io/projected/e2b5f782-92ae-4f60-8d61-198e3008e01c-kube-api-access-4fb8g\") pod \"ovn-northd-0\" (UID: \"e2b5f782-92ae-4f60-8d61-198e3008e01c\") " pod="openstack/ovn-northd-0" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.924171 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e2b5f782-92ae-4f60-8d61-198e3008e01c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e2b5f782-92ae-4f60-8d61-198e3008e01c\") " pod="openstack/ovn-northd-0" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.924501 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2b5f782-92ae-4f60-8d61-198e3008e01c-scripts\") pod \"ovn-northd-0\" (UID: \"e2b5f782-92ae-4f60-8d61-198e3008e01c\") " pod="openstack/ovn-northd-0" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.924609 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2b5f782-92ae-4f60-8d61-198e3008e01c-config\") pod \"ovn-northd-0\" (UID: \"e2b5f782-92ae-4f60-8d61-198e3008e01c\") " pod="openstack/ovn-northd-0" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.926370 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2b5f782-92ae-4f60-8d61-198e3008e01c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e2b5f782-92ae-4f60-8d61-198e3008e01c\") " pod="openstack/ovn-northd-0" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.927168 4918 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2b5f782-92ae-4f60-8d61-198e3008e01c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e2b5f782-92ae-4f60-8d61-198e3008e01c\") " pod="openstack/ovn-northd-0" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.927543 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b5f782-92ae-4f60-8d61-198e3008e01c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e2b5f782-92ae-4f60-8d61-198e3008e01c\") " pod="openstack/ovn-northd-0" Mar 19 16:59:00 crc kubenswrapper[4918]: I0319 16:59:00.935258 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fb8g\" (UniqueName: \"kubernetes.io/projected/e2b5f782-92ae-4f60-8d61-198e3008e01c-kube-api-access-4fb8g\") pod \"ovn-northd-0\" (UID: \"e2b5f782-92ae-4f60-8d61-198e3008e01c\") " pod="openstack/ovn-northd-0" Mar 19 16:59:01 crc kubenswrapper[4918]: I0319 16:59:01.033169 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 19 16:59:01 crc kubenswrapper[4918]: I0319 16:59:01.127620 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-5rmtw"] Mar 19 16:59:01 crc kubenswrapper[4918]: I0319 16:59:01.350865 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-q7kqm" event={"ID":"9fa74868-5691-4d60-8d10-3e8dc1ddc776","Type":"ContainerStarted","Data":"4579a21f40a95a52743d3a2597da0f16397718bb6a647aad7405ae4bcf6c6893"} Mar 19 16:59:01 crc kubenswrapper[4918]: I0319 16:59:01.351128 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-q7kqm" event={"ID":"9fa74868-5691-4d60-8d10-3e8dc1ddc776","Type":"ContainerStarted","Data":"adae4ba02c923087d4c63b479594e01d47ae4c4389619702d89aec25902f73ec"} Mar 19 16:59:01 crc kubenswrapper[4918]: I0319 16:59:01.353310 4918 generic.go:334] "Generic (PLEG): container finished" podID="7c0e4202-80f0-4e80-bf3b-8d78ec655622" containerID="efdb4569e0e0e9a2601c7ea67b8fe5c3bfd5756c602e6de48f544daee55ec655" exitCode=0 Mar 19 16:59:01 crc kubenswrapper[4918]: I0319 16:59:01.353368 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-djfdr" event={"ID":"7c0e4202-80f0-4e80-bf3b-8d78ec655622","Type":"ContainerDied","Data":"efdb4569e0e0e9a2601c7ea67b8fe5c3bfd5756c602e6de48f544daee55ec655"} Mar 19 16:59:01 crc kubenswrapper[4918]: I0319 16:59:01.353387 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-djfdr" event={"ID":"7c0e4202-80f0-4e80-bf3b-8d78ec655622","Type":"ContainerStarted","Data":"b8bda6bd7040abf40b59b57157bb48856b9400605d80ae87babcfe0048a5b579"} Mar 19 16:59:01 crc kubenswrapper[4918]: I0319 16:59:01.355925 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-5rmtw" 
event={"ID":"b15051ce-1edd-4e37-b3d7-687cbc60f529","Type":"ContainerStarted","Data":"2acb394ea2331256a314b2df63a0fa219907e2b4f454feb459af6afe155c7b5e"} Mar 19 16:59:01 crc kubenswrapper[4918]: I0319 16:59:01.385618 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-q7kqm" podStartSLOduration=2.385598582 podStartE2EDuration="2.385598582s" podCreationTimestamp="2026-03-19 16:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:59:01.377279513 +0000 UTC m=+1153.499478761" watchObservedRunningTime="2026-03-19 16:59:01.385598582 +0000 UTC m=+1153.507797830" Mar 19 16:59:01 crc kubenswrapper[4918]: I0319 16:59:01.708771 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 19 16:59:01 crc kubenswrapper[4918]: W0319 16:59:01.723070 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2b5f782_92ae_4f60_8d61_198e3008e01c.slice/crio-68e92e0194b9ea9a1143a35be56394e1dda7cbd818897329f9312f6024cf42f1 WatchSource:0}: Error finding container 68e92e0194b9ea9a1143a35be56394e1dda7cbd818897329f9312f6024cf42f1: Status 404 returned error can't find the container with id 68e92e0194b9ea9a1143a35be56394e1dda7cbd818897329f9312f6024cf42f1 Mar 19 16:59:01 crc kubenswrapper[4918]: I0319 16:59:01.768249 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-djfdr" Mar 19 16:59:01 crc kubenswrapper[4918]: I0319 16:59:01.947239 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c0e4202-80f0-4e80-bf3b-8d78ec655622-dns-svc\") pod \"7c0e4202-80f0-4e80-bf3b-8d78ec655622\" (UID: \"7c0e4202-80f0-4e80-bf3b-8d78ec655622\") " Mar 19 16:59:01 crc kubenswrapper[4918]: I0319 16:59:01.947539 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c0e4202-80f0-4e80-bf3b-8d78ec655622-config\") pod \"7c0e4202-80f0-4e80-bf3b-8d78ec655622\" (UID: \"7c0e4202-80f0-4e80-bf3b-8d78ec655622\") " Mar 19 16:59:01 crc kubenswrapper[4918]: I0319 16:59:01.947693 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrrgt\" (UniqueName: \"kubernetes.io/projected/7c0e4202-80f0-4e80-bf3b-8d78ec655622-kube-api-access-rrrgt\") pod \"7c0e4202-80f0-4e80-bf3b-8d78ec655622\" (UID: \"7c0e4202-80f0-4e80-bf3b-8d78ec655622\") " Mar 19 16:59:01 crc kubenswrapper[4918]: I0319 16:59:01.947737 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c0e4202-80f0-4e80-bf3b-8d78ec655622-ovsdbserver-sb\") pod \"7c0e4202-80f0-4e80-bf3b-8d78ec655622\" (UID: \"7c0e4202-80f0-4e80-bf3b-8d78ec655622\") " Mar 19 16:59:01 crc kubenswrapper[4918]: I0319 16:59:01.954863 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c0e4202-80f0-4e80-bf3b-8d78ec655622-kube-api-access-rrrgt" (OuterVolumeSpecName: "kube-api-access-rrrgt") pod "7c0e4202-80f0-4e80-bf3b-8d78ec655622" (UID: "7c0e4202-80f0-4e80-bf3b-8d78ec655622"). InnerVolumeSpecName "kube-api-access-rrrgt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:59:01 crc kubenswrapper[4918]: I0319 16:59:01.968829 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c0e4202-80f0-4e80-bf3b-8d78ec655622-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7c0e4202-80f0-4e80-bf3b-8d78ec655622" (UID: "7c0e4202-80f0-4e80-bf3b-8d78ec655622"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:01 crc kubenswrapper[4918]: I0319 16:59:01.969171 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c0e4202-80f0-4e80-bf3b-8d78ec655622-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7c0e4202-80f0-4e80-bf3b-8d78ec655622" (UID: "7c0e4202-80f0-4e80-bf3b-8d78ec655622"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:01 crc kubenswrapper[4918]: I0319 16:59:01.977561 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c0e4202-80f0-4e80-bf3b-8d78ec655622-config" (OuterVolumeSpecName: "config") pod "7c0e4202-80f0-4e80-bf3b-8d78ec655622" (UID: "7c0e4202-80f0-4e80-bf3b-8d78ec655622"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:02 crc kubenswrapper[4918]: I0319 16:59:02.049188 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrrgt\" (UniqueName: \"kubernetes.io/projected/7c0e4202-80f0-4e80-bf3b-8d78ec655622-kube-api-access-rrrgt\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:02 crc kubenswrapper[4918]: I0319 16:59:02.049222 4918 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c0e4202-80f0-4e80-bf3b-8d78ec655622-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:02 crc kubenswrapper[4918]: I0319 16:59:02.049232 4918 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c0e4202-80f0-4e80-bf3b-8d78ec655622-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:02 crc kubenswrapper[4918]: I0319 16:59:02.049244 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c0e4202-80f0-4e80-bf3b-8d78ec655622-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:02 crc kubenswrapper[4918]: I0319 16:59:02.368729 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-djfdr" Mar 19 16:59:02 crc kubenswrapper[4918]: I0319 16:59:02.368730 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-djfdr" event={"ID":"7c0e4202-80f0-4e80-bf3b-8d78ec655622","Type":"ContainerDied","Data":"b8bda6bd7040abf40b59b57157bb48856b9400605d80ae87babcfe0048a5b579"} Mar 19 16:59:02 crc kubenswrapper[4918]: I0319 16:59:02.368869 4918 scope.go:117] "RemoveContainer" containerID="efdb4569e0e0e9a2601c7ea67b8fe5c3bfd5756c602e6de48f544daee55ec655" Mar 19 16:59:02 crc kubenswrapper[4918]: I0319 16:59:02.372986 4918 generic.go:334] "Generic (PLEG): container finished" podID="b15051ce-1edd-4e37-b3d7-687cbc60f529" containerID="ee28d1e35e48aa3e673c9f76639b015f6e349ef2093bcd43ce2532c8de75fca6" exitCode=0 Mar 19 16:59:02 crc kubenswrapper[4918]: I0319 16:59:02.373074 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-5rmtw" event={"ID":"b15051ce-1edd-4e37-b3d7-687cbc60f529","Type":"ContainerDied","Data":"ee28d1e35e48aa3e673c9f76639b015f6e349ef2093bcd43ce2532c8de75fca6"} Mar 19 16:59:02 crc kubenswrapper[4918]: I0319 16:59:02.376344 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e2b5f782-92ae-4f60-8d61-198e3008e01c","Type":"ContainerStarted","Data":"68e92e0194b9ea9a1143a35be56394e1dda7cbd818897329f9312f6024cf42f1"} Mar 19 16:59:02 crc kubenswrapper[4918]: I0319 16:59:02.441260 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-djfdr"] Mar 19 16:59:02 crc kubenswrapper[4918]: I0319 16:59:02.453975 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-djfdr"] Mar 19 16:59:02 crc kubenswrapper[4918]: I0319 16:59:02.598670 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c0e4202-80f0-4e80-bf3b-8d78ec655622" 
path="/var/lib/kubelet/pods/7c0e4202-80f0-4e80-bf3b-8d78ec655622/volumes" Mar 19 16:59:03 crc kubenswrapper[4918]: I0319 16:59:03.059562 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 19 16:59:03 crc kubenswrapper[4918]: I0319 16:59:03.131355 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-5rmtw"] Mar 19 16:59:03 crc kubenswrapper[4918]: I0319 16:59:03.132203 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 19 16:59:03 crc kubenswrapper[4918]: I0319 16:59:03.164057 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-pbmts"] Mar 19 16:59:03 crc kubenswrapper[4918]: E0319 16:59:03.164444 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c0e4202-80f0-4e80-bf3b-8d78ec655622" containerName="init" Mar 19 16:59:03 crc kubenswrapper[4918]: I0319 16:59:03.164455 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c0e4202-80f0-4e80-bf3b-8d78ec655622" containerName="init" Mar 19 16:59:03 crc kubenswrapper[4918]: I0319 16:59:03.164651 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c0e4202-80f0-4e80-bf3b-8d78ec655622" containerName="init" Mar 19 16:59:03 crc kubenswrapper[4918]: I0319 16:59:03.166503 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-pbmts" Mar 19 16:59:03 crc kubenswrapper[4918]: I0319 16:59:03.276821 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-pbmts"] Mar 19 16:59:03 crc kubenswrapper[4918]: I0319 16:59:03.278560 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-config\") pod \"dnsmasq-dns-698758b865-pbmts\" (UID: \"f01a3cda-80e8-4b64-9496-c1ec001e0e9d\") " pod="openstack/dnsmasq-dns-698758b865-pbmts" Mar 19 16:59:03 crc kubenswrapper[4918]: I0319 16:59:03.278742 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-dns-svc\") pod \"dnsmasq-dns-698758b865-pbmts\" (UID: \"f01a3cda-80e8-4b64-9496-c1ec001e0e9d\") " pod="openstack/dnsmasq-dns-698758b865-pbmts" Mar 19 16:59:03 crc kubenswrapper[4918]: I0319 16:59:03.278832 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-pbmts\" (UID: \"f01a3cda-80e8-4b64-9496-c1ec001e0e9d\") " pod="openstack/dnsmasq-dns-698758b865-pbmts" Mar 19 16:59:03 crc kubenswrapper[4918]: I0319 16:59:03.278912 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-pbmts\" (UID: \"f01a3cda-80e8-4b64-9496-c1ec001e0e9d\") " pod="openstack/dnsmasq-dns-698758b865-pbmts" Mar 19 16:59:03 crc kubenswrapper[4918]: I0319 16:59:03.278989 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-jhvmh\" (UniqueName: \"kubernetes.io/projected/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-kube-api-access-jhvmh\") pod \"dnsmasq-dns-698758b865-pbmts\" (UID: \"f01a3cda-80e8-4b64-9496-c1ec001e0e9d\") " pod="openstack/dnsmasq-dns-698758b865-pbmts" Mar 19 16:59:03 crc kubenswrapper[4918]: I0319 16:59:03.322898 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 19 16:59:03 crc kubenswrapper[4918]: I0319 16:59:03.328361 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 19 16:59:03 crc kubenswrapper[4918]: I0319 16:59:03.390602 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-config\") pod \"dnsmasq-dns-698758b865-pbmts\" (UID: \"f01a3cda-80e8-4b64-9496-c1ec001e0e9d\") " pod="openstack/dnsmasq-dns-698758b865-pbmts" Mar 19 16:59:03 crc kubenswrapper[4918]: I0319 16:59:03.390668 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-dns-svc\") pod \"dnsmasq-dns-698758b865-pbmts\" (UID: \"f01a3cda-80e8-4b64-9496-c1ec001e0e9d\") " pod="openstack/dnsmasq-dns-698758b865-pbmts" Mar 19 16:59:03 crc kubenswrapper[4918]: I0319 16:59:03.390707 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-pbmts\" (UID: \"f01a3cda-80e8-4b64-9496-c1ec001e0e9d\") " pod="openstack/dnsmasq-dns-698758b865-pbmts" Mar 19 16:59:03 crc kubenswrapper[4918]: I0319 16:59:03.390755 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-pbmts\" (UID: \"f01a3cda-80e8-4b64-9496-c1ec001e0e9d\") " pod="openstack/dnsmasq-dns-698758b865-pbmts" Mar 19 16:59:03 crc kubenswrapper[4918]: I0319 16:59:03.390780 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhvmh\" (UniqueName: \"kubernetes.io/projected/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-kube-api-access-jhvmh\") pod \"dnsmasq-dns-698758b865-pbmts\" (UID: \"f01a3cda-80e8-4b64-9496-c1ec001e0e9d\") " pod="openstack/dnsmasq-dns-698758b865-pbmts" Mar 19 16:59:03 crc kubenswrapper[4918]: I0319 16:59:03.392006 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-pbmts\" (UID: \"f01a3cda-80e8-4b64-9496-c1ec001e0e9d\") " pod="openstack/dnsmasq-dns-698758b865-pbmts" Mar 19 16:59:03 crc kubenswrapper[4918]: I0319 16:59:03.392306 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-dns-svc\") pod \"dnsmasq-dns-698758b865-pbmts\" (UID: \"f01a3cda-80e8-4b64-9496-c1ec001e0e9d\") " pod="openstack/dnsmasq-dns-698758b865-pbmts" Mar 19 16:59:03 crc kubenswrapper[4918]: I0319 16:59:03.392562 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-pbmts\" (UID: \"f01a3cda-80e8-4b64-9496-c1ec001e0e9d\") " pod="openstack/dnsmasq-dns-698758b865-pbmts" Mar 19 16:59:03 crc kubenswrapper[4918]: I0319 16:59:03.392846 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-config\") pod 
\"dnsmasq-dns-698758b865-pbmts\" (UID: \"f01a3cda-80e8-4b64-9496-c1ec001e0e9d\") " pod="openstack/dnsmasq-dns-698758b865-pbmts" Mar 19 16:59:03 crc kubenswrapper[4918]: I0319 16:59:03.462262 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhvmh\" (UniqueName: \"kubernetes.io/projected/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-kube-api-access-jhvmh\") pod \"dnsmasq-dns-698758b865-pbmts\" (UID: \"f01a3cda-80e8-4b64-9496-c1ec001e0e9d\") " pod="openstack/dnsmasq-dns-698758b865-pbmts" Mar 19 16:59:03 crc kubenswrapper[4918]: I0319 16:59:03.542969 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-pbmts" Mar 19 16:59:03 crc kubenswrapper[4918]: I0319 16:59:03.659561 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.280379 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.287676 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.290587 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.291392 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.291650 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.291848 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-c4mc9" Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.304740 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.326201 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-pbmts"] Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.407501 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"fd357519-ae6b-45ec-a8e1-dfc0c060be13","Type":"ContainerStarted","Data":"6387f90d9d404595ceff59c9366e930da3dc81823e1138ffdd7cd81dd0ae500b"} Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.412905 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-5rmtw" event={"ID":"b15051ce-1edd-4e37-b3d7-687cbc60f529","Type":"ContainerStarted","Data":"0d39da66b4795f61c3f41599ceea5eeac7131d61fb3744b1bd2f2137c9dd2cf7"} Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.413064 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-5rmtw" podUID="b15051ce-1edd-4e37-b3d7-687cbc60f529" containerName="dnsmasq-dns" 
containerID="cri-o://0d39da66b4795f61c3f41599ceea5eeac7131d61fb3744b1bd2f2137c9dd2cf7" gracePeriod=10 Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.413254 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-5rmtw" Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.425440 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4118384-38ad-465d-a81e-62bf39cc6cec-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"e4118384-38ad-465d-a81e-62bf39cc6cec\") " pod="openstack/swift-storage-0" Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.425503 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/e4118384-38ad-465d-a81e-62bf39cc6cec-lock\") pod \"swift-storage-0\" (UID: \"e4118384-38ad-465d-a81e-62bf39cc6cec\") " pod="openstack/swift-storage-0" Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.425576 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5zlf\" (UniqueName: \"kubernetes.io/projected/e4118384-38ad-465d-a81e-62bf39cc6cec-kube-api-access-x5zlf\") pod \"swift-storage-0\" (UID: \"e4118384-38ad-465d-a81e-62bf39cc6cec\") " pod="openstack/swift-storage-0" Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.425630 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e4118384-38ad-465d-a81e-62bf39cc6cec-etc-swift\") pod \"swift-storage-0\" (UID: \"e4118384-38ad-465d-a81e-62bf39cc6cec\") " pod="openstack/swift-storage-0" Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.425647 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/e4118384-38ad-465d-a81e-62bf39cc6cec-cache\") pod \"swift-storage-0\" (UID: \"e4118384-38ad-465d-a81e-62bf39cc6cec\") " pod="openstack/swift-storage-0" Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.425720 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8067c942-1cee-4355-81be-c83829733d3f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8067c942-1cee-4355-81be-c83829733d3f\") pod \"swift-storage-0\" (UID: \"e4118384-38ad-465d-a81e-62bf39cc6cec\") " pod="openstack/swift-storage-0" Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.434441 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-5rmtw" podStartSLOduration=4.434420451 podStartE2EDuration="4.434420451s" podCreationTimestamp="2026-03-19 16:59:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:59:04.431724406 +0000 UTC m=+1156.553923674" watchObservedRunningTime="2026-03-19 16:59:04.434420451 +0000 UTC m=+1156.556619699" Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.527138 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e4118384-38ad-465d-a81e-62bf39cc6cec-etc-swift\") pod \"swift-storage-0\" (UID: \"e4118384-38ad-465d-a81e-62bf39cc6cec\") " pod="openstack/swift-storage-0" Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.527239 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/e4118384-38ad-465d-a81e-62bf39cc6cec-cache\") pod \"swift-storage-0\" (UID: \"e4118384-38ad-465d-a81e-62bf39cc6cec\") " pod="openstack/swift-storage-0" Mar 19 16:59:04 crc kubenswrapper[4918]: E0319 16:59:04.527743 4918 projected.go:288] Couldn't get configMap 
openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 16:59:04 crc kubenswrapper[4918]: E0319 16:59:04.527777 4918 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.527809 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8067c942-1cee-4355-81be-c83829733d3f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8067c942-1cee-4355-81be-c83829733d3f\") pod \"swift-storage-0\" (UID: \"e4118384-38ad-465d-a81e-62bf39cc6cec\") " pod="openstack/swift-storage-0" Mar 19 16:59:04 crc kubenswrapper[4918]: E0319 16:59:04.527834 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4118384-38ad-465d-a81e-62bf39cc6cec-etc-swift podName:e4118384-38ad-465d-a81e-62bf39cc6cec nodeName:}" failed. No retries permitted until 2026-03-19 16:59:05.027813541 +0000 UTC m=+1157.150012789 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e4118384-38ad-465d-a81e-62bf39cc6cec-etc-swift") pod "swift-storage-0" (UID: "e4118384-38ad-465d-a81e-62bf39cc6cec") : configmap "swift-ring-files" not found Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.527884 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4118384-38ad-465d-a81e-62bf39cc6cec-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"e4118384-38ad-465d-a81e-62bf39cc6cec\") " pod="openstack/swift-storage-0" Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.527909 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/e4118384-38ad-465d-a81e-62bf39cc6cec-lock\") pod \"swift-storage-0\" (UID: \"e4118384-38ad-465d-a81e-62bf39cc6cec\") " pod="openstack/swift-storage-0" Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.527937 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5zlf\" (UniqueName: \"kubernetes.io/projected/e4118384-38ad-465d-a81e-62bf39cc6cec-kube-api-access-x5zlf\") pod \"swift-storage-0\" (UID: \"e4118384-38ad-465d-a81e-62bf39cc6cec\") " pod="openstack/swift-storage-0" Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.528023 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/e4118384-38ad-465d-a81e-62bf39cc6cec-cache\") pod \"swift-storage-0\" (UID: \"e4118384-38ad-465d-a81e-62bf39cc6cec\") " pod="openstack/swift-storage-0" Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.528257 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/e4118384-38ad-465d-a81e-62bf39cc6cec-lock\") pod \"swift-storage-0\" (UID: \"e4118384-38ad-465d-a81e-62bf39cc6cec\") " 
pod="openstack/swift-storage-0" Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.530733 4918 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.530769 4918 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8067c942-1cee-4355-81be-c83829733d3f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8067c942-1cee-4355-81be-c83829733d3f\") pod \"swift-storage-0\" (UID: \"e4118384-38ad-465d-a81e-62bf39cc6cec\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/892a09380b7ef74d59822e1eb3aa05f0ef8fc531afef99aafaba714154cec84e/globalmount\"" pod="openstack/swift-storage-0" Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.532901 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4118384-38ad-465d-a81e-62bf39cc6cec-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"e4118384-38ad-465d-a81e-62bf39cc6cec\") " pod="openstack/swift-storage-0" Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.547386 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5zlf\" (UniqueName: \"kubernetes.io/projected/e4118384-38ad-465d-a81e-62bf39cc6cec-kube-api-access-x5zlf\") pod \"swift-storage-0\" (UID: \"e4118384-38ad-465d-a81e-62bf39cc6cec\") " pod="openstack/swift-storage-0" Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.579438 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8067c942-1cee-4355-81be-c83829733d3f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8067c942-1cee-4355-81be-c83829733d3f\") pod \"swift-storage-0\" (UID: \"e4118384-38ad-465d-a81e-62bf39cc6cec\") " pod="openstack/swift-storage-0" Mar 19 16:59:04 crc kubenswrapper[4918]: I0319 16:59:04.981570 4918 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-5rmtw" Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.041482 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b15051ce-1edd-4e37-b3d7-687cbc60f529-config\") pod \"b15051ce-1edd-4e37-b3d7-687cbc60f529\" (UID: \"b15051ce-1edd-4e37-b3d7-687cbc60f529\") " Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.041818 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b15051ce-1edd-4e37-b3d7-687cbc60f529-ovsdbserver-nb\") pod \"b15051ce-1edd-4e37-b3d7-687cbc60f529\" (UID: \"b15051ce-1edd-4e37-b3d7-687cbc60f529\") " Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.042040 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b15051ce-1edd-4e37-b3d7-687cbc60f529-dns-svc\") pod \"b15051ce-1edd-4e37-b3d7-687cbc60f529\" (UID: \"b15051ce-1edd-4e37-b3d7-687cbc60f529\") " Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.042762 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smmxt\" (UniqueName: \"kubernetes.io/projected/b15051ce-1edd-4e37-b3d7-687cbc60f529-kube-api-access-smmxt\") pod \"b15051ce-1edd-4e37-b3d7-687cbc60f529\" (UID: \"b15051ce-1edd-4e37-b3d7-687cbc60f529\") " Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.043504 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b15051ce-1edd-4e37-b3d7-687cbc60f529-ovsdbserver-sb\") pod \"b15051ce-1edd-4e37-b3d7-687cbc60f529\" (UID: \"b15051ce-1edd-4e37-b3d7-687cbc60f529\") " Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.045156 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e4118384-38ad-465d-a81e-62bf39cc6cec-etc-swift\") pod \"swift-storage-0\" (UID: \"e4118384-38ad-465d-a81e-62bf39cc6cec\") " pod="openstack/swift-storage-0" Mar 19 16:59:05 crc kubenswrapper[4918]: E0319 16:59:05.045409 4918 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 16:59:05 crc kubenswrapper[4918]: E0319 16:59:05.045432 4918 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 16:59:05 crc kubenswrapper[4918]: E0319 16:59:05.045488 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4118384-38ad-465d-a81e-62bf39cc6cec-etc-swift podName:e4118384-38ad-465d-a81e-62bf39cc6cec nodeName:}" failed. No retries permitted until 2026-03-19 16:59:06.045473919 +0000 UTC m=+1158.167673167 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e4118384-38ad-465d-a81e-62bf39cc6cec-etc-swift") pod "swift-storage-0" (UID: "e4118384-38ad-465d-a81e-62bf39cc6cec") : configmap "swift-ring-files" not found Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.046112 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b15051ce-1edd-4e37-b3d7-687cbc60f529-kube-api-access-smmxt" (OuterVolumeSpecName: "kube-api-access-smmxt") pod "b15051ce-1edd-4e37-b3d7-687cbc60f529" (UID: "b15051ce-1edd-4e37-b3d7-687cbc60f529"). InnerVolumeSpecName "kube-api-access-smmxt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.095297 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b15051ce-1edd-4e37-b3d7-687cbc60f529-config" (OuterVolumeSpecName: "config") pod "b15051ce-1edd-4e37-b3d7-687cbc60f529" (UID: "b15051ce-1edd-4e37-b3d7-687cbc60f529"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.104657 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b15051ce-1edd-4e37-b3d7-687cbc60f529-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b15051ce-1edd-4e37-b3d7-687cbc60f529" (UID: "b15051ce-1edd-4e37-b3d7-687cbc60f529"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.110070 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b15051ce-1edd-4e37-b3d7-687cbc60f529-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b15051ce-1edd-4e37-b3d7-687cbc60f529" (UID: "b15051ce-1edd-4e37-b3d7-687cbc60f529"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.113275 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b15051ce-1edd-4e37-b3d7-687cbc60f529-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b15051ce-1edd-4e37-b3d7-687cbc60f529" (UID: "b15051ce-1edd-4e37-b3d7-687cbc60f529"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.158851 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b15051ce-1edd-4e37-b3d7-687cbc60f529-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.158885 4918 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b15051ce-1edd-4e37-b3d7-687cbc60f529-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.158899 4918 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b15051ce-1edd-4e37-b3d7-687cbc60f529-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.158909 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smmxt\" (UniqueName: \"kubernetes.io/projected/b15051ce-1edd-4e37-b3d7-687cbc60f529-kube-api-access-smmxt\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.158920 4918 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b15051ce-1edd-4e37-b3d7-687cbc60f529-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.436846 4918 generic.go:334] "Generic (PLEG): container finished" podID="b15051ce-1edd-4e37-b3d7-687cbc60f529" containerID="0d39da66b4795f61c3f41599ceea5eeac7131d61fb3744b1bd2f2137c9dd2cf7" exitCode=0 Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.436939 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-5rmtw" event={"ID":"b15051ce-1edd-4e37-b3d7-687cbc60f529","Type":"ContainerDied","Data":"0d39da66b4795f61c3f41599ceea5eeac7131d61fb3744b1bd2f2137c9dd2cf7"} Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 
16:59:05.436972 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-5rmtw" event={"ID":"b15051ce-1edd-4e37-b3d7-687cbc60f529","Type":"ContainerDied","Data":"2acb394ea2331256a314b2df63a0fa219907e2b4f454feb459af6afe155c7b5e"} Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.436992 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-5rmtw" Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.436997 4918 scope.go:117] "RemoveContainer" containerID="0d39da66b4795f61c3f41599ceea5eeac7131d61fb3744b1bd2f2137c9dd2cf7" Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.447203 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e2b5f782-92ae-4f60-8d61-198e3008e01c","Type":"ContainerStarted","Data":"ecf28ea289a739911d168e9123e3eaef4a65df540a247aa370ebdeca59029f4d"} Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.447365 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.447398 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e2b5f782-92ae-4f60-8d61-198e3008e01c","Type":"ContainerStarted","Data":"a067a6c7303d39782dd89a6b283828f268b049c8e66837972284424d325840d2"} Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.449623 4918 generic.go:334] "Generic (PLEG): container finished" podID="f01a3cda-80e8-4b64-9496-c1ec001e0e9d" containerID="d6eaf3940247a852f3fd2522e7139b00f08524536da3ad5d82385069d3b1de45" exitCode=0 Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.449684 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-pbmts" event={"ID":"f01a3cda-80e8-4b64-9496-c1ec001e0e9d","Type":"ContainerDied","Data":"d6eaf3940247a852f3fd2522e7139b00f08524536da3ad5d82385069d3b1de45"} Mar 19 16:59:05 crc 
kubenswrapper[4918]: I0319 16:59:05.449721 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-pbmts" event={"ID":"f01a3cda-80e8-4b64-9496-c1ec001e0e9d","Type":"ContainerStarted","Data":"fe1adf8ab10e7380083f67a548df01d3d034d98f84df8f4533f5614d953d6640"} Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.511388 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.519909384 podStartE2EDuration="5.511366663s" podCreationTimestamp="2026-03-19 16:59:00 +0000 UTC" firstStartedPulling="2026-03-19 16:59:01.728239183 +0000 UTC m=+1153.850438431" lastFinishedPulling="2026-03-19 16:59:04.719696462 +0000 UTC m=+1156.841895710" observedRunningTime="2026-03-19 16:59:05.471061954 +0000 UTC m=+1157.593261202" watchObservedRunningTime="2026-03-19 16:59:05.511366663 +0000 UTC m=+1157.633565921" Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.807168 4918 scope.go:117] "RemoveContainer" containerID="ee28d1e35e48aa3e673c9f76639b015f6e349ef2093bcd43ce2532c8de75fca6" Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.835608 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-5rmtw"] Mar 19 16:59:05 crc kubenswrapper[4918]: I0319 16:59:05.843747 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-5rmtw"] Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.018433 4918 scope.go:117] "RemoveContainer" containerID="0d39da66b4795f61c3f41599ceea5eeac7131d61fb3744b1bd2f2137c9dd2cf7" Mar 19 16:59:06 crc kubenswrapper[4918]: E0319 16:59:06.019986 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d39da66b4795f61c3f41599ceea5eeac7131d61fb3744b1bd2f2137c9dd2cf7\": container with ID starting with 0d39da66b4795f61c3f41599ceea5eeac7131d61fb3744b1bd2f2137c9dd2cf7 not found: ID does not exist" 
containerID="0d39da66b4795f61c3f41599ceea5eeac7131d61fb3744b1bd2f2137c9dd2cf7" Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.020042 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d39da66b4795f61c3f41599ceea5eeac7131d61fb3744b1bd2f2137c9dd2cf7"} err="failed to get container status \"0d39da66b4795f61c3f41599ceea5eeac7131d61fb3744b1bd2f2137c9dd2cf7\": rpc error: code = NotFound desc = could not find container \"0d39da66b4795f61c3f41599ceea5eeac7131d61fb3744b1bd2f2137c9dd2cf7\": container with ID starting with 0d39da66b4795f61c3f41599ceea5eeac7131d61fb3744b1bd2f2137c9dd2cf7 not found: ID does not exist" Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.020075 4918 scope.go:117] "RemoveContainer" containerID="ee28d1e35e48aa3e673c9f76639b015f6e349ef2093bcd43ce2532c8de75fca6" Mar 19 16:59:06 crc kubenswrapper[4918]: E0319 16:59:06.020393 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee28d1e35e48aa3e673c9f76639b015f6e349ef2093bcd43ce2532c8de75fca6\": container with ID starting with ee28d1e35e48aa3e673c9f76639b015f6e349ef2093bcd43ce2532c8de75fca6 not found: ID does not exist" containerID="ee28d1e35e48aa3e673c9f76639b015f6e349ef2093bcd43ce2532c8de75fca6" Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.020419 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee28d1e35e48aa3e673c9f76639b015f6e349ef2093bcd43ce2532c8de75fca6"} err="failed to get container status \"ee28d1e35e48aa3e673c9f76639b015f6e349ef2093bcd43ce2532c8de75fca6\": rpc error: code = NotFound desc = could not find container \"ee28d1e35e48aa3e673c9f76639b015f6e349ef2093bcd43ce2532c8de75fca6\": container with ID starting with ee28d1e35e48aa3e673c9f76639b015f6e349ef2093bcd43ce2532c8de75fca6 not found: ID does not exist" Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.073981 4918 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e4118384-38ad-465d-a81e-62bf39cc6cec-etc-swift\") pod \"swift-storage-0\" (UID: \"e4118384-38ad-465d-a81e-62bf39cc6cec\") " pod="openstack/swift-storage-0" Mar 19 16:59:06 crc kubenswrapper[4918]: E0319 16:59:06.074200 4918 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 16:59:06 crc kubenswrapper[4918]: E0319 16:59:06.074238 4918 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 16:59:06 crc kubenswrapper[4918]: E0319 16:59:06.074308 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4118384-38ad-465d-a81e-62bf39cc6cec-etc-swift podName:e4118384-38ad-465d-a81e-62bf39cc6cec nodeName:}" failed. No retries permitted until 2026-03-19 16:59:08.074288617 +0000 UTC m=+1160.196487865 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e4118384-38ad-465d-a81e-62bf39cc6cec-etc-swift") pod "swift-storage-0" (UID: "e4118384-38ad-465d-a81e-62bf39cc6cec") : configmap "swift-ring-files" not found Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.204613 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-mhhrx"] Mar 19 16:59:06 crc kubenswrapper[4918]: E0319 16:59:06.205307 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15051ce-1edd-4e37-b3d7-687cbc60f529" containerName="dnsmasq-dns" Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.205330 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15051ce-1edd-4e37-b3d7-687cbc60f529" containerName="dnsmasq-dns" Mar 19 16:59:06 crc kubenswrapper[4918]: E0319 16:59:06.205351 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15051ce-1edd-4e37-b3d7-687cbc60f529" containerName="init" Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.205360 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15051ce-1edd-4e37-b3d7-687cbc60f529" containerName="init" Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.205620 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="b15051ce-1edd-4e37-b3d7-687cbc60f529" containerName="dnsmasq-dns" Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.206361 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-mhhrx" Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.227693 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b002-account-create-update-cq2hz"] Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.228847 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b002-account-create-update-cq2hz" Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.235973 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-mhhrx"] Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.236874 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.244442 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b002-account-create-update-cq2hz"] Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.277454 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b0d9347-5fad-4ca6-8adc-0850f537067b-operator-scripts\") pod \"glance-db-create-mhhrx\" (UID: \"3b0d9347-5fad-4ca6-8adc-0850f537067b\") " pod="openstack/glance-db-create-mhhrx" Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.277831 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r94r\" (UniqueName: \"kubernetes.io/projected/313b84b1-8c9f-4a76-90dd-923d9ba8f469-kube-api-access-2r94r\") pod \"glance-b002-account-create-update-cq2hz\" (UID: \"313b84b1-8c9f-4a76-90dd-923d9ba8f469\") " pod="openstack/glance-b002-account-create-update-cq2hz" Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.277879 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwrqt\" (UniqueName: \"kubernetes.io/projected/3b0d9347-5fad-4ca6-8adc-0850f537067b-kube-api-access-nwrqt\") pod \"glance-db-create-mhhrx\" (UID: \"3b0d9347-5fad-4ca6-8adc-0850f537067b\") " pod="openstack/glance-db-create-mhhrx" Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.277942 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/313b84b1-8c9f-4a76-90dd-923d9ba8f469-operator-scripts\") pod \"glance-b002-account-create-update-cq2hz\" (UID: \"313b84b1-8c9f-4a76-90dd-923d9ba8f469\") " pod="openstack/glance-b002-account-create-update-cq2hz" Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.379391 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r94r\" (UniqueName: \"kubernetes.io/projected/313b84b1-8c9f-4a76-90dd-923d9ba8f469-kube-api-access-2r94r\") pod \"glance-b002-account-create-update-cq2hz\" (UID: \"313b84b1-8c9f-4a76-90dd-923d9ba8f469\") " pod="openstack/glance-b002-account-create-update-cq2hz" Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.379455 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwrqt\" (UniqueName: \"kubernetes.io/projected/3b0d9347-5fad-4ca6-8adc-0850f537067b-kube-api-access-nwrqt\") pod \"glance-db-create-mhhrx\" (UID: \"3b0d9347-5fad-4ca6-8adc-0850f537067b\") " pod="openstack/glance-db-create-mhhrx" Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.379545 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/313b84b1-8c9f-4a76-90dd-923d9ba8f469-operator-scripts\") pod \"glance-b002-account-create-update-cq2hz\" (UID: \"313b84b1-8c9f-4a76-90dd-923d9ba8f469\") " pod="openstack/glance-b002-account-create-update-cq2hz" Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.379599 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b0d9347-5fad-4ca6-8adc-0850f537067b-operator-scripts\") pod \"glance-db-create-mhhrx\" (UID: \"3b0d9347-5fad-4ca6-8adc-0850f537067b\") " pod="openstack/glance-db-create-mhhrx" Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.380456 4918 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b0d9347-5fad-4ca6-8adc-0850f537067b-operator-scripts\") pod \"glance-db-create-mhhrx\" (UID: \"3b0d9347-5fad-4ca6-8adc-0850f537067b\") " pod="openstack/glance-db-create-mhhrx" Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.380691 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/313b84b1-8c9f-4a76-90dd-923d9ba8f469-operator-scripts\") pod \"glance-b002-account-create-update-cq2hz\" (UID: \"313b84b1-8c9f-4a76-90dd-923d9ba8f469\") " pod="openstack/glance-b002-account-create-update-cq2hz" Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.397701 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwrqt\" (UniqueName: \"kubernetes.io/projected/3b0d9347-5fad-4ca6-8adc-0850f537067b-kube-api-access-nwrqt\") pod \"glance-db-create-mhhrx\" (UID: \"3b0d9347-5fad-4ca6-8adc-0850f537067b\") " pod="openstack/glance-db-create-mhhrx" Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.398268 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r94r\" (UniqueName: \"kubernetes.io/projected/313b84b1-8c9f-4a76-90dd-923d9ba8f469-kube-api-access-2r94r\") pod \"glance-b002-account-create-update-cq2hz\" (UID: \"313b84b1-8c9f-4a76-90dd-923d9ba8f469\") " pod="openstack/glance-b002-account-create-update-cq2hz" Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.461179 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-pbmts" event={"ID":"f01a3cda-80e8-4b64-9496-c1ec001e0e9d","Type":"ContainerStarted","Data":"45c68780a1d7a2b68a09da826e86cebdba1ce0b0d9250d5f3a787e520da606eb"} Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.462227 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-pbmts" Mar 19 16:59:06 crc kubenswrapper[4918]: 
I0319 16:59:06.465794 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"fd357519-ae6b-45ec-a8e1-dfc0c060be13","Type":"ContainerStarted","Data":"1b910b75f12827017c1e228fa3a434912138ce49b3d3af3e92ae4650208b4926"} Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.491320 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-pbmts" podStartSLOduration=3.491300346 podStartE2EDuration="3.491300346s" podCreationTimestamp="2026-03-19 16:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:59:06.483187222 +0000 UTC m=+1158.605386470" watchObservedRunningTime="2026-03-19 16:59:06.491300346 +0000 UTC m=+1158.613499594" Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.522177 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-mhhrx" Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.522407 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=13.484395894 podStartE2EDuration="43.522387141s" podCreationTimestamp="2026-03-19 16:58:23 +0000 UTC" firstStartedPulling="2026-03-19 16:58:33.801906399 +0000 UTC m=+1125.924105647" lastFinishedPulling="2026-03-19 16:59:03.839897646 +0000 UTC m=+1155.962096894" observedRunningTime="2026-03-19 16:59:06.513711893 +0000 UTC m=+1158.635911161" watchObservedRunningTime="2026-03-19 16:59:06.522387141 +0000 UTC m=+1158.644586389" Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.544055 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b002-account-create-update-cq2hz" Mar 19 16:59:06 crc kubenswrapper[4918]: I0319 16:59:06.599203 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b15051ce-1edd-4e37-b3d7-687cbc60f529" path="/var/lib/kubelet/pods/b15051ce-1edd-4e37-b3d7-687cbc60f529/volumes" Mar 19 16:59:07 crc kubenswrapper[4918]: I0319 16:59:07.477206 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Mar 19 16:59:07 crc kubenswrapper[4918]: I0319 16:59:07.479993 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Mar 19 16:59:07 crc kubenswrapper[4918]: I0319 16:59:07.840856 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-z6g8d"] Mar 19 16:59:07 crc kubenswrapper[4918]: I0319 16:59:07.842194 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-z6g8d" Mar 19 16:59:07 crc kubenswrapper[4918]: I0319 16:59:07.843914 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 19 16:59:07 crc kubenswrapper[4918]: I0319 16:59:07.850954 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-z6g8d"] Mar 19 16:59:07 crc kubenswrapper[4918]: I0319 16:59:07.923867 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrmgf\" (UniqueName: \"kubernetes.io/projected/389ba586-4f0f-4be2-b731-903fc1cfe234-kube-api-access-hrmgf\") pod \"root-account-create-update-z6g8d\" (UID: \"389ba586-4f0f-4be2-b731-903fc1cfe234\") " pod="openstack/root-account-create-update-z6g8d" Mar 19 16:59:07 crc kubenswrapper[4918]: I0319 16:59:07.924039 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/389ba586-4f0f-4be2-b731-903fc1cfe234-operator-scripts\") pod \"root-account-create-update-z6g8d\" (UID: \"389ba586-4f0f-4be2-b731-903fc1cfe234\") " pod="openstack/root-account-create-update-z6g8d" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.025230 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrmgf\" (UniqueName: \"kubernetes.io/projected/389ba586-4f0f-4be2-b731-903fc1cfe234-kube-api-access-hrmgf\") pod \"root-account-create-update-z6g8d\" (UID: \"389ba586-4f0f-4be2-b731-903fc1cfe234\") " pod="openstack/root-account-create-update-z6g8d" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.025640 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/389ba586-4f0f-4be2-b731-903fc1cfe234-operator-scripts\") pod \"root-account-create-update-z6g8d\" (UID: \"389ba586-4f0f-4be2-b731-903fc1cfe234\") " pod="openstack/root-account-create-update-z6g8d" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.026486 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/389ba586-4f0f-4be2-b731-903fc1cfe234-operator-scripts\") pod \"root-account-create-update-z6g8d\" (UID: \"389ba586-4f0f-4be2-b731-903fc1cfe234\") " pod="openstack/root-account-create-update-z6g8d" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.047878 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrmgf\" (UniqueName: \"kubernetes.io/projected/389ba586-4f0f-4be2-b731-903fc1cfe234-kube-api-access-hrmgf\") pod \"root-account-create-update-z6g8d\" (UID: \"389ba586-4f0f-4be2-b731-903fc1cfe234\") " pod="openstack/root-account-create-update-z6g8d" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.126974 4918 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e4118384-38ad-465d-a81e-62bf39cc6cec-etc-swift\") pod \"swift-storage-0\" (UID: \"e4118384-38ad-465d-a81e-62bf39cc6cec\") " pod="openstack/swift-storage-0" Mar 19 16:59:08 crc kubenswrapper[4918]: E0319 16:59:08.127254 4918 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 16:59:08 crc kubenswrapper[4918]: E0319 16:59:08.127268 4918 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 16:59:08 crc kubenswrapper[4918]: E0319 16:59:08.127310 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4118384-38ad-465d-a81e-62bf39cc6cec-etc-swift podName:e4118384-38ad-465d-a81e-62bf39cc6cec nodeName:}" failed. No retries permitted until 2026-03-19 16:59:12.127297406 +0000 UTC m=+1164.249496654 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e4118384-38ad-465d-a81e-62bf39cc6cec-etc-swift") pod "swift-storage-0" (UID: "e4118384-38ad-465d-a81e-62bf39cc6cec") : configmap "swift-ring-files" not found Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.166656 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-z6g8d" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.236534 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-nmm4p"] Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.237836 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-nmm4p" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.241717 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.241729 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.241879 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.249037 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-nmm4p"] Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.329839 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b21f92d-7895-46c0-a66d-9e0aedb15e72-scripts\") pod \"swift-ring-rebalance-nmm4p\" (UID: \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\") " pod="openstack/swift-ring-rebalance-nmm4p" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.329890 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nrb4\" (UniqueName: \"kubernetes.io/projected/7b21f92d-7895-46c0-a66d-9e0aedb15e72-kube-api-access-6nrb4\") pod \"swift-ring-rebalance-nmm4p\" (UID: \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\") " pod="openstack/swift-ring-rebalance-nmm4p" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.329917 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7b21f92d-7895-46c0-a66d-9e0aedb15e72-dispersionconf\") pod \"swift-ring-rebalance-nmm4p\" (UID: \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\") " pod="openstack/swift-ring-rebalance-nmm4p" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 
16:59:08.329945 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b21f92d-7895-46c0-a66d-9e0aedb15e72-combined-ca-bundle\") pod \"swift-ring-rebalance-nmm4p\" (UID: \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\") " pod="openstack/swift-ring-rebalance-nmm4p" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.329980 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7b21f92d-7895-46c0-a66d-9e0aedb15e72-etc-swift\") pod \"swift-ring-rebalance-nmm4p\" (UID: \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\") " pod="openstack/swift-ring-rebalance-nmm4p" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.330022 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7b21f92d-7895-46c0-a66d-9e0aedb15e72-ring-data-devices\") pod \"swift-ring-rebalance-nmm4p\" (UID: \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\") " pod="openstack/swift-ring-rebalance-nmm4p" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.330136 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7b21f92d-7895-46c0-a66d-9e0aedb15e72-swiftconf\") pod \"swift-ring-rebalance-nmm4p\" (UID: \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\") " pod="openstack/swift-ring-rebalance-nmm4p" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.431917 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7b21f92d-7895-46c0-a66d-9e0aedb15e72-dispersionconf\") pod \"swift-ring-rebalance-nmm4p\" (UID: \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\") " pod="openstack/swift-ring-rebalance-nmm4p" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 
16:59:08.431975 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b21f92d-7895-46c0-a66d-9e0aedb15e72-combined-ca-bundle\") pod \"swift-ring-rebalance-nmm4p\" (UID: \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\") " pod="openstack/swift-ring-rebalance-nmm4p" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.432007 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7b21f92d-7895-46c0-a66d-9e0aedb15e72-etc-swift\") pod \"swift-ring-rebalance-nmm4p\" (UID: \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\") " pod="openstack/swift-ring-rebalance-nmm4p" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.432043 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7b21f92d-7895-46c0-a66d-9e0aedb15e72-ring-data-devices\") pod \"swift-ring-rebalance-nmm4p\" (UID: \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\") " pod="openstack/swift-ring-rebalance-nmm4p" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.432145 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7b21f92d-7895-46c0-a66d-9e0aedb15e72-swiftconf\") pod \"swift-ring-rebalance-nmm4p\" (UID: \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\") " pod="openstack/swift-ring-rebalance-nmm4p" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.432207 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b21f92d-7895-46c0-a66d-9e0aedb15e72-scripts\") pod \"swift-ring-rebalance-nmm4p\" (UID: \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\") " pod="openstack/swift-ring-rebalance-nmm4p" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.432241 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6nrb4\" (UniqueName: \"kubernetes.io/projected/7b21f92d-7895-46c0-a66d-9e0aedb15e72-kube-api-access-6nrb4\") pod \"swift-ring-rebalance-nmm4p\" (UID: \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\") " pod="openstack/swift-ring-rebalance-nmm4p" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.432501 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7b21f92d-7895-46c0-a66d-9e0aedb15e72-etc-swift\") pod \"swift-ring-rebalance-nmm4p\" (UID: \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\") " pod="openstack/swift-ring-rebalance-nmm4p" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.432897 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7b21f92d-7895-46c0-a66d-9e0aedb15e72-ring-data-devices\") pod \"swift-ring-rebalance-nmm4p\" (UID: \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\") " pod="openstack/swift-ring-rebalance-nmm4p" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.432919 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b21f92d-7895-46c0-a66d-9e0aedb15e72-scripts\") pod \"swift-ring-rebalance-nmm4p\" (UID: \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\") " pod="openstack/swift-ring-rebalance-nmm4p" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.436511 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7b21f92d-7895-46c0-a66d-9e0aedb15e72-swiftconf\") pod \"swift-ring-rebalance-nmm4p\" (UID: \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\") " pod="openstack/swift-ring-rebalance-nmm4p" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.437833 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b21f92d-7895-46c0-a66d-9e0aedb15e72-combined-ca-bundle\") pod 
\"swift-ring-rebalance-nmm4p\" (UID: \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\") " pod="openstack/swift-ring-rebalance-nmm4p" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.444976 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7b21f92d-7895-46c0-a66d-9e0aedb15e72-dispersionconf\") pod \"swift-ring-rebalance-nmm4p\" (UID: \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\") " pod="openstack/swift-ring-rebalance-nmm4p" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.451079 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nrb4\" (UniqueName: \"kubernetes.io/projected/7b21f92d-7895-46c0-a66d-9e0aedb15e72-kube-api-access-6nrb4\") pod \"swift-ring-rebalance-nmm4p\" (UID: \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\") " pod="openstack/swift-ring-rebalance-nmm4p" Mar 19 16:59:08 crc kubenswrapper[4918]: I0319 16:59:08.571540 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-nmm4p" Mar 19 16:59:09 crc kubenswrapper[4918]: I0319 16:59:09.787156 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b002-account-create-update-cq2hz"] Mar 19 16:59:09 crc kubenswrapper[4918]: W0319 16:59:09.794284 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod313b84b1_8c9f_4a76_90dd_923d9ba8f469.slice/crio-ea95d3709536136884cf74138bc93e567948fd7bd72c6d51272c20b93558d79b WatchSource:0}: Error finding container ea95d3709536136884cf74138bc93e567948fd7bd72c6d51272c20b93558d79b: Status 404 returned error can't find the container with id ea95d3709536136884cf74138bc93e567948fd7bd72c6d51272c20b93558d79b Mar 19 16:59:09 crc kubenswrapper[4918]: I0319 16:59:09.796493 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-z6g8d"] Mar 19 16:59:09 crc kubenswrapper[4918]: W0319 16:59:09.799270 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod389ba586_4f0f_4be2_b731_903fc1cfe234.slice/crio-696537eb5fb7547c197489342c622316f41ca25f0300d0288c3a4b2946060b32 WatchSource:0}: Error finding container 696537eb5fb7547c197489342c622316f41ca25f0300d0288c3a4b2946060b32: Status 404 returned error can't find the container with id 696537eb5fb7547c197489342c622316f41ca25f0300d0288c3a4b2946060b32 Mar 19 16:59:09 crc kubenswrapper[4918]: I0319 16:59:09.806440 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-nmm4p"] Mar 19 16:59:09 crc kubenswrapper[4918]: I0319 16:59:09.929860 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-mhhrx"] Mar 19 16:59:10 crc kubenswrapper[4918]: I0319 16:59:10.328911 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4" Mar 19 16:59:10 crc kubenswrapper[4918]: I0319 16:59:10.525707 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-querier-668f98fdd7-xzqfn" Mar 19 16:59:10 crc kubenswrapper[4918]: I0319 16:59:10.537958 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nmm4p" event={"ID":"7b21f92d-7895-46c0-a66d-9e0aedb15e72","Type":"ContainerStarted","Data":"5550d4cc16aa1357b72a01439f179649afedbcce867d813b3fcda3d9b71207d2"} Mar 19 16:59:10 crc kubenswrapper[4918]: I0319 16:59:10.539878 4918 generic.go:334] "Generic (PLEG): container finished" podID="3b0d9347-5fad-4ca6-8adc-0850f537067b" containerID="71aa12cca7c8e17912d185902652440b8bc41f140ed7f31fd63116e5f0792eb0" exitCode=0 Mar 19 16:59:10 crc kubenswrapper[4918]: I0319 16:59:10.540046 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mhhrx" event={"ID":"3b0d9347-5fad-4ca6-8adc-0850f537067b","Type":"ContainerDied","Data":"71aa12cca7c8e17912d185902652440b8bc41f140ed7f31fd63116e5f0792eb0"} Mar 19 16:59:10 crc kubenswrapper[4918]: I0319 16:59:10.540161 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mhhrx" event={"ID":"3b0d9347-5fad-4ca6-8adc-0850f537067b","Type":"ContainerStarted","Data":"67c82e0e9780766daa37536b8f48edfcf52470cc398389e50a0f685d8e9b996e"} Mar 19 16:59:10 crc kubenswrapper[4918]: I0319 16:59:10.566202 4918 generic.go:334] "Generic (PLEG): container finished" podID="389ba586-4f0f-4be2-b731-903fc1cfe234" containerID="515624c2881c3bc1eb5ed0ac7af05a88fee84107fbab75f48617d5c7c8a9b307" exitCode=0 Mar 19 16:59:10 crc kubenswrapper[4918]: I0319 16:59:10.566471 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z6g8d" 
event={"ID":"389ba586-4f0f-4be2-b731-903fc1cfe234","Type":"ContainerDied","Data":"515624c2881c3bc1eb5ed0ac7af05a88fee84107fbab75f48617d5c7c8a9b307"} Mar 19 16:59:10 crc kubenswrapper[4918]: I0319 16:59:10.566643 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z6g8d" event={"ID":"389ba586-4f0f-4be2-b731-903fc1cfe234","Type":"ContainerStarted","Data":"696537eb5fb7547c197489342c622316f41ca25f0300d0288c3a4b2946060b32"} Mar 19 16:59:10 crc kubenswrapper[4918]: I0319 16:59:10.576555 4918 generic.go:334] "Generic (PLEG): container finished" podID="313b84b1-8c9f-4a76-90dd-923d9ba8f469" containerID="71defd4c48e052b1a32122ae7d93eb7f3040b1baa7a1668e40c6739d004d6dcc" exitCode=0 Mar 19 16:59:10 crc kubenswrapper[4918]: I0319 16:59:10.576701 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b002-account-create-update-cq2hz" event={"ID":"313b84b1-8c9f-4a76-90dd-923d9ba8f469","Type":"ContainerDied","Data":"71defd4c48e052b1a32122ae7d93eb7f3040b1baa7a1668e40c6739d004d6dcc"} Mar 19 16:59:10 crc kubenswrapper[4918]: I0319 16:59:10.577002 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b002-account-create-update-cq2hz" event={"ID":"313b84b1-8c9f-4a76-90dd-923d9ba8f469","Type":"ContainerStarted","Data":"ea95d3709536136884cf74138bc93e567948fd7bd72c6d51272c20b93558d79b"} Mar 19 16:59:10 crc kubenswrapper[4918]: I0319 16:59:10.644459 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"08c86067-0c7f-47a2-a2d4-e29ad43c539f","Type":"ContainerStarted","Data":"b8e1cfe90949749e2b18f25ef2bbcf1099b0653e409299865e87c877b9f7c64a"} Mar 19 16:59:10 crc kubenswrapper[4918]: I0319 16:59:10.670832 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-query-frontend-6f54889599-ljlbj" Mar 19 16:59:11 crc kubenswrapper[4918]: I0319 16:59:11.499279 4918 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="142e9778-542e-491b-95f2-8a63e76c4271" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 19 16:59:11 crc kubenswrapper[4918]: I0319 16:59:11.670441 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-compactor-0" Mar 19 16:59:11 crc kubenswrapper[4918]: I0319 16:59:11.791720 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-index-gateway-0" Mar 19 16:59:11 crc kubenswrapper[4918]: I0319 16:59:11.876391 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-bp9zk"] Mar 19 16:59:11 crc kubenswrapper[4918]: I0319 16:59:11.880073 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bp9zk" Mar 19 16:59:11 crc kubenswrapper[4918]: I0319 16:59:11.888902 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bp9zk"] Mar 19 16:59:11 crc kubenswrapper[4918]: I0319 16:59:11.981626 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-4694-account-create-update-chqcv"] Mar 19 16:59:11 crc kubenswrapper[4918]: I0319 16:59:11.984053 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4694-account-create-update-chqcv" Mar 19 16:59:11 crc kubenswrapper[4918]: I0319 16:59:11.988628 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 19 16:59:11 crc kubenswrapper[4918]: I0319 16:59:11.990336 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4694-account-create-update-chqcv"] Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:11.999659 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frls2\" (UniqueName: \"kubernetes.io/projected/1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a-kube-api-access-frls2\") pod \"keystone-db-create-bp9zk\" (UID: \"1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a\") " pod="openstack/keystone-db-create-bp9zk" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:11.999731 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a-operator-scripts\") pod \"keystone-db-create-bp9zk\" (UID: \"1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a\") " pod="openstack/keystone-db-create-bp9zk" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.100864 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a-operator-scripts\") pod \"keystone-db-create-bp9zk\" (UID: \"1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a\") " pod="openstack/keystone-db-create-bp9zk" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.101903 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ddd71c6-2966-406d-af9e-122263ed9610-operator-scripts\") pod \"keystone-4694-account-create-update-chqcv\" (UID: 
\"0ddd71c6-2966-406d-af9e-122263ed9610\") " pod="openstack/keystone-4694-account-create-update-chqcv" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.101999 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x726\" (UniqueName: \"kubernetes.io/projected/0ddd71c6-2966-406d-af9e-122263ed9610-kube-api-access-4x726\") pod \"keystone-4694-account-create-update-chqcv\" (UID: \"0ddd71c6-2966-406d-af9e-122263ed9610\") " pod="openstack/keystone-4694-account-create-update-chqcv" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.102092 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frls2\" (UniqueName: \"kubernetes.io/projected/1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a-kube-api-access-frls2\") pod \"keystone-db-create-bp9zk\" (UID: \"1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a\") " pod="openstack/keystone-db-create-bp9zk" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.104080 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a-operator-scripts\") pod \"keystone-db-create-bp9zk\" (UID: \"1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a\") " pod="openstack/keystone-db-create-bp9zk" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.141120 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frls2\" (UniqueName: \"kubernetes.io/projected/1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a-kube-api-access-frls2\") pod \"keystone-db-create-bp9zk\" (UID: \"1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a\") " pod="openstack/keystone-db-create-bp9zk" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.187663 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-jmqtm"] Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.188926 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-jmqtm" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.204122 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x726\" (UniqueName: \"kubernetes.io/projected/0ddd71c6-2966-406d-af9e-122263ed9610-kube-api-access-4x726\") pod \"keystone-4694-account-create-update-chqcv\" (UID: \"0ddd71c6-2966-406d-af9e-122263ed9610\") " pod="openstack/keystone-4694-account-create-update-chqcv" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.204270 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e4118384-38ad-465d-a81e-62bf39cc6cec-etc-swift\") pod \"swift-storage-0\" (UID: \"e4118384-38ad-465d-a81e-62bf39cc6cec\") " pod="openstack/swift-storage-0" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.204434 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ddd71c6-2966-406d-af9e-122263ed9610-operator-scripts\") pod \"keystone-4694-account-create-update-chqcv\" (UID: \"0ddd71c6-2966-406d-af9e-122263ed9610\") " pod="openstack/keystone-4694-account-create-update-chqcv" Mar 19 16:59:12 crc kubenswrapper[4918]: E0319 16:59:12.204941 4918 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 16:59:12 crc kubenswrapper[4918]: E0319 16:59:12.204982 4918 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 16:59:12 crc kubenswrapper[4918]: E0319 16:59:12.205037 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4118384-38ad-465d-a81e-62bf39cc6cec-etc-swift podName:e4118384-38ad-465d-a81e-62bf39cc6cec nodeName:}" failed. 
No retries permitted until 2026-03-19 16:59:20.205017304 +0000 UTC m=+1172.327216582 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e4118384-38ad-465d-a81e-62bf39cc6cec-etc-swift") pod "swift-storage-0" (UID: "e4118384-38ad-465d-a81e-62bf39cc6cec") : configmap "swift-ring-files" not found Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.205650 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ddd71c6-2966-406d-af9e-122263ed9610-operator-scripts\") pod \"keystone-4694-account-create-update-chqcv\" (UID: \"0ddd71c6-2966-406d-af9e-122263ed9610\") " pod="openstack/keystone-4694-account-create-update-chqcv" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.225989 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-jmqtm"] Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.228128 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bp9zk" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.231643 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x726\" (UniqueName: \"kubernetes.io/projected/0ddd71c6-2966-406d-af9e-122263ed9610-kube-api-access-4x726\") pod \"keystone-4694-account-create-update-chqcv\" (UID: \"0ddd71c6-2966-406d-af9e-122263ed9610\") " pod="openstack/keystone-4694-account-create-update-chqcv" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.255226 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-eb42-account-create-update-h2lqq"] Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.257407 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-eb42-account-create-update-h2lqq" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.264863 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.277505 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-eb42-account-create-update-h2lqq"] Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.305623 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e414f276-f48b-4efa-a0d5-c3bccaf6eb54-operator-scripts\") pod \"placement-db-create-jmqtm\" (UID: \"e414f276-f48b-4efa-a0d5-c3bccaf6eb54\") " pod="openstack/placement-db-create-jmqtm" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.305862 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c38a283-f108-443b-a845-d378075a9881-operator-scripts\") pod \"placement-eb42-account-create-update-h2lqq\" (UID: \"9c38a283-f108-443b-a845-d378075a9881\") " pod="openstack/placement-eb42-account-create-update-h2lqq" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.305900 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpxpv\" (UniqueName: \"kubernetes.io/projected/9c38a283-f108-443b-a845-d378075a9881-kube-api-access-tpxpv\") pod \"placement-eb42-account-create-update-h2lqq\" (UID: \"9c38a283-f108-443b-a845-d378075a9881\") " pod="openstack/placement-eb42-account-create-update-h2lqq" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.306025 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv78c\" (UniqueName: 
\"kubernetes.io/projected/e414f276-f48b-4efa-a0d5-c3bccaf6eb54-kube-api-access-cv78c\") pod \"placement-db-create-jmqtm\" (UID: \"e414f276-f48b-4efa-a0d5-c3bccaf6eb54\") " pod="openstack/placement-db-create-jmqtm" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.319196 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4694-account-create-update-chqcv" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.407372 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv78c\" (UniqueName: \"kubernetes.io/projected/e414f276-f48b-4efa-a0d5-c3bccaf6eb54-kube-api-access-cv78c\") pod \"placement-db-create-jmqtm\" (UID: \"e414f276-f48b-4efa-a0d5-c3bccaf6eb54\") " pod="openstack/placement-db-create-jmqtm" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.407438 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e414f276-f48b-4efa-a0d5-c3bccaf6eb54-operator-scripts\") pod \"placement-db-create-jmqtm\" (UID: \"e414f276-f48b-4efa-a0d5-c3bccaf6eb54\") " pod="openstack/placement-db-create-jmqtm" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.407463 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c38a283-f108-443b-a845-d378075a9881-operator-scripts\") pod \"placement-eb42-account-create-update-h2lqq\" (UID: \"9c38a283-f108-443b-a845-d378075a9881\") " pod="openstack/placement-eb42-account-create-update-h2lqq" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.407532 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpxpv\" (UniqueName: \"kubernetes.io/projected/9c38a283-f108-443b-a845-d378075a9881-kube-api-access-tpxpv\") pod \"placement-eb42-account-create-update-h2lqq\" (UID: \"9c38a283-f108-443b-a845-d378075a9881\") " 
pod="openstack/placement-eb42-account-create-update-h2lqq" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.409076 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e414f276-f48b-4efa-a0d5-c3bccaf6eb54-operator-scripts\") pod \"placement-db-create-jmqtm\" (UID: \"e414f276-f48b-4efa-a0d5-c3bccaf6eb54\") " pod="openstack/placement-db-create-jmqtm" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.409647 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c38a283-f108-443b-a845-d378075a9881-operator-scripts\") pod \"placement-eb42-account-create-update-h2lqq\" (UID: \"9c38a283-f108-443b-a845-d378075a9881\") " pod="openstack/placement-eb42-account-create-update-h2lqq" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.422900 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpxpv\" (UniqueName: \"kubernetes.io/projected/9c38a283-f108-443b-a845-d378075a9881-kube-api-access-tpxpv\") pod \"placement-eb42-account-create-update-h2lqq\" (UID: \"9c38a283-f108-443b-a845-d378075a9881\") " pod="openstack/placement-eb42-account-create-update-h2lqq" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.429691 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv78c\" (UniqueName: \"kubernetes.io/projected/e414f276-f48b-4efa-a0d5-c3bccaf6eb54-kube-api-access-cv78c\") pod \"placement-db-create-jmqtm\" (UID: \"e414f276-f48b-4efa-a0d5-c3bccaf6eb54\") " pod="openstack/placement-db-create-jmqtm" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.547305 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jmqtm" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.600188 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-eb42-account-create-update-h2lqq" Mar 19 16:59:12 crc kubenswrapper[4918]: I0319 16:59:12.664553 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"08c86067-0c7f-47a2-a2d4-e29ad43c539f","Type":"ContainerStarted","Data":"687090a707cac81961e45fc05ad69834a9a35c574db6a4f0262bb1914bccea92"} Mar 19 16:59:13 crc kubenswrapper[4918]: I0319 16:59:13.543664 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-pbmts" Mar 19 16:59:13 crc kubenswrapper[4918]: I0319 16:59:13.602990 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bvjg9"] Mar 19 16:59:13 crc kubenswrapper[4918]: I0319 16:59:13.603839 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-bvjg9" podUID="0ee23ae1-641f-43be-a41f-2065671c4534" containerName="dnsmasq-dns" containerID="cri-o://47ef84d4d6bb4ffb616f92bc828e1f72b86e6c18644fe2971ffde73668300bfb" gracePeriod=10 Mar 19 16:59:13 crc kubenswrapper[4918]: I0319 16:59:13.845760 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b002-account-create-update-cq2hz" Mar 19 16:59:13 crc kubenswrapper[4918]: I0319 16:59:13.855167 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-z6g8d" Mar 19 16:59:13 crc kubenswrapper[4918]: I0319 16:59:13.869799 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-mhhrx" Mar 19 16:59:13 crc kubenswrapper[4918]: I0319 16:59:13.975363 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r94r\" (UniqueName: \"kubernetes.io/projected/313b84b1-8c9f-4a76-90dd-923d9ba8f469-kube-api-access-2r94r\") pod \"313b84b1-8c9f-4a76-90dd-923d9ba8f469\" (UID: \"313b84b1-8c9f-4a76-90dd-923d9ba8f469\") " Mar 19 16:59:13 crc kubenswrapper[4918]: I0319 16:59:13.975459 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/389ba586-4f0f-4be2-b731-903fc1cfe234-operator-scripts\") pod \"389ba586-4f0f-4be2-b731-903fc1cfe234\" (UID: \"389ba586-4f0f-4be2-b731-903fc1cfe234\") " Mar 19 16:59:13 crc kubenswrapper[4918]: I0319 16:59:13.975555 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b0d9347-5fad-4ca6-8adc-0850f537067b-operator-scripts\") pod \"3b0d9347-5fad-4ca6-8adc-0850f537067b\" (UID: \"3b0d9347-5fad-4ca6-8adc-0850f537067b\") " Mar 19 16:59:13 crc kubenswrapper[4918]: I0319 16:59:13.975621 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrmgf\" (UniqueName: \"kubernetes.io/projected/389ba586-4f0f-4be2-b731-903fc1cfe234-kube-api-access-hrmgf\") pod \"389ba586-4f0f-4be2-b731-903fc1cfe234\" (UID: \"389ba586-4f0f-4be2-b731-903fc1cfe234\") " Mar 19 16:59:13 crc kubenswrapper[4918]: I0319 16:59:13.975673 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/313b84b1-8c9f-4a76-90dd-923d9ba8f469-operator-scripts\") pod \"313b84b1-8c9f-4a76-90dd-923d9ba8f469\" (UID: \"313b84b1-8c9f-4a76-90dd-923d9ba8f469\") " Mar 19 16:59:13 crc kubenswrapper[4918]: I0319 16:59:13.975736 4918 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-nwrqt\" (UniqueName: \"kubernetes.io/projected/3b0d9347-5fad-4ca6-8adc-0850f537067b-kube-api-access-nwrqt\") pod \"3b0d9347-5fad-4ca6-8adc-0850f537067b\" (UID: \"3b0d9347-5fad-4ca6-8adc-0850f537067b\") " Mar 19 16:59:13 crc kubenswrapper[4918]: I0319 16:59:13.976041 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b0d9347-5fad-4ca6-8adc-0850f537067b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3b0d9347-5fad-4ca6-8adc-0850f537067b" (UID: "3b0d9347-5fad-4ca6-8adc-0850f537067b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:13 crc kubenswrapper[4918]: I0319 16:59:13.976133 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/389ba586-4f0f-4be2-b731-903fc1cfe234-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "389ba586-4f0f-4be2-b731-903fc1cfe234" (UID: "389ba586-4f0f-4be2-b731-903fc1cfe234"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:13 crc kubenswrapper[4918]: I0319 16:59:13.976647 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/313b84b1-8c9f-4a76-90dd-923d9ba8f469-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "313b84b1-8c9f-4a76-90dd-923d9ba8f469" (UID: "313b84b1-8c9f-4a76-90dd-923d9ba8f469"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:13 crc kubenswrapper[4918]: I0319 16:59:13.980733 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b0d9347-5fad-4ca6-8adc-0850f537067b-kube-api-access-nwrqt" (OuterVolumeSpecName: "kube-api-access-nwrqt") pod "3b0d9347-5fad-4ca6-8adc-0850f537067b" (UID: "3b0d9347-5fad-4ca6-8adc-0850f537067b"). 
InnerVolumeSpecName "kube-api-access-nwrqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:59:13 crc kubenswrapper[4918]: I0319 16:59:13.991275 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/389ba586-4f0f-4be2-b731-903fc1cfe234-kube-api-access-hrmgf" (OuterVolumeSpecName: "kube-api-access-hrmgf") pod "389ba586-4f0f-4be2-b731-903fc1cfe234" (UID: "389ba586-4f0f-4be2-b731-903fc1cfe234"). InnerVolumeSpecName "kube-api-access-hrmgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:59:13 crc kubenswrapper[4918]: I0319 16:59:13.994313 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/313b84b1-8c9f-4a76-90dd-923d9ba8f469-kube-api-access-2r94r" (OuterVolumeSpecName: "kube-api-access-2r94r") pod "313b84b1-8c9f-4a76-90dd-923d9ba8f469" (UID: "313b84b1-8c9f-4a76-90dd-923d9ba8f469"). InnerVolumeSpecName "kube-api-access-2r94r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:59:14 crc kubenswrapper[4918]: I0319 16:59:14.077910 4918 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/313b84b1-8c9f-4a76-90dd-923d9ba8f469-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:14 crc kubenswrapper[4918]: I0319 16:59:14.077944 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwrqt\" (UniqueName: \"kubernetes.io/projected/3b0d9347-5fad-4ca6-8adc-0850f537067b-kube-api-access-nwrqt\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:14 crc kubenswrapper[4918]: I0319 16:59:14.077959 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2r94r\" (UniqueName: \"kubernetes.io/projected/313b84b1-8c9f-4a76-90dd-923d9ba8f469-kube-api-access-2r94r\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:14 crc kubenswrapper[4918]: I0319 16:59:14.077970 4918 reconciler_common.go:293] "Volume detached for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/389ba586-4f0f-4be2-b731-903fc1cfe234-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:14 crc kubenswrapper[4918]: I0319 16:59:14.077982 4918 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b0d9347-5fad-4ca6-8adc-0850f537067b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:14 crc kubenswrapper[4918]: I0319 16:59:14.077994 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrmgf\" (UniqueName: \"kubernetes.io/projected/389ba586-4f0f-4be2-b731-903fc1cfe234-kube-api-access-hrmgf\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:14 crc kubenswrapper[4918]: I0319 16:59:14.683298 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mhhrx" event={"ID":"3b0d9347-5fad-4ca6-8adc-0850f537067b","Type":"ContainerDied","Data":"67c82e0e9780766daa37536b8f48edfcf52470cc398389e50a0f685d8e9b996e"} Mar 19 16:59:14 crc kubenswrapper[4918]: I0319 16:59:14.683342 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67c82e0e9780766daa37536b8f48edfcf52470cc398389e50a0f685d8e9b996e" Mar 19 16:59:14 crc kubenswrapper[4918]: I0319 16:59:14.683405 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-mhhrx" Mar 19 16:59:14 crc kubenswrapper[4918]: I0319 16:59:14.687291 4918 generic.go:334] "Generic (PLEG): container finished" podID="0ee23ae1-641f-43be-a41f-2065671c4534" containerID="47ef84d4d6bb4ffb616f92bc828e1f72b86e6c18644fe2971ffde73668300bfb" exitCode=0 Mar 19 16:59:14 crc kubenswrapper[4918]: I0319 16:59:14.687357 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bvjg9" event={"ID":"0ee23ae1-641f-43be-a41f-2065671c4534","Type":"ContainerDied","Data":"47ef84d4d6bb4ffb616f92bc828e1f72b86e6c18644fe2971ffde73668300bfb"} Mar 19 16:59:14 crc kubenswrapper[4918]: I0319 16:59:14.691412 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z6g8d" event={"ID":"389ba586-4f0f-4be2-b731-903fc1cfe234","Type":"ContainerDied","Data":"696537eb5fb7547c197489342c622316f41ca25f0300d0288c3a4b2946060b32"} Mar 19 16:59:14 crc kubenswrapper[4918]: I0319 16:59:14.691463 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="696537eb5fb7547c197489342c622316f41ca25f0300d0288c3a4b2946060b32" Mar 19 16:59:14 crc kubenswrapper[4918]: I0319 16:59:14.691419 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-z6g8d" Mar 19 16:59:14 crc kubenswrapper[4918]: I0319 16:59:14.694032 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b002-account-create-update-cq2hz" event={"ID":"313b84b1-8c9f-4a76-90dd-923d9ba8f469","Type":"ContainerDied","Data":"ea95d3709536136884cf74138bc93e567948fd7bd72c6d51272c20b93558d79b"} Mar 19 16:59:14 crc kubenswrapper[4918]: I0319 16:59:14.694067 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea95d3709536136884cf74138bc93e567948fd7bd72c6d51272c20b93558d79b" Mar 19 16:59:14 crc kubenswrapper[4918]: I0319 16:59:14.694068 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b002-account-create-update-cq2hz" Mar 19 16:59:15 crc kubenswrapper[4918]: I0319 16:59:15.719475 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nmm4p" event={"ID":"7b21f92d-7895-46c0-a66d-9e0aedb15e72","Type":"ContainerStarted","Data":"f10d0a2c613c2314ddafab85c30b1a84abeb77cc6374e13db7d6e668e4469c49"} Mar 19 16:59:15 crc kubenswrapper[4918]: I0319 16:59:15.750712 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-nmm4p" podStartSLOduration=2.218258598 podStartE2EDuration="7.750693448s" podCreationTimestamp="2026-03-19 16:59:08 +0000 UTC" firstStartedPulling="2026-03-19 16:59:09.806546866 +0000 UTC m=+1161.928746114" lastFinishedPulling="2026-03-19 16:59:15.338981726 +0000 UTC m=+1167.461180964" observedRunningTime="2026-03-19 16:59:15.740739674 +0000 UTC m=+1167.862938922" watchObservedRunningTime="2026-03-19 16:59:15.750693448 +0000 UTC m=+1167.872892696" Mar 19 16:59:15 crc kubenswrapper[4918]: I0319 16:59:15.880321 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-jmqtm"] Mar 19 16:59:15 crc kubenswrapper[4918]: I0319 16:59:15.896841 4918 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/keystone-4694-account-create-update-chqcv"] Mar 19 16:59:15 crc kubenswrapper[4918]: I0319 16:59:15.953815 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bp9zk"] Mar 19 16:59:15 crc kubenswrapper[4918]: I0319 16:59:15.966526 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-eb42-account-create-update-h2lqq"] Mar 19 16:59:16 crc kubenswrapper[4918]: I0319 16:59:16.431738 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-v6c2z"] Mar 19 16:59:16 crc kubenswrapper[4918]: E0319 16:59:16.432366 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="313b84b1-8c9f-4a76-90dd-923d9ba8f469" containerName="mariadb-account-create-update" Mar 19 16:59:16 crc kubenswrapper[4918]: I0319 16:59:16.432383 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="313b84b1-8c9f-4a76-90dd-923d9ba8f469" containerName="mariadb-account-create-update" Mar 19 16:59:16 crc kubenswrapper[4918]: E0319 16:59:16.432397 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="389ba586-4f0f-4be2-b731-903fc1cfe234" containerName="mariadb-account-create-update" Mar 19 16:59:16 crc kubenswrapper[4918]: I0319 16:59:16.432404 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="389ba586-4f0f-4be2-b731-903fc1cfe234" containerName="mariadb-account-create-update" Mar 19 16:59:16 crc kubenswrapper[4918]: E0319 16:59:16.432421 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b0d9347-5fad-4ca6-8adc-0850f537067b" containerName="mariadb-database-create" Mar 19 16:59:16 crc kubenswrapper[4918]: I0319 16:59:16.432427 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b0d9347-5fad-4ca6-8adc-0850f537067b" containerName="mariadb-database-create" Mar 19 16:59:16 crc kubenswrapper[4918]: I0319 16:59:16.432643 4918 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3b0d9347-5fad-4ca6-8adc-0850f537067b" containerName="mariadb-database-create" Mar 19 16:59:16 crc kubenswrapper[4918]: I0319 16:59:16.432661 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="389ba586-4f0f-4be2-b731-903fc1cfe234" containerName="mariadb-account-create-update" Mar 19 16:59:16 crc kubenswrapper[4918]: I0319 16:59:16.432683 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="313b84b1-8c9f-4a76-90dd-923d9ba8f469" containerName="mariadb-account-create-update" Mar 19 16:59:16 crc kubenswrapper[4918]: I0319 16:59:16.433321 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-v6c2z" Mar 19 16:59:16 crc kubenswrapper[4918]: I0319 16:59:16.435855 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4zksx" Mar 19 16:59:16 crc kubenswrapper[4918]: I0319 16:59:16.437271 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 19 16:59:16 crc kubenswrapper[4918]: I0319 16:59:16.463107 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-v6c2z"] Mar 19 16:59:16 crc kubenswrapper[4918]: I0319 16:59:16.555312 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bc559952-1f04-4a21-8415-c9c613c5b4d4-db-sync-config-data\") pod \"glance-db-sync-v6c2z\" (UID: \"bc559952-1f04-4a21-8415-c9c613c5b4d4\") " pod="openstack/glance-db-sync-v6c2z" Mar 19 16:59:16 crc kubenswrapper[4918]: I0319 16:59:16.555373 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc559952-1f04-4a21-8415-c9c613c5b4d4-combined-ca-bundle\") pod \"glance-db-sync-v6c2z\" (UID: \"bc559952-1f04-4a21-8415-c9c613c5b4d4\") " pod="openstack/glance-db-sync-v6c2z" Mar 19 
16:59:16 crc kubenswrapper[4918]: I0319 16:59:16.555402 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpg2m\" (UniqueName: \"kubernetes.io/projected/bc559952-1f04-4a21-8415-c9c613c5b4d4-kube-api-access-rpg2m\") pod \"glance-db-sync-v6c2z\" (UID: \"bc559952-1f04-4a21-8415-c9c613c5b4d4\") " pod="openstack/glance-db-sync-v6c2z" Mar 19 16:59:16 crc kubenswrapper[4918]: I0319 16:59:16.555432 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc559952-1f04-4a21-8415-c9c613c5b4d4-config-data\") pod \"glance-db-sync-v6c2z\" (UID: \"bc559952-1f04-4a21-8415-c9c613c5b4d4\") " pod="openstack/glance-db-sync-v6c2z" Mar 19 16:59:16 crc kubenswrapper[4918]: I0319 16:59:16.662808 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bc559952-1f04-4a21-8415-c9c613c5b4d4-db-sync-config-data\") pod \"glance-db-sync-v6c2z\" (UID: \"bc559952-1f04-4a21-8415-c9c613c5b4d4\") " pod="openstack/glance-db-sync-v6c2z" Mar 19 16:59:16 crc kubenswrapper[4918]: I0319 16:59:16.662889 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc559952-1f04-4a21-8415-c9c613c5b4d4-combined-ca-bundle\") pod \"glance-db-sync-v6c2z\" (UID: \"bc559952-1f04-4a21-8415-c9c613c5b4d4\") " pod="openstack/glance-db-sync-v6c2z" Mar 19 16:59:16 crc kubenswrapper[4918]: I0319 16:59:16.662936 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpg2m\" (UniqueName: \"kubernetes.io/projected/bc559952-1f04-4a21-8415-c9c613c5b4d4-kube-api-access-rpg2m\") pod \"glance-db-sync-v6c2z\" (UID: \"bc559952-1f04-4a21-8415-c9c613c5b4d4\") " pod="openstack/glance-db-sync-v6c2z" Mar 19 16:59:16 crc kubenswrapper[4918]: I0319 16:59:16.662974 4918 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc559952-1f04-4a21-8415-c9c613c5b4d4-config-data\") pod \"glance-db-sync-v6c2z\" (UID: \"bc559952-1f04-4a21-8415-c9c613c5b4d4\") " pod="openstack/glance-db-sync-v6c2z" Mar 19 16:59:16 crc kubenswrapper[4918]: I0319 16:59:16.668588 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc559952-1f04-4a21-8415-c9c613c5b4d4-combined-ca-bundle\") pod \"glance-db-sync-v6c2z\" (UID: \"bc559952-1f04-4a21-8415-c9c613c5b4d4\") " pod="openstack/glance-db-sync-v6c2z" Mar 19 16:59:16 crc kubenswrapper[4918]: I0319 16:59:16.669033 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bc559952-1f04-4a21-8415-c9c613c5b4d4-db-sync-config-data\") pod \"glance-db-sync-v6c2z\" (UID: \"bc559952-1f04-4a21-8415-c9c613c5b4d4\") " pod="openstack/glance-db-sync-v6c2z" Mar 19 16:59:16 crc kubenswrapper[4918]: I0319 16:59:16.680916 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc559952-1f04-4a21-8415-c9c613c5b4d4-config-data\") pod \"glance-db-sync-v6c2z\" (UID: \"bc559952-1f04-4a21-8415-c9c613c5b4d4\") " pod="openstack/glance-db-sync-v6c2z" Mar 19 16:59:16 crc kubenswrapper[4918]: I0319 16:59:16.681591 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpg2m\" (UniqueName: \"kubernetes.io/projected/bc559952-1f04-4a21-8415-c9c613c5b4d4-kube-api-access-rpg2m\") pod \"glance-db-sync-v6c2z\" (UID: \"bc559952-1f04-4a21-8415-c9c613c5b4d4\") " pod="openstack/glance-db-sync-v6c2z" Mar 19 16:59:16 crc kubenswrapper[4918]: I0319 16:59:16.764781 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-v6c2z" Mar 19 16:59:16 crc kubenswrapper[4918]: W0319 16:59:16.841646 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode414f276_f48b_4efa_a0d5_c3bccaf6eb54.slice/crio-c3b5628e9d5d40e6c861aa6c371c1449bb04ba9068b1a6a6e3fda29bced08301 WatchSource:0}: Error finding container c3b5628e9d5d40e6c861aa6c371c1449bb04ba9068b1a6a6e3fda29bced08301: Status 404 returned error can't find the container with id c3b5628e9d5d40e6c861aa6c371c1449bb04ba9068b1a6a6e3fda29bced08301 Mar 19 16:59:16 crc kubenswrapper[4918]: I0319 16:59:16.926964 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-bvjg9" Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.102888 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ee23ae1-641f-43be-a41f-2065671c4534-dns-svc\") pod \"0ee23ae1-641f-43be-a41f-2065671c4534\" (UID: \"0ee23ae1-641f-43be-a41f-2065671c4534\") " Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.103151 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw72j\" (UniqueName: \"kubernetes.io/projected/0ee23ae1-641f-43be-a41f-2065671c4534-kube-api-access-fw72j\") pod \"0ee23ae1-641f-43be-a41f-2065671c4534\" (UID: \"0ee23ae1-641f-43be-a41f-2065671c4534\") " Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.103172 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ee23ae1-641f-43be-a41f-2065671c4534-config\") pod \"0ee23ae1-641f-43be-a41f-2065671c4534\" (UID: \"0ee23ae1-641f-43be-a41f-2065671c4534\") " Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.118699 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0ee23ae1-641f-43be-a41f-2065671c4534-kube-api-access-fw72j" (OuterVolumeSpecName: "kube-api-access-fw72j") pod "0ee23ae1-641f-43be-a41f-2065671c4534" (UID: "0ee23ae1-641f-43be-a41f-2065671c4534"). InnerVolumeSpecName "kube-api-access-fw72j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.205389 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw72j\" (UniqueName: \"kubernetes.io/projected/0ee23ae1-641f-43be-a41f-2065671c4534-kube-api-access-fw72j\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.386315 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ee23ae1-641f-43be-a41f-2065671c4534-config" (OuterVolumeSpecName: "config") pod "0ee23ae1-641f-43be-a41f-2065671c4534" (UID: "0ee23ae1-641f-43be-a41f-2065671c4534"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.387139 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ee23ae1-641f-43be-a41f-2065671c4534-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0ee23ae1-641f-43be-a41f-2065671c4534" (UID: "0ee23ae1-641f-43be-a41f-2065671c4534"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.409389 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ee23ae1-641f-43be-a41f-2065671c4534-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.409416 4918 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ee23ae1-641f-43be-a41f-2065671c4534-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:17 crc kubenswrapper[4918]: W0319 16:59:17.539098 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc559952_1f04_4a21_8415_c9c613c5b4d4.slice/crio-589062f570b9ecda37eae83a531e5b6ddaa67bb4834946beb92a92f8875f15c8 WatchSource:0}: Error finding container 589062f570b9ecda37eae83a531e5b6ddaa67bb4834946beb92a92f8875f15c8: Status 404 returned error can't find the container with id 589062f570b9ecda37eae83a531e5b6ddaa67bb4834946beb92a92f8875f15c8 Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.543602 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-v6c2z"] Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.750102 4918 generic.go:334] "Generic (PLEG): container finished" podID="9c38a283-f108-443b-a845-d378075a9881" containerID="8a2ea17b5d7a30f6cd1c011eb32615c38a924371449beec108b262bfdbbc43d5" exitCode=0 Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.750177 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-eb42-account-create-update-h2lqq" event={"ID":"9c38a283-f108-443b-a845-d378075a9881","Type":"ContainerDied","Data":"8a2ea17b5d7a30f6cd1c011eb32615c38a924371449beec108b262bfdbbc43d5"} Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.750499 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-eb42-account-create-update-h2lqq" event={"ID":"9c38a283-f108-443b-a845-d378075a9881","Type":"ContainerStarted","Data":"bfb37bc32333c4689dd31d5e0fd0f103eaa3fc66b3ed3c82d12d01967dba19e8"} Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.755388 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"08c86067-0c7f-47a2-a2d4-e29ad43c539f","Type":"ContainerStarted","Data":"4dd4082592708c39235b41bad06f5146e140100b57df405a486b3c7197dca754"} Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.760448 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bvjg9" event={"ID":"0ee23ae1-641f-43be-a41f-2065671c4534","Type":"ContainerDied","Data":"6e0b98fde94bac296fa8100f01da7b9a82357f7efbb85fb6697f1cfc75c95ecd"} Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.760495 4918 scope.go:117] "RemoveContainer" containerID="47ef84d4d6bb4ffb616f92bc828e1f72b86e6c18644fe2971ffde73668300bfb" Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.760688 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-bvjg9" Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.777427 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-v6c2z" event={"ID":"bc559952-1f04-4a21-8415-c9c613c5b4d4","Type":"ContainerStarted","Data":"589062f570b9ecda37eae83a531e5b6ddaa67bb4834946beb92a92f8875f15c8"} Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.780322 4918 generic.go:334] "Generic (PLEG): container finished" podID="0ddd71c6-2966-406d-af9e-122263ed9610" containerID="e3a6db7b1c305254d513e876d3a9924e7b884647e2f7f3abc7963f0a5e9d2709" exitCode=0 Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.780385 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4694-account-create-update-chqcv" event={"ID":"0ddd71c6-2966-406d-af9e-122263ed9610","Type":"ContainerDied","Data":"e3a6db7b1c305254d513e876d3a9924e7b884647e2f7f3abc7963f0a5e9d2709"} Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.780408 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4694-account-create-update-chqcv" event={"ID":"0ddd71c6-2966-406d-af9e-122263ed9610","Type":"ContainerStarted","Data":"a6c5bc64fb5f17c8e3e545680ba174b02f8424316ec38896c902780a9fd5076f"} Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.781856 4918 generic.go:334] "Generic (PLEG): container finished" podID="1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a" containerID="446e68559bf5da44522ac487d278a52433ae0ceb83d617b69fd8206585f93b09" exitCode=0 Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.781910 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bp9zk" event={"ID":"1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a","Type":"ContainerDied","Data":"446e68559bf5da44522ac487d278a52433ae0ceb83d617b69fd8206585f93b09"} Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.781929 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-create-bp9zk" event={"ID":"1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a","Type":"ContainerStarted","Data":"678281e73ad5eb73a87d5f295a74b9524da5d51b7ed3cc7f5c89ee6a4c27bc14"} Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.783821 4918 generic.go:334] "Generic (PLEG): container finished" podID="e414f276-f48b-4efa-a0d5-c3bccaf6eb54" containerID="0f2ac0a926325347309616ffa21ea87676ff2f84cf06236ad8a38659f01962f7" exitCode=0 Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.783877 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jmqtm" event={"ID":"e414f276-f48b-4efa-a0d5-c3bccaf6eb54","Type":"ContainerDied","Data":"0f2ac0a926325347309616ffa21ea87676ff2f84cf06236ad8a38659f01962f7"} Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.783905 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jmqtm" event={"ID":"e414f276-f48b-4efa-a0d5-c3bccaf6eb54","Type":"ContainerStarted","Data":"c3b5628e9d5d40e6c861aa6c371c1449bb04ba9068b1a6a6e3fda29bced08301"} Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.797610 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=11.590973907 podStartE2EDuration="54.797590659s" podCreationTimestamp="2026-03-19 16:58:23 +0000 UTC" firstStartedPulling="2026-03-19 16:58:33.803035989 +0000 UTC m=+1125.925235237" lastFinishedPulling="2026-03-19 16:59:17.009652741 +0000 UTC m=+1169.131851989" observedRunningTime="2026-03-19 16:59:17.789307831 +0000 UTC m=+1169.911507079" watchObservedRunningTime="2026-03-19 16:59:17.797590659 +0000 UTC m=+1169.919789917" Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.829422 4918 scope.go:117] "RemoveContainer" containerID="ef48521b8e1c732fc9a73a10fbde0aee0c7690c02fd3dafc004151470c84103d" Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.871521 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-57d769cc4f-bvjg9"] Mar 19 16:59:17 crc kubenswrapper[4918]: I0319 16:59:17.881146 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bvjg9"] Mar 19 16:59:18 crc kubenswrapper[4918]: I0319 16:59:18.603105 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ee23ae1-641f-43be-a41f-2065671c4534" path="/var/lib/kubelet/pods/0ee23ae1-641f-43be-a41f-2065671c4534/volumes" Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.229701 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jmqtm" Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.269635 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-z6g8d"] Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.294234 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-z6g8d"] Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.350997 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e414f276-f48b-4efa-a0d5-c3bccaf6eb54-operator-scripts\") pod \"e414f276-f48b-4efa-a0d5-c3bccaf6eb54\" (UID: \"e414f276-f48b-4efa-a0d5-c3bccaf6eb54\") " Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.351091 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cv78c\" (UniqueName: \"kubernetes.io/projected/e414f276-f48b-4efa-a0d5-c3bccaf6eb54-kube-api-access-cv78c\") pod \"e414f276-f48b-4efa-a0d5-c3bccaf6eb54\" (UID: \"e414f276-f48b-4efa-a0d5-c3bccaf6eb54\") " Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.352039 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e414f276-f48b-4efa-a0d5-c3bccaf6eb54-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"e414f276-f48b-4efa-a0d5-c3bccaf6eb54" (UID: "e414f276-f48b-4efa-a0d5-c3bccaf6eb54"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.395061 4918 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e414f276-f48b-4efa-a0d5-c3bccaf6eb54-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.409386 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e414f276-f48b-4efa-a0d5-c3bccaf6eb54-kube-api-access-cv78c" (OuterVolumeSpecName: "kube-api-access-cv78c") pod "e414f276-f48b-4efa-a0d5-c3bccaf6eb54" (UID: "e414f276-f48b-4efa-a0d5-c3bccaf6eb54"). InnerVolumeSpecName "kube-api-access-cv78c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.497230 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cv78c\" (UniqueName: \"kubernetes.io/projected/e414f276-f48b-4efa-a0d5-c3bccaf6eb54-kube-api-access-cv78c\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.521645 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-eb42-account-create-update-h2lqq" Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.530147 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4694-account-create-update-chqcv" Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.543983 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-bp9zk" Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.598452 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpxpv\" (UniqueName: \"kubernetes.io/projected/9c38a283-f108-443b-a845-d378075a9881-kube-api-access-tpxpv\") pod \"9c38a283-f108-443b-a845-d378075a9881\" (UID: \"9c38a283-f108-443b-a845-d378075a9881\") " Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.598572 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ddd71c6-2966-406d-af9e-122263ed9610-operator-scripts\") pod \"0ddd71c6-2966-406d-af9e-122263ed9610\" (UID: \"0ddd71c6-2966-406d-af9e-122263ed9610\") " Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.598665 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c38a283-f108-443b-a845-d378075a9881-operator-scripts\") pod \"9c38a283-f108-443b-a845-d378075a9881\" (UID: \"9c38a283-f108-443b-a845-d378075a9881\") " Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.598689 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a-operator-scripts\") pod \"1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a\" (UID: \"1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a\") " Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.598731 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x726\" (UniqueName: \"kubernetes.io/projected/0ddd71c6-2966-406d-af9e-122263ed9610-kube-api-access-4x726\") pod \"0ddd71c6-2966-406d-af9e-122263ed9610\" (UID: \"0ddd71c6-2966-406d-af9e-122263ed9610\") " Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.598759 4918 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-frls2\" (UniqueName: \"kubernetes.io/projected/1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a-kube-api-access-frls2\") pod \"1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a\" (UID: \"1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a\") " Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.599014 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ddd71c6-2966-406d-af9e-122263ed9610-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0ddd71c6-2966-406d-af9e-122263ed9610" (UID: "0ddd71c6-2966-406d-af9e-122263ed9610"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.599209 4918 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ddd71c6-2966-406d-af9e-122263ed9610-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.599925 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c38a283-f108-443b-a845-d378075a9881-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9c38a283-f108-443b-a845-d378075a9881" (UID: "9c38a283-f108-443b-a845-d378075a9881"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.600077 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a" (UID: "1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.602991 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ddd71c6-2966-406d-af9e-122263ed9610-kube-api-access-4x726" (OuterVolumeSpecName: "kube-api-access-4x726") pod "0ddd71c6-2966-406d-af9e-122263ed9610" (UID: "0ddd71c6-2966-406d-af9e-122263ed9610"). InnerVolumeSpecName "kube-api-access-4x726". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.603192 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a-kube-api-access-frls2" (OuterVolumeSpecName: "kube-api-access-frls2") pod "1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a" (UID: "1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a"). InnerVolumeSpecName "kube-api-access-frls2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.603322 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c38a283-f108-443b-a845-d378075a9881-kube-api-access-tpxpv" (OuterVolumeSpecName: "kube-api-access-tpxpv") pod "9c38a283-f108-443b-a845-d378075a9881" (UID: "9c38a283-f108-443b-a845-d378075a9881"). InnerVolumeSpecName "kube-api-access-tpxpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.700044 4918 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c38a283-f108-443b-a845-d378075a9881-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.700070 4918 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.700079 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x726\" (UniqueName: \"kubernetes.io/projected/0ddd71c6-2966-406d-af9e-122263ed9610-kube-api-access-4x726\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.700088 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frls2\" (UniqueName: \"kubernetes.io/projected/1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a-kube-api-access-frls2\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.700097 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpxpv\" (UniqueName: \"kubernetes.io/projected/9c38a283-f108-443b-a845-d378075a9881-kube-api-access-tpxpv\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.842042 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4694-account-create-update-chqcv" event={"ID":"0ddd71c6-2966-406d-af9e-122263ed9610","Type":"ContainerDied","Data":"a6c5bc64fb5f17c8e3e545680ba174b02f8424316ec38896c902780a9fd5076f"} Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.842094 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6c5bc64fb5f17c8e3e545680ba174b02f8424316ec38896c902780a9fd5076f" Mar 19 
16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.842170 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4694-account-create-update-chqcv" Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.846454 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bp9zk" event={"ID":"1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a","Type":"ContainerDied","Data":"678281e73ad5eb73a87d5f295a74b9524da5d51b7ed3cc7f5c89ee6a4c27bc14"} Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.846550 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="678281e73ad5eb73a87d5f295a74b9524da5d51b7ed3cc7f5c89ee6a4c27bc14" Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.846621 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bp9zk" Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.848917 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jmqtm" event={"ID":"e414f276-f48b-4efa-a0d5-c3bccaf6eb54","Type":"ContainerDied","Data":"c3b5628e9d5d40e6c861aa6c371c1449bb04ba9068b1a6a6e3fda29bced08301"} Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.848958 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3b5628e9d5d40e6c861aa6c371c1449bb04ba9068b1a6a6e3fda29bced08301" Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.848977 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-jmqtm" Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.850878 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-eb42-account-create-update-h2lqq" event={"ID":"9c38a283-f108-443b-a845-d378075a9881","Type":"ContainerDied","Data":"bfb37bc32333c4689dd31d5e0fd0f103eaa3fc66b3ed3c82d12d01967dba19e8"} Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.850908 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfb37bc32333c4689dd31d5e0fd0f103eaa3fc66b3ed3c82d12d01967dba19e8" Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.850961 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-eb42-account-create-update-h2lqq" Mar 19 16:59:19 crc kubenswrapper[4918]: I0319 16:59:19.908818 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:20 crc kubenswrapper[4918]: I0319 16:59:20.209821 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e4118384-38ad-465d-a81e-62bf39cc6cec-etc-swift\") pod \"swift-storage-0\" (UID: \"e4118384-38ad-465d-a81e-62bf39cc6cec\") " pod="openstack/swift-storage-0" Mar 19 16:59:20 crc kubenswrapper[4918]: E0319 16:59:20.210025 4918 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 16:59:20 crc kubenswrapper[4918]: E0319 16:59:20.210041 4918 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 16:59:20 crc kubenswrapper[4918]: E0319 16:59:20.210097 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4118384-38ad-465d-a81e-62bf39cc6cec-etc-swift podName:e4118384-38ad-465d-a81e-62bf39cc6cec nodeName:}" failed. 
No retries permitted until 2026-03-19 16:59:36.210079522 +0000 UTC m=+1188.332278770 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e4118384-38ad-465d-a81e-62bf39cc6cec-etc-swift") pod "swift-storage-0" (UID: "e4118384-38ad-465d-a81e-62bf39cc6cec") : configmap "swift-ring-files" not found Mar 19 16:59:20 crc kubenswrapper[4918]: I0319 16:59:20.598662 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="389ba586-4f0f-4be2-b731-903fc1cfe234" path="/var/lib/kubelet/pods/389ba586-4f0f-4be2-b731-903fc1cfe234/volumes" Mar 19 16:59:21 crc kubenswrapper[4918]: I0319 16:59:21.103504 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 19 16:59:21 crc kubenswrapper[4918]: I0319 16:59:21.500427 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="142e9778-542e-491b-95f2-8a63e76c4271" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 19 16:59:21 crc kubenswrapper[4918]: I0319 16:59:21.967323 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-4g569" podUID="ddfeeb53-dd69-430f-9460-fa20627d4d26" containerName="ovn-controller" probeResult="failure" output=< Mar 19 16:59:21 crc kubenswrapper[4918]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 19 16:59:21 crc kubenswrapper[4918]: > Mar 19 16:59:22 crc kubenswrapper[4918]: I0319 16:59:22.877035 4918 generic.go:334] "Generic (PLEG): container finished" podID="849ee593-de3d-4343-8a63-3ca581fbbaaf" containerID="8dcfdc5ec03c9e57cd90b1af5c63e724835e77f75667c64ea36e6d4de0de6025" exitCode=0 Mar 19 16:59:22 crc kubenswrapper[4918]: I0319 16:59:22.877121 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"849ee593-de3d-4343-8a63-3ca581fbbaaf","Type":"ContainerDied","Data":"8dcfdc5ec03c9e57cd90b1af5c63e724835e77f75667c64ea36e6d4de0de6025"} Mar 19 16:59:22 crc kubenswrapper[4918]: I0319 16:59:22.879469 4918 generic.go:334] "Generic (PLEG): container finished" podID="049bc86c-2172-4f37-b7b4-20e546c273e4" containerID="3def5dec798f2b559b2334d375c417119107e526427d467805fc3b41126d7aca" exitCode=0 Mar 19 16:59:22 crc kubenswrapper[4918]: I0319 16:59:22.879513 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"049bc86c-2172-4f37-b7b4-20e546c273e4","Type":"ContainerDied","Data":"3def5dec798f2b559b2334d375c417119107e526427d467805fc3b41126d7aca"} Mar 19 16:59:23 crc kubenswrapper[4918]: I0319 16:59:23.887907 4918 generic.go:334] "Generic (PLEG): container finished" podID="7b21f92d-7895-46c0-a66d-9e0aedb15e72" containerID="f10d0a2c613c2314ddafab85c30b1a84abeb77cc6374e13db7d6e668e4469c49" exitCode=0 Mar 19 16:59:23 crc kubenswrapper[4918]: I0319 16:59:23.888002 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nmm4p" event={"ID":"7b21f92d-7895-46c0-a66d-9e0aedb15e72","Type":"ContainerDied","Data":"f10d0a2c613c2314ddafab85c30b1a84abeb77cc6374e13db7d6e668e4469c49"} Mar 19 16:59:23 crc kubenswrapper[4918]: I0319 16:59:23.891841 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"049bc86c-2172-4f37-b7b4-20e546c273e4","Type":"ContainerStarted","Data":"1590419da13d0aa0b5f985aa7ddf4cf89cfa843d2aa16978885275772527f724"} Mar 19 16:59:23 crc kubenswrapper[4918]: I0319 16:59:23.892050 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 19 16:59:23 crc kubenswrapper[4918]: I0319 16:59:23.901005 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"849ee593-de3d-4343-8a63-3ca581fbbaaf","Type":"ContainerStarted","Data":"8991ef3b76ab8f08cdc76be810108c9a6ee39bdf4e6ad0cb1d96a3c67a5c362a"} Mar 19 16:59:23 crc kubenswrapper[4918]: I0319 16:59:23.901218 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:59:23 crc kubenswrapper[4918]: I0319 16:59:23.959739 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=53.113058629 podStartE2EDuration="1m7.95972399s" podCreationTimestamp="2026-03-19 16:58:16 +0000 UTC" firstStartedPulling="2026-03-19 16:58:33.255765776 +0000 UTC m=+1125.377965014" lastFinishedPulling="2026-03-19 16:58:48.102431127 +0000 UTC m=+1140.224630375" observedRunningTime="2026-03-19 16:59:23.957394166 +0000 UTC m=+1176.079593414" watchObservedRunningTime="2026-03-19 16:59:23.95972399 +0000 UTC m=+1176.081923238" Mar 19 16:59:23 crc kubenswrapper[4918]: I0319 16:59:23.963944 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=52.498641498 podStartE2EDuration="1m7.963925486s" podCreationTimestamp="2026-03-19 16:58:16 +0000 UTC" firstStartedPulling="2026-03-19 16:58:33.063765791 +0000 UTC m=+1125.185965039" lastFinishedPulling="2026-03-19 16:58:48.529049769 +0000 UTC m=+1140.651249027" observedRunningTime="2026-03-19 16:59:23.939792432 +0000 UTC m=+1176.061991680" watchObservedRunningTime="2026-03-19 16:59:23.963925486 +0000 UTC m=+1176.086124734" Mar 19 16:59:24 crc kubenswrapper[4918]: I0319 16:59:24.265827 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-bc7bt"] Mar 19 16:59:24 crc kubenswrapper[4918]: E0319 16:59:24.266621 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e414f276-f48b-4efa-a0d5-c3bccaf6eb54" containerName="mariadb-database-create" Mar 19 16:59:24 crc kubenswrapper[4918]: I0319 
16:59:24.266647 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="e414f276-f48b-4efa-a0d5-c3bccaf6eb54" containerName="mariadb-database-create" Mar 19 16:59:24 crc kubenswrapper[4918]: E0319 16:59:24.266666 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ee23ae1-641f-43be-a41f-2065671c4534" containerName="init" Mar 19 16:59:24 crc kubenswrapper[4918]: I0319 16:59:24.266674 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ee23ae1-641f-43be-a41f-2065671c4534" containerName="init" Mar 19 16:59:24 crc kubenswrapper[4918]: E0319 16:59:24.266687 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a" containerName="mariadb-database-create" Mar 19 16:59:24 crc kubenswrapper[4918]: I0319 16:59:24.266696 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a" containerName="mariadb-database-create" Mar 19 16:59:24 crc kubenswrapper[4918]: E0319 16:59:24.266722 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ee23ae1-641f-43be-a41f-2065671c4534" containerName="dnsmasq-dns" Mar 19 16:59:24 crc kubenswrapper[4918]: I0319 16:59:24.266731 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ee23ae1-641f-43be-a41f-2065671c4534" containerName="dnsmasq-dns" Mar 19 16:59:24 crc kubenswrapper[4918]: E0319 16:59:24.266752 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ddd71c6-2966-406d-af9e-122263ed9610" containerName="mariadb-account-create-update" Mar 19 16:59:24 crc kubenswrapper[4918]: I0319 16:59:24.266761 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ddd71c6-2966-406d-af9e-122263ed9610" containerName="mariadb-account-create-update" Mar 19 16:59:24 crc kubenswrapper[4918]: E0319 16:59:24.266777 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c38a283-f108-443b-a845-d378075a9881" containerName="mariadb-account-create-update" Mar 19 16:59:24 crc 
kubenswrapper[4918]: I0319 16:59:24.266785 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c38a283-f108-443b-a845-d378075a9881" containerName="mariadb-account-create-update" Mar 19 16:59:24 crc kubenswrapper[4918]: I0319 16:59:24.267008 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a" containerName="mariadb-database-create" Mar 19 16:59:24 crc kubenswrapper[4918]: I0319 16:59:24.267025 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c38a283-f108-443b-a845-d378075a9881" containerName="mariadb-account-create-update" Mar 19 16:59:24 crc kubenswrapper[4918]: I0319 16:59:24.267043 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ee23ae1-641f-43be-a41f-2065671c4534" containerName="dnsmasq-dns" Mar 19 16:59:24 crc kubenswrapper[4918]: I0319 16:59:24.267055 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="e414f276-f48b-4efa-a0d5-c3bccaf6eb54" containerName="mariadb-database-create" Mar 19 16:59:24 crc kubenswrapper[4918]: I0319 16:59:24.267067 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ddd71c6-2966-406d-af9e-122263ed9610" containerName="mariadb-account-create-update" Mar 19 16:59:24 crc kubenswrapper[4918]: I0319 16:59:24.267865 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bc7bt" Mar 19 16:59:24 crc kubenswrapper[4918]: I0319 16:59:24.270290 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 19 16:59:24 crc kubenswrapper[4918]: I0319 16:59:24.273583 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bc7bt"] Mar 19 16:59:24 crc kubenswrapper[4918]: I0319 16:59:24.397342 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9377958-db94-4c7d-bc29-37ca8135ba07-operator-scripts\") pod \"root-account-create-update-bc7bt\" (UID: \"f9377958-db94-4c7d-bc29-37ca8135ba07\") " pod="openstack/root-account-create-update-bc7bt" Mar 19 16:59:24 crc kubenswrapper[4918]: I0319 16:59:24.397471 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvfk4\" (UniqueName: \"kubernetes.io/projected/f9377958-db94-4c7d-bc29-37ca8135ba07-kube-api-access-jvfk4\") pod \"root-account-create-update-bc7bt\" (UID: \"f9377958-db94-4c7d-bc29-37ca8135ba07\") " pod="openstack/root-account-create-update-bc7bt" Mar 19 16:59:24 crc kubenswrapper[4918]: I0319 16:59:24.498993 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvfk4\" (UniqueName: \"kubernetes.io/projected/f9377958-db94-4c7d-bc29-37ca8135ba07-kube-api-access-jvfk4\") pod \"root-account-create-update-bc7bt\" (UID: \"f9377958-db94-4c7d-bc29-37ca8135ba07\") " pod="openstack/root-account-create-update-bc7bt" Mar 19 16:59:24 crc kubenswrapper[4918]: I0319 16:59:24.499148 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9377958-db94-4c7d-bc29-37ca8135ba07-operator-scripts\") pod \"root-account-create-update-bc7bt\" (UID: 
\"f9377958-db94-4c7d-bc29-37ca8135ba07\") " pod="openstack/root-account-create-update-bc7bt" Mar 19 16:59:24 crc kubenswrapper[4918]: I0319 16:59:24.499878 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9377958-db94-4c7d-bc29-37ca8135ba07-operator-scripts\") pod \"root-account-create-update-bc7bt\" (UID: \"f9377958-db94-4c7d-bc29-37ca8135ba07\") " pod="openstack/root-account-create-update-bc7bt" Mar 19 16:59:24 crc kubenswrapper[4918]: I0319 16:59:24.518294 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvfk4\" (UniqueName: \"kubernetes.io/projected/f9377958-db94-4c7d-bc29-37ca8135ba07-kube-api-access-jvfk4\") pod \"root-account-create-update-bc7bt\" (UID: \"f9377958-db94-4c7d-bc29-37ca8135ba07\") " pod="openstack/root-account-create-update-bc7bt" Mar 19 16:59:24 crc kubenswrapper[4918]: I0319 16:59:24.586118 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bc7bt" Mar 19 16:59:24 crc kubenswrapper[4918]: I0319 16:59:24.908773 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:24 crc kubenswrapper[4918]: I0319 16:59:24.914606 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:25 crc kubenswrapper[4918]: I0319 16:59:25.934797 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:26 crc kubenswrapper[4918]: I0319 16:59:26.958579 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-4g569" podUID="ddfeeb53-dd69-430f-9460-fa20627d4d26" containerName="ovn-controller" probeResult="failure" output=< Mar 19 16:59:26 crc kubenswrapper[4918]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 19 16:59:26 crc kubenswrapper[4918]: > Mar 19 16:59:27 crc kubenswrapper[4918]: I0319 16:59:27.003767 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-kt2zs" Mar 19 16:59:27 crc kubenswrapper[4918]: I0319 16:59:27.041033 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-kt2zs" Mar 19 16:59:27 crc kubenswrapper[4918]: I0319 16:59:27.247298 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-4g569-config-w4t6w"] Mar 19 16:59:27 crc kubenswrapper[4918]: I0319 16:59:27.248330 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-4g569-config-w4t6w" Mar 19 16:59:27 crc kubenswrapper[4918]: I0319 16:59:27.250610 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 19 16:59:27 crc kubenswrapper[4918]: I0319 16:59:27.261875 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4g569-config-w4t6w"] Mar 19 16:59:27 crc kubenswrapper[4918]: I0319 16:59:27.270796 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3e1300b0-1d38-47c5-a702-676e712859bd-additional-scripts\") pod \"ovn-controller-4g569-config-w4t6w\" (UID: \"3e1300b0-1d38-47c5-a702-676e712859bd\") " pod="openstack/ovn-controller-4g569-config-w4t6w" Mar 19 16:59:27 crc kubenswrapper[4918]: I0319 16:59:27.270854 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3e1300b0-1d38-47c5-a702-676e712859bd-var-run\") pod \"ovn-controller-4g569-config-w4t6w\" (UID: \"3e1300b0-1d38-47c5-a702-676e712859bd\") " pod="openstack/ovn-controller-4g569-config-w4t6w" Mar 19 16:59:27 crc kubenswrapper[4918]: I0319 16:59:27.270899 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e1300b0-1d38-47c5-a702-676e712859bd-scripts\") pod \"ovn-controller-4g569-config-w4t6w\" (UID: \"3e1300b0-1d38-47c5-a702-676e712859bd\") " pod="openstack/ovn-controller-4g569-config-w4t6w" Mar 19 16:59:27 crc kubenswrapper[4918]: I0319 16:59:27.270985 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3e1300b0-1d38-47c5-a702-676e712859bd-var-log-ovn\") pod \"ovn-controller-4g569-config-w4t6w\" (UID: 
\"3e1300b0-1d38-47c5-a702-676e712859bd\") " pod="openstack/ovn-controller-4g569-config-w4t6w" Mar 19 16:59:27 crc kubenswrapper[4918]: I0319 16:59:27.271037 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3e1300b0-1d38-47c5-a702-676e712859bd-var-run-ovn\") pod \"ovn-controller-4g569-config-w4t6w\" (UID: \"3e1300b0-1d38-47c5-a702-676e712859bd\") " pod="openstack/ovn-controller-4g569-config-w4t6w" Mar 19 16:59:27 crc kubenswrapper[4918]: I0319 16:59:27.271066 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl9pq\" (UniqueName: \"kubernetes.io/projected/3e1300b0-1d38-47c5-a702-676e712859bd-kube-api-access-bl9pq\") pod \"ovn-controller-4g569-config-w4t6w\" (UID: \"3e1300b0-1d38-47c5-a702-676e712859bd\") " pod="openstack/ovn-controller-4g569-config-w4t6w" Mar 19 16:59:27 crc kubenswrapper[4918]: I0319 16:59:27.373268 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3e1300b0-1d38-47c5-a702-676e712859bd-additional-scripts\") pod \"ovn-controller-4g569-config-w4t6w\" (UID: \"3e1300b0-1d38-47c5-a702-676e712859bd\") " pod="openstack/ovn-controller-4g569-config-w4t6w" Mar 19 16:59:27 crc kubenswrapper[4918]: I0319 16:59:27.373328 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3e1300b0-1d38-47c5-a702-676e712859bd-var-run\") pod \"ovn-controller-4g569-config-w4t6w\" (UID: \"3e1300b0-1d38-47c5-a702-676e712859bd\") " pod="openstack/ovn-controller-4g569-config-w4t6w" Mar 19 16:59:27 crc kubenswrapper[4918]: I0319 16:59:27.373378 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e1300b0-1d38-47c5-a702-676e712859bd-scripts\") pod 
\"ovn-controller-4g569-config-w4t6w\" (UID: \"3e1300b0-1d38-47c5-a702-676e712859bd\") " pod="openstack/ovn-controller-4g569-config-w4t6w" Mar 19 16:59:27 crc kubenswrapper[4918]: I0319 16:59:27.373471 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3e1300b0-1d38-47c5-a702-676e712859bd-var-log-ovn\") pod \"ovn-controller-4g569-config-w4t6w\" (UID: \"3e1300b0-1d38-47c5-a702-676e712859bd\") " pod="openstack/ovn-controller-4g569-config-w4t6w" Mar 19 16:59:27 crc kubenswrapper[4918]: I0319 16:59:27.373547 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3e1300b0-1d38-47c5-a702-676e712859bd-var-run-ovn\") pod \"ovn-controller-4g569-config-w4t6w\" (UID: \"3e1300b0-1d38-47c5-a702-676e712859bd\") " pod="openstack/ovn-controller-4g569-config-w4t6w" Mar 19 16:59:27 crc kubenswrapper[4918]: I0319 16:59:27.373592 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl9pq\" (UniqueName: \"kubernetes.io/projected/3e1300b0-1d38-47c5-a702-676e712859bd-kube-api-access-bl9pq\") pod \"ovn-controller-4g569-config-w4t6w\" (UID: \"3e1300b0-1d38-47c5-a702-676e712859bd\") " pod="openstack/ovn-controller-4g569-config-w4t6w" Mar 19 16:59:27 crc kubenswrapper[4918]: I0319 16:59:27.373719 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3e1300b0-1d38-47c5-a702-676e712859bd-var-run\") pod \"ovn-controller-4g569-config-w4t6w\" (UID: \"3e1300b0-1d38-47c5-a702-676e712859bd\") " pod="openstack/ovn-controller-4g569-config-w4t6w" Mar 19 16:59:27 crc kubenswrapper[4918]: I0319 16:59:27.373787 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3e1300b0-1d38-47c5-a702-676e712859bd-var-run-ovn\") pod \"ovn-controller-4g569-config-w4t6w\" 
(UID: \"3e1300b0-1d38-47c5-a702-676e712859bd\") " pod="openstack/ovn-controller-4g569-config-w4t6w" Mar 19 16:59:27 crc kubenswrapper[4918]: I0319 16:59:27.373873 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3e1300b0-1d38-47c5-a702-676e712859bd-var-log-ovn\") pod \"ovn-controller-4g569-config-w4t6w\" (UID: \"3e1300b0-1d38-47c5-a702-676e712859bd\") " pod="openstack/ovn-controller-4g569-config-w4t6w" Mar 19 16:59:27 crc kubenswrapper[4918]: I0319 16:59:27.374195 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3e1300b0-1d38-47c5-a702-676e712859bd-additional-scripts\") pod \"ovn-controller-4g569-config-w4t6w\" (UID: \"3e1300b0-1d38-47c5-a702-676e712859bd\") " pod="openstack/ovn-controller-4g569-config-w4t6w" Mar 19 16:59:27 crc kubenswrapper[4918]: I0319 16:59:27.376360 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e1300b0-1d38-47c5-a702-676e712859bd-scripts\") pod \"ovn-controller-4g569-config-w4t6w\" (UID: \"3e1300b0-1d38-47c5-a702-676e712859bd\") " pod="openstack/ovn-controller-4g569-config-w4t6w" Mar 19 16:59:27 crc kubenswrapper[4918]: I0319 16:59:27.391938 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl9pq\" (UniqueName: \"kubernetes.io/projected/3e1300b0-1d38-47c5-a702-676e712859bd-kube-api-access-bl9pq\") pod \"ovn-controller-4g569-config-w4t6w\" (UID: \"3e1300b0-1d38-47c5-a702-676e712859bd\") " pod="openstack/ovn-controller-4g569-config-w4t6w" Mar 19 16:59:27 crc kubenswrapper[4918]: I0319 16:59:27.615842 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-4g569-config-w4t6w" Mar 19 16:59:28 crc kubenswrapper[4918]: I0319 16:59:28.212058 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 16:59:28 crc kubenswrapper[4918]: I0319 16:59:28.212106 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 16:59:29 crc kubenswrapper[4918]: I0319 16:59:29.534177 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 16:59:30 crc kubenswrapper[4918]: I0319 16:59:29.534496 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="08c86067-0c7f-47a2-a2d4-e29ad43c539f" containerName="prometheus" containerID="cri-o://b8e1cfe90949749e2b18f25ef2bbcf1099b0653e409299865e87c877b9f7c64a" gracePeriod=600 Mar 19 16:59:30 crc kubenswrapper[4918]: I0319 16:59:29.534619 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="08c86067-0c7f-47a2-a2d4-e29ad43c539f" containerName="thanos-sidecar" containerID="cri-o://4dd4082592708c39235b41bad06f5146e140100b57df405a486b3c7197dca754" gracePeriod=600 Mar 19 16:59:30 crc kubenswrapper[4918]: I0319 16:59:29.534619 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="08c86067-0c7f-47a2-a2d4-e29ad43c539f" containerName="config-reloader" 
containerID="cri-o://687090a707cac81961e45fc05ad69834a9a35c574db6a4f0262bb1914bccea92" gracePeriod=600 Mar 19 16:59:30 crc kubenswrapper[4918]: I0319 16:59:29.909610 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="08c86067-0c7f-47a2-a2d4-e29ad43c539f" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.117:9090/-/ready\": dial tcp 10.217.0.117:9090: connect: connection refused" Mar 19 16:59:30 crc kubenswrapper[4918]: I0319 16:59:29.974226 4918 generic.go:334] "Generic (PLEG): container finished" podID="08c86067-0c7f-47a2-a2d4-e29ad43c539f" containerID="4dd4082592708c39235b41bad06f5146e140100b57df405a486b3c7197dca754" exitCode=0 Mar 19 16:59:30 crc kubenswrapper[4918]: I0319 16:59:29.974272 4918 generic.go:334] "Generic (PLEG): container finished" podID="08c86067-0c7f-47a2-a2d4-e29ad43c539f" containerID="687090a707cac81961e45fc05ad69834a9a35c574db6a4f0262bb1914bccea92" exitCode=0 Mar 19 16:59:30 crc kubenswrapper[4918]: I0319 16:59:29.974282 4918 generic.go:334] "Generic (PLEG): container finished" podID="08c86067-0c7f-47a2-a2d4-e29ad43c539f" containerID="b8e1cfe90949749e2b18f25ef2bbcf1099b0653e409299865e87c877b9f7c64a" exitCode=0 Mar 19 16:59:30 crc kubenswrapper[4918]: I0319 16:59:29.974276 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"08c86067-0c7f-47a2-a2d4-e29ad43c539f","Type":"ContainerDied","Data":"4dd4082592708c39235b41bad06f5146e140100b57df405a486b3c7197dca754"} Mar 19 16:59:30 crc kubenswrapper[4918]: I0319 16:59:29.974320 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"08c86067-0c7f-47a2-a2d4-e29ad43c539f","Type":"ContainerDied","Data":"687090a707cac81961e45fc05ad69834a9a35c574db6a4f0262bb1914bccea92"} Mar 19 16:59:30 crc kubenswrapper[4918]: I0319 16:59:29.974335 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"08c86067-0c7f-47a2-a2d4-e29ad43c539f","Type":"ContainerDied","Data":"b8e1cfe90949749e2b18f25ef2bbcf1099b0653e409299865e87c877b9f7c64a"} Mar 19 16:59:31 crc kubenswrapper[4918]: I0319 16:59:31.494761 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="142e9778-542e-491b-95f2-8a63e76c4271" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 19 16:59:31 crc kubenswrapper[4918]: I0319 16:59:31.976883 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-4g569" podUID="ddfeeb53-dd69-430f-9460-fa20627d4d26" containerName="ovn-controller" probeResult="failure" output=< Mar 19 16:59:31 crc kubenswrapper[4918]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 19 16:59:31 crc kubenswrapper[4918]: > Mar 19 16:59:31 crc kubenswrapper[4918]: I0319 16:59:31.994066 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nmm4p" event={"ID":"7b21f92d-7895-46c0-a66d-9e0aedb15e72","Type":"ContainerDied","Data":"5550d4cc16aa1357b72a01439f179649afedbcce867d813b3fcda3d9b71207d2"} Mar 19 16:59:31 crc kubenswrapper[4918]: I0319 16:59:31.994118 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5550d4cc16aa1357b72a01439f179649afedbcce867d813b3fcda3d9b71207d2" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.084243 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-nmm4p" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.263653 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7b21f92d-7895-46c0-a66d-9e0aedb15e72-etc-swift\") pod \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\" (UID: \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\") " Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.263698 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b21f92d-7895-46c0-a66d-9e0aedb15e72-combined-ca-bundle\") pod \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\" (UID: \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\") " Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.263727 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7b21f92d-7895-46c0-a66d-9e0aedb15e72-dispersionconf\") pod \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\" (UID: \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\") " Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.263786 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b21f92d-7895-46c0-a66d-9e0aedb15e72-scripts\") pod \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\" (UID: \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\") " Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.263827 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nrb4\" (UniqueName: \"kubernetes.io/projected/7b21f92d-7895-46c0-a66d-9e0aedb15e72-kube-api-access-6nrb4\") pod \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\" (UID: \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\") " Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.263867 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/7b21f92d-7895-46c0-a66d-9e0aedb15e72-ring-data-devices\") pod \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\" (UID: \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\") " Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.263904 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7b21f92d-7895-46c0-a66d-9e0aedb15e72-swiftconf\") pod \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\" (UID: \"7b21f92d-7895-46c0-a66d-9e0aedb15e72\") " Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.266220 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b21f92d-7895-46c0-a66d-9e0aedb15e72-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7b21f92d-7895-46c0-a66d-9e0aedb15e72" (UID: "7b21f92d-7895-46c0-a66d-9e0aedb15e72"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.267067 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b21f92d-7895-46c0-a66d-9e0aedb15e72-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7b21f92d-7895-46c0-a66d-9e0aedb15e72" (UID: "7b21f92d-7895-46c0-a66d-9e0aedb15e72"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.276950 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b21f92d-7895-46c0-a66d-9e0aedb15e72-kube-api-access-6nrb4" (OuterVolumeSpecName: "kube-api-access-6nrb4") pod "7b21f92d-7895-46c0-a66d-9e0aedb15e72" (UID: "7b21f92d-7895-46c0-a66d-9e0aedb15e72"). InnerVolumeSpecName "kube-api-access-6nrb4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.279682 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b21f92d-7895-46c0-a66d-9e0aedb15e72-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7b21f92d-7895-46c0-a66d-9e0aedb15e72" (UID: "7b21f92d-7895-46c0-a66d-9e0aedb15e72"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.298989 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b21f92d-7895-46c0-a66d-9e0aedb15e72-scripts" (OuterVolumeSpecName: "scripts") pod "7b21f92d-7895-46c0-a66d-9e0aedb15e72" (UID: "7b21f92d-7895-46c0-a66d-9e0aedb15e72"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.316148 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b21f92d-7895-46c0-a66d-9e0aedb15e72-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7b21f92d-7895-46c0-a66d-9e0aedb15e72" (UID: "7b21f92d-7895-46c0-a66d-9e0aedb15e72"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.344746 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b21f92d-7895-46c0-a66d-9e0aedb15e72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b21f92d-7895-46c0-a66d-9e0aedb15e72" (UID: "7b21f92d-7895-46c0-a66d-9e0aedb15e72"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.344893 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.365753 4918 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7b21f92d-7895-46c0-a66d-9e0aedb15e72-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.365781 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b21f92d-7895-46c0-a66d-9e0aedb15e72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.365793 4918 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7b21f92d-7895-46c0-a66d-9e0aedb15e72-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.365801 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b21f92d-7895-46c0-a66d-9e0aedb15e72-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.365810 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nrb4\" (UniqueName: \"kubernetes.io/projected/7b21f92d-7895-46c0-a66d-9e0aedb15e72-kube-api-access-6nrb4\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.365819 4918 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7b21f92d-7895-46c0-a66d-9e0aedb15e72-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.365830 4918 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7b21f92d-7895-46c0-a66d-9e0aedb15e72-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.473223 4918 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhmjl\" (UniqueName: \"kubernetes.io/projected/08c86067-0c7f-47a2-a2d4-e29ad43c539f-kube-api-access-hhmjl\") pod \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.474115 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7119ee90-7cbd-4991-aa6c-6a80c4e5a1a3\") pod \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.474146 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/08c86067-0c7f-47a2-a2d4-e29ad43c539f-config-out\") pod \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.474296 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/08c86067-0c7f-47a2-a2d4-e29ad43c539f-prometheus-metric-storage-rulefiles-0\") pod \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.474379 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/08c86067-0c7f-47a2-a2d4-e29ad43c539f-thanos-prometheus-http-client-file\") pod \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.474419 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/08c86067-0c7f-47a2-a2d4-e29ad43c539f-tls-assets\") pod \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.474508 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/08c86067-0c7f-47a2-a2d4-e29ad43c539f-prometheus-metric-storage-rulefiles-2\") pod \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.474565 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/08c86067-0c7f-47a2-a2d4-e29ad43c539f-config\") pod \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.474598 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/08c86067-0c7f-47a2-a2d4-e29ad43c539f-web-config\") pod \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.474635 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/08c86067-0c7f-47a2-a2d4-e29ad43c539f-prometheus-metric-storage-rulefiles-1\") pod \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\" (UID: \"08c86067-0c7f-47a2-a2d4-e29ad43c539f\") " Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.476053 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08c86067-0c7f-47a2-a2d4-e29ad43c539f-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "08c86067-0c7f-47a2-a2d4-e29ad43c539f" 
(UID: "08c86067-0c7f-47a2-a2d4-e29ad43c539f"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.476295 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08c86067-0c7f-47a2-a2d4-e29ad43c539f-kube-api-access-hhmjl" (OuterVolumeSpecName: "kube-api-access-hhmjl") pod "08c86067-0c7f-47a2-a2d4-e29ad43c539f" (UID: "08c86067-0c7f-47a2-a2d4-e29ad43c539f"). InnerVolumeSpecName "kube-api-access-hhmjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.477052 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08c86067-0c7f-47a2-a2d4-e29ad43c539f-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "08c86067-0c7f-47a2-a2d4-e29ad43c539f" (UID: "08c86067-0c7f-47a2-a2d4-e29ad43c539f"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.480141 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c86067-0c7f-47a2-a2d4-e29ad43c539f-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "08c86067-0c7f-47a2-a2d4-e29ad43c539f" (UID: "08c86067-0c7f-47a2-a2d4-e29ad43c539f"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.480175 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c86067-0c7f-47a2-a2d4-e29ad43c539f-config" (OuterVolumeSpecName: "config") pod "08c86067-0c7f-47a2-a2d4-e29ad43c539f" (UID: "08c86067-0c7f-47a2-a2d4-e29ad43c539f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.480220 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c86067-0c7f-47a2-a2d4-e29ad43c539f-config-out" (OuterVolumeSpecName: "config-out") pod "08c86067-0c7f-47a2-a2d4-e29ad43c539f" (UID: "08c86067-0c7f-47a2-a2d4-e29ad43c539f"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.480252 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08c86067-0c7f-47a2-a2d4-e29ad43c539f-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "08c86067-0c7f-47a2-a2d4-e29ad43c539f" (UID: "08c86067-0c7f-47a2-a2d4-e29ad43c539f"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.480399 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08c86067-0c7f-47a2-a2d4-e29ad43c539f-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "08c86067-0c7f-47a2-a2d4-e29ad43c539f" (UID: "08c86067-0c7f-47a2-a2d4-e29ad43c539f"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.495142 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7119ee90-7cbd-4991-aa6c-6a80c4e5a1a3" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "08c86067-0c7f-47a2-a2d4-e29ad43c539f" (UID: "08c86067-0c7f-47a2-a2d4-e29ad43c539f"). InnerVolumeSpecName "pvc-7119ee90-7cbd-4991-aa6c-6a80c4e5a1a3". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.527768 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c86067-0c7f-47a2-a2d4-e29ad43c539f-web-config" (OuterVolumeSpecName: "web-config") pod "08c86067-0c7f-47a2-a2d4-e29ad43c539f" (UID: "08c86067-0c7f-47a2-a2d4-e29ad43c539f"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.565965 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4g569-config-w4t6w"] Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.576346 4918 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/08c86067-0c7f-47a2-a2d4-e29ad43c539f-web-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.576373 4918 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/08c86067-0c7f-47a2-a2d4-e29ad43c539f-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.576389 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhmjl\" (UniqueName: \"kubernetes.io/projected/08c86067-0c7f-47a2-a2d4-e29ad43c539f-kube-api-access-hhmjl\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.576420 4918 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-7119ee90-7cbd-4991-aa6c-6a80c4e5a1a3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7119ee90-7cbd-4991-aa6c-6a80c4e5a1a3\") on node \"crc\" " Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.576433 4918 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/08c86067-0c7f-47a2-a2d4-e29ad43c539f-config-out\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.576444 4918 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/08c86067-0c7f-47a2-a2d4-e29ad43c539f-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.576456 4918 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/08c86067-0c7f-47a2-a2d4-e29ad43c539f-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.576471 4918 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/08c86067-0c7f-47a2-a2d4-e29ad43c539f-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.576482 4918 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/08c86067-0c7f-47a2-a2d4-e29ad43c539f-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.576493 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/08c86067-0c7f-47a2-a2d4-e29ad43c539f-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.611782 4918 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.611956 4918 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-7119ee90-7cbd-4991-aa6c-6a80c4e5a1a3" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7119ee90-7cbd-4991-aa6c-6a80c4e5a1a3") on node "crc" Mar 19 16:59:32 crc kubenswrapper[4918]: I0319 16:59:32.681490 4918 reconciler_common.go:293] "Volume detached for volume \"pvc-7119ee90-7cbd-4991-aa6c-6a80c4e5a1a3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7119ee90-7cbd-4991-aa6c-6a80c4e5a1a3\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.008223 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"08c86067-0c7f-47a2-a2d4-e29ad43c539f","Type":"ContainerDied","Data":"c10d05b33949e842790ec76aaadb87bcce31a152a1e46cf78cbd45604476deaa"} Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.008473 4918 scope.go:117] "RemoveContainer" containerID="4dd4082592708c39235b41bad06f5146e140100b57df405a486b3c7197dca754" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.008613 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.013291 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-v6c2z" event={"ID":"bc559952-1f04-4a21-8415-c9c613c5b4d4","Type":"ContainerStarted","Data":"4a48933e0e27353fa09de1f9da9ded8f0beab69610b686d40595fe0cd849630c"} Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.017388 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-nmm4p" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.017492 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4g569-config-w4t6w" event={"ID":"3e1300b0-1d38-47c5-a702-676e712859bd","Type":"ContainerStarted","Data":"e4a582f7ff62312f706d66c15c740eed5e7d7415357bc31780ce8208585348e8"} Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.017543 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4g569-config-w4t6w" event={"ID":"3e1300b0-1d38-47c5-a702-676e712859bd","Type":"ContainerStarted","Data":"0cb555519a030bfc84fba6ff419ee87e0c538327d3db1938b81f132e0704a7a3"} Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.036070 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-v6c2z" podStartSLOduration=2.599175212 podStartE2EDuration="17.036049903s" podCreationTimestamp="2026-03-19 16:59:16 +0000 UTC" firstStartedPulling="2026-03-19 16:59:17.541451429 +0000 UTC m=+1169.663650677" lastFinishedPulling="2026-03-19 16:59:31.97832612 +0000 UTC m=+1184.100525368" observedRunningTime="2026-03-19 16:59:33.032715722 +0000 UTC m=+1185.154914970" watchObservedRunningTime="2026-03-19 16:59:33.036049903 +0000 UTC m=+1185.158249151" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.037427 4918 scope.go:117] "RemoveContainer" containerID="687090a707cac81961e45fc05ad69834a9a35c574db6a4f0262bb1914bccea92" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.062570 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.068724 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.076575 4918 scope.go:117] "RemoveContainer" 
containerID="b8e1cfe90949749e2b18f25ef2bbcf1099b0653e409299865e87c877b9f7c64a" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.095493 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 16:59:33 crc kubenswrapper[4918]: E0319 16:59:33.095911 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c86067-0c7f-47a2-a2d4-e29ad43c539f" containerName="prometheus" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.095928 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c86067-0c7f-47a2-a2d4-e29ad43c539f" containerName="prometheus" Mar 19 16:59:33 crc kubenswrapper[4918]: E0319 16:59:33.095938 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c86067-0c7f-47a2-a2d4-e29ad43c539f" containerName="init-config-reloader" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.095945 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c86067-0c7f-47a2-a2d4-e29ad43c539f" containerName="init-config-reloader" Mar 19 16:59:33 crc kubenswrapper[4918]: E0319 16:59:33.095955 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c86067-0c7f-47a2-a2d4-e29ad43c539f" containerName="config-reloader" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.095961 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c86067-0c7f-47a2-a2d4-e29ad43c539f" containerName="config-reloader" Mar 19 16:59:33 crc kubenswrapper[4918]: E0319 16:59:33.095975 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c86067-0c7f-47a2-a2d4-e29ad43c539f" containerName="thanos-sidecar" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.095981 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c86067-0c7f-47a2-a2d4-e29ad43c539f" containerName="thanos-sidecar" Mar 19 16:59:33 crc kubenswrapper[4918]: E0319 16:59:33.095990 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b21f92d-7895-46c0-a66d-9e0aedb15e72" 
containerName="swift-ring-rebalance" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.095996 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b21f92d-7895-46c0-a66d-9e0aedb15e72" containerName="swift-ring-rebalance" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.096149 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c86067-0c7f-47a2-a2d4-e29ad43c539f" containerName="config-reloader" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.096159 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c86067-0c7f-47a2-a2d4-e29ad43c539f" containerName="thanos-sidecar" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.096172 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c86067-0c7f-47a2-a2d4-e29ad43c539f" containerName="prometheus" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.096191 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b21f92d-7895-46c0-a66d-9e0aedb15e72" containerName="swift-ring-rebalance" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.097791 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.099860 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-4g569-config-w4t6w" podStartSLOduration=6.099843139 podStartE2EDuration="6.099843139s" podCreationTimestamp="2026-03-19 16:59:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:59:33.078194923 +0000 UTC m=+1185.200394171" watchObservedRunningTime="2026-03-19 16:59:33.099843139 +0000 UTC m=+1185.222042387" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.100489 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-4lmwq" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.100553 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.100654 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.100773 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.100804 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.100886 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.100918 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.110427 
4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.110623 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.115812 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.132539 4918 scope.go:117] "RemoveContainer" containerID="c370355f645a1c2288afd62e6eb7c991dca804e2751677e640a89aed254fedea" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.188511 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bc7bt"] Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.205386 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ce548ee3-59a9-46f9-8b00-06d380b17566-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.205434 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ce548ee3-59a9-46f9-8b00-06d380b17566-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.205466 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ce548ee3-59a9-46f9-8b00-06d380b17566-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: 
\"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.205511 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ce548ee3-59a9-46f9-8b00-06d380b17566-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.205563 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ce548ee3-59a9-46f9-8b00-06d380b17566-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.205636 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce548ee3-59a9-46f9-8b00-06d380b17566-config\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.205698 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7119ee90-7cbd-4991-aa6c-6a80c4e5a1a3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7119ee90-7cbd-4991-aa6c-6a80c4e5a1a3\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.205721 4918 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/ce548ee3-59a9-46f9-8b00-06d380b17566-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.205740 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ce548ee3-59a9-46f9-8b00-06d380b17566-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.205778 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsxjj\" (UniqueName: \"kubernetes.io/projected/ce548ee3-59a9-46f9-8b00-06d380b17566-kube-api-access-wsxjj\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.205801 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce548ee3-59a9-46f9-8b00-06d380b17566-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.205859 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/ce548ee3-59a9-46f9-8b00-06d380b17566-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: 
\"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.205880 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ce548ee3-59a9-46f9-8b00-06d380b17566-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.307095 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ce548ee3-59a9-46f9-8b00-06d380b17566-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.307144 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ce548ee3-59a9-46f9-8b00-06d380b17566-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.307164 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ce548ee3-59a9-46f9-8b00-06d380b17566-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.307200 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/ce548ee3-59a9-46f9-8b00-06d380b17566-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.307221 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ce548ee3-59a9-46f9-8b00-06d380b17566-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.307269 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce548ee3-59a9-46f9-8b00-06d380b17566-config\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.307316 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7119ee90-7cbd-4991-aa6c-6a80c4e5a1a3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7119ee90-7cbd-4991-aa6c-6a80c4e5a1a3\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.307332 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/ce548ee3-59a9-46f9-8b00-06d380b17566-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc 
kubenswrapper[4918]: I0319 16:59:33.307348 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ce548ee3-59a9-46f9-8b00-06d380b17566-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.307373 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce548ee3-59a9-46f9-8b00-06d380b17566-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.307390 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsxjj\" (UniqueName: \"kubernetes.io/projected/ce548ee3-59a9-46f9-8b00-06d380b17566-kube-api-access-wsxjj\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.307425 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/ce548ee3-59a9-46f9-8b00-06d380b17566-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.307442 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ce548ee3-59a9-46f9-8b00-06d380b17566-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " 
pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.308192 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ce548ee3-59a9-46f9-8b00-06d380b17566-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.308312 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/ce548ee3-59a9-46f9-8b00-06d380b17566-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.310391 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/ce548ee3-59a9-46f9-8b00-06d380b17566-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.313573 4918 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.313616 4918 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7119ee90-7cbd-4991-aa6c-6a80c4e5a1a3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7119ee90-7cbd-4991-aa6c-6a80c4e5a1a3\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/532e7b4e9da3b88b0a1999a686ac8b131a30bc0c7a14e430eb07e3f1ae4f0fac/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.314992 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ce548ee3-59a9-46f9-8b00-06d380b17566-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.315063 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ce548ee3-59a9-46f9-8b00-06d380b17566-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.315956 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ce548ee3-59a9-46f9-8b00-06d380b17566-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.315956 4918 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ce548ee3-59a9-46f9-8b00-06d380b17566-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.316504 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ce548ee3-59a9-46f9-8b00-06d380b17566-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.318932 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce548ee3-59a9-46f9-8b00-06d380b17566-config\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.319467 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce548ee3-59a9-46f9-8b00-06d380b17566-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.319997 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ce548ee3-59a9-46f9-8b00-06d380b17566-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.324660 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wsxjj\" (UniqueName: \"kubernetes.io/projected/ce548ee3-59a9-46f9-8b00-06d380b17566-kube-api-access-wsxjj\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.363477 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7119ee90-7cbd-4991-aa6c-6a80c4e5a1a3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7119ee90-7cbd-4991-aa6c-6a80c4e5a1a3\") pod \"prometheus-metric-storage-0\" (UID: \"ce548ee3-59a9-46f9-8b00-06d380b17566\") " pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.445622 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:33 crc kubenswrapper[4918]: I0319 16:59:33.934852 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 16:59:34 crc kubenswrapper[4918]: I0319 16:59:34.036303 4918 generic.go:334] "Generic (PLEG): container finished" podID="3e1300b0-1d38-47c5-a702-676e712859bd" containerID="e4a582f7ff62312f706d66c15c740eed5e7d7415357bc31780ce8208585348e8" exitCode=0 Mar 19 16:59:34 crc kubenswrapper[4918]: I0319 16:59:34.036387 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4g569-config-w4t6w" event={"ID":"3e1300b0-1d38-47c5-a702-676e712859bd","Type":"ContainerDied","Data":"e4a582f7ff62312f706d66c15c740eed5e7d7415357bc31780ce8208585348e8"} Mar 19 16:59:34 crc kubenswrapper[4918]: I0319 16:59:34.041840 4918 generic.go:334] "Generic (PLEG): container finished" podID="f9377958-db94-4c7d-bc29-37ca8135ba07" containerID="7e49afe0d15d5eb780358ee1a2d83875cd9954654a2764c68154e5ed5992076e" exitCode=0 Mar 19 16:59:34 crc kubenswrapper[4918]: I0319 16:59:34.041976 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/root-account-create-update-bc7bt" event={"ID":"f9377958-db94-4c7d-bc29-37ca8135ba07","Type":"ContainerDied","Data":"7e49afe0d15d5eb780358ee1a2d83875cd9954654a2764c68154e5ed5992076e"} Mar 19 16:59:34 crc kubenswrapper[4918]: I0319 16:59:34.042001 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bc7bt" event={"ID":"f9377958-db94-4c7d-bc29-37ca8135ba07","Type":"ContainerStarted","Data":"35385a85c81131fdcb6f534c097b454e5422cb43b70c6a9085b0967e5294b0bd"} Mar 19 16:59:34 crc kubenswrapper[4918]: I0319 16:59:34.046075 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ce548ee3-59a9-46f9-8b00-06d380b17566","Type":"ContainerStarted","Data":"72f53e68266538a192165aefb1909f7ca45de0aaead4a976e4c39a49d282207d"} Mar 19 16:59:34 crc kubenswrapper[4918]: I0319 16:59:34.606215 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08c86067-0c7f-47a2-a2d4-e29ad43c539f" path="/var/lib/kubelet/pods/08c86067-0c7f-47a2-a2d4-e29ad43c539f/volumes" Mar 19 16:59:35 crc kubenswrapper[4918]: I0319 16:59:35.468655 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bc7bt" Mar 19 16:59:35 crc kubenswrapper[4918]: I0319 16:59:35.478790 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-4g569-config-w4t6w" Mar 19 16:59:35 crc kubenswrapper[4918]: I0319 16:59:35.649564 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3e1300b0-1d38-47c5-a702-676e712859bd-var-run-ovn\") pod \"3e1300b0-1d38-47c5-a702-676e712859bd\" (UID: \"3e1300b0-1d38-47c5-a702-676e712859bd\") " Mar 19 16:59:35 crc kubenswrapper[4918]: I0319 16:59:35.649610 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3e1300b0-1d38-47c5-a702-676e712859bd-var-log-ovn\") pod \"3e1300b0-1d38-47c5-a702-676e712859bd\" (UID: \"3e1300b0-1d38-47c5-a702-676e712859bd\") " Mar 19 16:59:35 crc kubenswrapper[4918]: I0319 16:59:35.649663 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl9pq\" (UniqueName: \"kubernetes.io/projected/3e1300b0-1d38-47c5-a702-676e712859bd-kube-api-access-bl9pq\") pod \"3e1300b0-1d38-47c5-a702-676e712859bd\" (UID: \"3e1300b0-1d38-47c5-a702-676e712859bd\") " Mar 19 16:59:35 crc kubenswrapper[4918]: I0319 16:59:35.649716 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e1300b0-1d38-47c5-a702-676e712859bd-scripts\") pod \"3e1300b0-1d38-47c5-a702-676e712859bd\" (UID: \"3e1300b0-1d38-47c5-a702-676e712859bd\") " Mar 19 16:59:35 crc kubenswrapper[4918]: I0319 16:59:35.649780 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9377958-db94-4c7d-bc29-37ca8135ba07-operator-scripts\") pod \"f9377958-db94-4c7d-bc29-37ca8135ba07\" (UID: \"f9377958-db94-4c7d-bc29-37ca8135ba07\") " Mar 19 16:59:35 crc kubenswrapper[4918]: I0319 16:59:35.649808 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/3e1300b0-1d38-47c5-a702-676e712859bd-var-run\") pod \"3e1300b0-1d38-47c5-a702-676e712859bd\" (UID: \"3e1300b0-1d38-47c5-a702-676e712859bd\") " Mar 19 16:59:35 crc kubenswrapper[4918]: I0319 16:59:35.649838 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3e1300b0-1d38-47c5-a702-676e712859bd-additional-scripts\") pod \"3e1300b0-1d38-47c5-a702-676e712859bd\" (UID: \"3e1300b0-1d38-47c5-a702-676e712859bd\") " Mar 19 16:59:35 crc kubenswrapper[4918]: I0319 16:59:35.649919 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvfk4\" (UniqueName: \"kubernetes.io/projected/f9377958-db94-4c7d-bc29-37ca8135ba07-kube-api-access-jvfk4\") pod \"f9377958-db94-4c7d-bc29-37ca8135ba07\" (UID: \"f9377958-db94-4c7d-bc29-37ca8135ba07\") " Mar 19 16:59:35 crc kubenswrapper[4918]: I0319 16:59:35.650716 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e1300b0-1d38-47c5-a702-676e712859bd-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "3e1300b0-1d38-47c5-a702-676e712859bd" (UID: "3e1300b0-1d38-47c5-a702-676e712859bd"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:59:35 crc kubenswrapper[4918]: I0319 16:59:35.650766 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e1300b0-1d38-47c5-a702-676e712859bd-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "3e1300b0-1d38-47c5-a702-676e712859bd" (UID: "3e1300b0-1d38-47c5-a702-676e712859bd"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:59:35 crc kubenswrapper[4918]: I0319 16:59:35.651216 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9377958-db94-4c7d-bc29-37ca8135ba07-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f9377958-db94-4c7d-bc29-37ca8135ba07" (UID: "f9377958-db94-4c7d-bc29-37ca8135ba07"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:35 crc kubenswrapper[4918]: I0319 16:59:35.651393 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e1300b0-1d38-47c5-a702-676e712859bd-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "3e1300b0-1d38-47c5-a702-676e712859bd" (UID: "3e1300b0-1d38-47c5-a702-676e712859bd"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:35 crc kubenswrapper[4918]: I0319 16:59:35.651453 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e1300b0-1d38-47c5-a702-676e712859bd-var-run" (OuterVolumeSpecName: "var-run") pod "3e1300b0-1d38-47c5-a702-676e712859bd" (UID: "3e1300b0-1d38-47c5-a702-676e712859bd"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 16:59:35 crc kubenswrapper[4918]: I0319 16:59:35.651563 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e1300b0-1d38-47c5-a702-676e712859bd-scripts" (OuterVolumeSpecName: "scripts") pod "3e1300b0-1d38-47c5-a702-676e712859bd" (UID: "3e1300b0-1d38-47c5-a702-676e712859bd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:35 crc kubenswrapper[4918]: I0319 16:59:35.667786 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e1300b0-1d38-47c5-a702-676e712859bd-kube-api-access-bl9pq" (OuterVolumeSpecName: "kube-api-access-bl9pq") pod "3e1300b0-1d38-47c5-a702-676e712859bd" (UID: "3e1300b0-1d38-47c5-a702-676e712859bd"). InnerVolumeSpecName "kube-api-access-bl9pq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:59:35 crc kubenswrapper[4918]: I0319 16:59:35.672036 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-4g569-config-w4t6w"] Mar 19 16:59:35 crc kubenswrapper[4918]: I0319 16:59:35.679275 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-4g569-config-w4t6w"] Mar 19 16:59:35 crc kubenswrapper[4918]: I0319 16:59:35.679866 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9377958-db94-4c7d-bc29-37ca8135ba07-kube-api-access-jvfk4" (OuterVolumeSpecName: "kube-api-access-jvfk4") pod "f9377958-db94-4c7d-bc29-37ca8135ba07" (UID: "f9377958-db94-4c7d-bc29-37ca8135ba07"). InnerVolumeSpecName "kube-api-access-jvfk4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:59:35 crc kubenswrapper[4918]: I0319 16:59:35.751963 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl9pq\" (UniqueName: \"kubernetes.io/projected/3e1300b0-1d38-47c5-a702-676e712859bd-kube-api-access-bl9pq\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:35 crc kubenswrapper[4918]: I0319 16:59:35.752170 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e1300b0-1d38-47c5-a702-676e712859bd-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:35 crc kubenswrapper[4918]: I0319 16:59:35.752260 4918 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9377958-db94-4c7d-bc29-37ca8135ba07-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:35 crc kubenswrapper[4918]: I0319 16:59:35.752338 4918 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3e1300b0-1d38-47c5-a702-676e712859bd-var-run\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:35 crc kubenswrapper[4918]: I0319 16:59:35.752405 4918 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3e1300b0-1d38-47c5-a702-676e712859bd-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:35 crc kubenswrapper[4918]: I0319 16:59:35.752473 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvfk4\" (UniqueName: \"kubernetes.io/projected/f9377958-db94-4c7d-bc29-37ca8135ba07-kube-api-access-jvfk4\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:35 crc kubenswrapper[4918]: I0319 16:59:35.752559 4918 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3e1300b0-1d38-47c5-a702-676e712859bd-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:35 crc kubenswrapper[4918]: I0319 
16:59:35.752627 4918 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3e1300b0-1d38-47c5-a702-676e712859bd-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:36 crc kubenswrapper[4918]: I0319 16:59:36.062478 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cb555519a030bfc84fba6ff419ee87e0c538327d3db1938b81f132e0704a7a3" Mar 19 16:59:36 crc kubenswrapper[4918]: I0319 16:59:36.062492 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4g569-config-w4t6w" Mar 19 16:59:36 crc kubenswrapper[4918]: I0319 16:59:36.064761 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bc7bt" event={"ID":"f9377958-db94-4c7d-bc29-37ca8135ba07","Type":"ContainerDied","Data":"35385a85c81131fdcb6f534c097b454e5422cb43b70c6a9085b0967e5294b0bd"} Mar 19 16:59:36 crc kubenswrapper[4918]: I0319 16:59:36.064800 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bc7bt" Mar 19 16:59:36 crc kubenswrapper[4918]: I0319 16:59:36.064805 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35385a85c81131fdcb6f534c097b454e5422cb43b70c6a9085b0967e5294b0bd" Mar 19 16:59:36 crc kubenswrapper[4918]: I0319 16:59:36.261189 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e4118384-38ad-465d-a81e-62bf39cc6cec-etc-swift\") pod \"swift-storage-0\" (UID: \"e4118384-38ad-465d-a81e-62bf39cc6cec\") " pod="openstack/swift-storage-0" Mar 19 16:59:36 crc kubenswrapper[4918]: I0319 16:59:36.269330 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e4118384-38ad-465d-a81e-62bf39cc6cec-etc-swift\") pod \"swift-storage-0\" (UID: \"e4118384-38ad-465d-a81e-62bf39cc6cec\") " pod="openstack/swift-storage-0" Mar 19 16:59:36 crc kubenswrapper[4918]: I0319 16:59:36.413397 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 19 16:59:36 crc kubenswrapper[4918]: I0319 16:59:36.600917 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e1300b0-1d38-47c5-a702-676e712859bd" path="/var/lib/kubelet/pods/3e1300b0-1d38-47c5-a702-676e712859bd/volumes" Mar 19 16:59:36 crc kubenswrapper[4918]: I0319 16:59:36.942854 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-4g569" Mar 19 16:59:36 crc kubenswrapper[4918]: I0319 16:59:36.966259 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 19 16:59:37 crc kubenswrapper[4918]: I0319 16:59:37.073718 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ce548ee3-59a9-46f9-8b00-06d380b17566","Type":"ContainerStarted","Data":"82241a68ca495fe72e8395dcfea9401a73687f64a12b43a2a131dd80dd7a0e9c"} Mar 19 16:59:37 crc kubenswrapper[4918]: I0319 16:59:37.074606 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e4118384-38ad-465d-a81e-62bf39cc6cec","Type":"ContainerStarted","Data":"73d056beee3a3467d335be27e04cddcc088fe9d799ab51489e4d315013e1784a"} Mar 19 16:59:37 crc kubenswrapper[4918]: I0319 16:59:37.755843 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 19 16:59:38 crc kubenswrapper[4918]: I0319 16:59:38.112069 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 19 16:59:39 crc kubenswrapper[4918]: I0319 16:59:39.099873 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e4118384-38ad-465d-a81e-62bf39cc6cec","Type":"ContainerStarted","Data":"ddc1655f5efd737abdf1c46e33582c6bd93b47fd50612cab06f87c5a3894cffd"} Mar 19 16:59:39 crc kubenswrapper[4918]: I0319 16:59:39.100452 4918 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e4118384-38ad-465d-a81e-62bf39cc6cec","Type":"ContainerStarted","Data":"7a7c869405176c935a5362039005567d6f15b063624726b8a7ccb9265f4ab636"} Mar 19 16:59:39 crc kubenswrapper[4918]: I0319 16:59:39.100470 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e4118384-38ad-465d-a81e-62bf39cc6cec","Type":"ContainerStarted","Data":"f2cc5456b931c54c3183e67fbc02f5230ee6fd9ffb99d633de3ccc40b5dcfab4"} Mar 19 16:59:39 crc kubenswrapper[4918]: I0319 16:59:39.100485 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e4118384-38ad-465d-a81e-62bf39cc6cec","Type":"ContainerStarted","Data":"7bb0babdd497fec8f63f3741d9eb5b6b561a4e25e70a30aec435f4b8ed4dddd7"} Mar 19 16:59:39 crc kubenswrapper[4918]: I0319 16:59:39.748782 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-6x9dg"] Mar 19 16:59:39 crc kubenswrapper[4918]: E0319 16:59:39.749170 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e1300b0-1d38-47c5-a702-676e712859bd" containerName="ovn-config" Mar 19 16:59:39 crc kubenswrapper[4918]: I0319 16:59:39.749191 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e1300b0-1d38-47c5-a702-676e712859bd" containerName="ovn-config" Mar 19 16:59:39 crc kubenswrapper[4918]: E0319 16:59:39.749212 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9377958-db94-4c7d-bc29-37ca8135ba07" containerName="mariadb-account-create-update" Mar 19 16:59:39 crc kubenswrapper[4918]: I0319 16:59:39.749220 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9377958-db94-4c7d-bc29-37ca8135ba07" containerName="mariadb-account-create-update" Mar 19 16:59:39 crc kubenswrapper[4918]: I0319 16:59:39.749425 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9377958-db94-4c7d-bc29-37ca8135ba07" containerName="mariadb-account-create-update" Mar 
19 16:59:39 crc kubenswrapper[4918]: I0319 16:59:39.749446 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e1300b0-1d38-47c5-a702-676e712859bd" containerName="ovn-config" Mar 19 16:59:39 crc kubenswrapper[4918]: I0319 16:59:39.750169 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6x9dg" Mar 19 16:59:39 crc kubenswrapper[4918]: I0319 16:59:39.770975 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6x9dg"] Mar 19 16:59:39 crc kubenswrapper[4918]: I0319 16:59:39.822184 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szz8f\" (UniqueName: \"kubernetes.io/projected/c909d8df-c118-4ff6-8c06-c0d3f71be4cf-kube-api-access-szz8f\") pod \"cinder-db-create-6x9dg\" (UID: \"c909d8df-c118-4ff6-8c06-c0d3f71be4cf\") " pod="openstack/cinder-db-create-6x9dg" Mar 19 16:59:39 crc kubenswrapper[4918]: I0319 16:59:39.822250 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c909d8df-c118-4ff6-8c06-c0d3f71be4cf-operator-scripts\") pod \"cinder-db-create-6x9dg\" (UID: \"c909d8df-c118-4ff6-8c06-c0d3f71be4cf\") " pod="openstack/cinder-db-create-6x9dg" Mar 19 16:59:39 crc kubenswrapper[4918]: I0319 16:59:39.866633 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-create-g2mz7"] Mar 19 16:59:39 crc kubenswrapper[4918]: I0319 16:59:39.868008 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-g2mz7" Mar 19 16:59:39 crc kubenswrapper[4918]: I0319 16:59:39.884016 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-5103-account-create-update-t64cp"] Mar 19 16:59:39 crc kubenswrapper[4918]: I0319 16:59:39.885221 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-5103-account-create-update-t64cp" Mar 19 16:59:39 crc kubenswrapper[4918]: I0319 16:59:39.894149 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 19 16:59:39 crc kubenswrapper[4918]: I0319 16:59:39.923799 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szz8f\" (UniqueName: \"kubernetes.io/projected/c909d8df-c118-4ff6-8c06-c0d3f71be4cf-kube-api-access-szz8f\") pod \"cinder-db-create-6x9dg\" (UID: \"c909d8df-c118-4ff6-8c06-c0d3f71be4cf\") " pod="openstack/cinder-db-create-6x9dg" Mar 19 16:59:39 crc kubenswrapper[4918]: I0319 16:59:39.923875 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c909d8df-c118-4ff6-8c06-c0d3f71be4cf-operator-scripts\") pod \"cinder-db-create-6x9dg\" (UID: \"c909d8df-c118-4ff6-8c06-c0d3f71be4cf\") " pod="openstack/cinder-db-create-6x9dg" Mar 19 16:59:39 crc kubenswrapper[4918]: I0319 16:59:39.924866 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c909d8df-c118-4ff6-8c06-c0d3f71be4cf-operator-scripts\") pod \"cinder-db-create-6x9dg\" (UID: \"c909d8df-c118-4ff6-8c06-c0d3f71be4cf\") " pod="openstack/cinder-db-create-6x9dg" Mar 19 16:59:39 crc kubenswrapper[4918]: I0319 16:59:39.930264 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-g2mz7"] Mar 19 16:59:39 crc kubenswrapper[4918]: I0319 16:59:39.947659 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-5103-account-create-update-t64cp"] Mar 19 16:59:39 crc kubenswrapper[4918]: I0319 16:59:39.959459 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szz8f\" (UniqueName: \"kubernetes.io/projected/c909d8df-c118-4ff6-8c06-c0d3f71be4cf-kube-api-access-szz8f\") pod 
\"cinder-db-create-6x9dg\" (UID: \"c909d8df-c118-4ff6-8c06-c0d3f71be4cf\") " pod="openstack/cinder-db-create-6x9dg" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.025940 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbd67\" (UniqueName: \"kubernetes.io/projected/fda7a707-ab32-455f-8d42-bd371c95e9d2-kube-api-access-lbd67\") pod \"cloudkitty-db-create-g2mz7\" (UID: \"fda7a707-ab32-455f-8d42-bd371c95e9d2\") " pod="openstack/cloudkitty-db-create-g2mz7" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.026083 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/543581b9-3f29-4718-bf4b-a4eaa3fb4b39-operator-scripts\") pod \"cinder-5103-account-create-update-t64cp\" (UID: \"543581b9-3f29-4718-bf4b-a4eaa3fb4b39\") " pod="openstack/cinder-5103-account-create-update-t64cp" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.026124 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj9k5\" (UniqueName: \"kubernetes.io/projected/543581b9-3f29-4718-bf4b-a4eaa3fb4b39-kube-api-access-zj9k5\") pod \"cinder-5103-account-create-update-t64cp\" (UID: \"543581b9-3f29-4718-bf4b-a4eaa3fb4b39\") " pod="openstack/cinder-5103-account-create-update-t64cp" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.026153 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fda7a707-ab32-455f-8d42-bd371c95e9d2-operator-scripts\") pod \"cloudkitty-db-create-g2mz7\" (UID: \"fda7a707-ab32-455f-8d42-bd371c95e9d2\") " pod="openstack/cloudkitty-db-create-g2mz7" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.072114 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-6x9dg" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.128653 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/543581b9-3f29-4718-bf4b-a4eaa3fb4b39-operator-scripts\") pod \"cinder-5103-account-create-update-t64cp\" (UID: \"543581b9-3f29-4718-bf4b-a4eaa3fb4b39\") " pod="openstack/cinder-5103-account-create-update-t64cp" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.128737 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj9k5\" (UniqueName: \"kubernetes.io/projected/543581b9-3f29-4718-bf4b-a4eaa3fb4b39-kube-api-access-zj9k5\") pod \"cinder-5103-account-create-update-t64cp\" (UID: \"543581b9-3f29-4718-bf4b-a4eaa3fb4b39\") " pod="openstack/cinder-5103-account-create-update-t64cp" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.128779 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fda7a707-ab32-455f-8d42-bd371c95e9d2-operator-scripts\") pod \"cloudkitty-db-create-g2mz7\" (UID: \"fda7a707-ab32-455f-8d42-bd371c95e9d2\") " pod="openstack/cloudkitty-db-create-g2mz7" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.128901 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbd67\" (UniqueName: \"kubernetes.io/projected/fda7a707-ab32-455f-8d42-bd371c95e9d2-kube-api-access-lbd67\") pod \"cloudkitty-db-create-g2mz7\" (UID: \"fda7a707-ab32-455f-8d42-bd371c95e9d2\") " pod="openstack/cloudkitty-db-create-g2mz7" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.129634 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/543581b9-3f29-4718-bf4b-a4eaa3fb4b39-operator-scripts\") pod \"cinder-5103-account-create-update-t64cp\" (UID: 
\"543581b9-3f29-4718-bf4b-a4eaa3fb4b39\") " pod="openstack/cinder-5103-account-create-update-t64cp" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.130266 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-6cdsd"] Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.131855 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fda7a707-ab32-455f-8d42-bd371c95e9d2-operator-scripts\") pod \"cloudkitty-db-create-g2mz7\" (UID: \"fda7a707-ab32-455f-8d42-bd371c95e9d2\") " pod="openstack/cloudkitty-db-create-g2mz7" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.132303 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-6cdsd" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.137896 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-d72v8" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.140338 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.141745 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.145918 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.167537 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-6cdsd"] Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.172189 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbd67\" (UniqueName: \"kubernetes.io/projected/fda7a707-ab32-455f-8d42-bd371c95e9d2-kube-api-access-lbd67\") pod \"cloudkitty-db-create-g2mz7\" (UID: \"fda7a707-ab32-455f-8d42-bd371c95e9d2\") " 
pod="openstack/cloudkitty-db-create-g2mz7" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.178487 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj9k5\" (UniqueName: \"kubernetes.io/projected/543581b9-3f29-4718-bf4b-a4eaa3fb4b39-kube-api-access-zj9k5\") pod \"cinder-5103-account-create-update-t64cp\" (UID: \"543581b9-3f29-4718-bf4b-a4eaa3fb4b39\") " pod="openstack/cinder-5103-account-create-update-t64cp" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.185950 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-g2mz7" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.213531 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5103-account-create-update-t64cp" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.228852 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-da83-account-create-update-xc7wt"] Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.230302 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-da83-account-create-update-xc7wt" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.232124 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70226cc0-a6ae-4454-8e20-f85b06e2ee2d-config-data\") pod \"keystone-db-sync-6cdsd\" (UID: \"70226cc0-a6ae-4454-8e20-f85b06e2ee2d\") " pod="openstack/keystone-db-sync-6cdsd" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.232174 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70226cc0-a6ae-4454-8e20-f85b06e2ee2d-combined-ca-bundle\") pod \"keystone-db-sync-6cdsd\" (UID: \"70226cc0-a6ae-4454-8e20-f85b06e2ee2d\") " pod="openstack/keystone-db-sync-6cdsd" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.232208 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr5q2\" (UniqueName: \"kubernetes.io/projected/70226cc0-a6ae-4454-8e20-f85b06e2ee2d-kube-api-access-vr5q2\") pod \"keystone-db-sync-6cdsd\" (UID: \"70226cc0-a6ae-4454-8e20-f85b06e2ee2d\") " pod="openstack/keystone-db-sync-6cdsd" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.240868 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-db-secret" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.278932 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-da83-account-create-update-xc7wt"] Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.291655 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-54nhz"] Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.310180 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-54nhz" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.334917 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7fvg\" (UniqueName: \"kubernetes.io/projected/0f77a167-549a-441f-b185-977ccb2195ab-kube-api-access-f7fvg\") pod \"cloudkitty-da83-account-create-update-xc7wt\" (UID: \"0f77a167-549a-441f-b185-977ccb2195ab\") " pod="openstack/cloudkitty-da83-account-create-update-xc7wt" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.334997 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f77a167-549a-441f-b185-977ccb2195ab-operator-scripts\") pod \"cloudkitty-da83-account-create-update-xc7wt\" (UID: \"0f77a167-549a-441f-b185-977ccb2195ab\") " pod="openstack/cloudkitty-da83-account-create-update-xc7wt" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.335094 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70226cc0-a6ae-4454-8e20-f85b06e2ee2d-config-data\") pod \"keystone-db-sync-6cdsd\" (UID: \"70226cc0-a6ae-4454-8e20-f85b06e2ee2d\") " pod="openstack/keystone-db-sync-6cdsd" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.335119 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70226cc0-a6ae-4454-8e20-f85b06e2ee2d-combined-ca-bundle\") pod \"keystone-db-sync-6cdsd\" (UID: \"70226cc0-a6ae-4454-8e20-f85b06e2ee2d\") " pod="openstack/keystone-db-sync-6cdsd" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.335139 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr5q2\" (UniqueName: \"kubernetes.io/projected/70226cc0-a6ae-4454-8e20-f85b06e2ee2d-kube-api-access-vr5q2\") pod 
\"keystone-db-sync-6cdsd\" (UID: \"70226cc0-a6ae-4454-8e20-f85b06e2ee2d\") " pod="openstack/keystone-db-sync-6cdsd" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.416350 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr5q2\" (UniqueName: \"kubernetes.io/projected/70226cc0-a6ae-4454-8e20-f85b06e2ee2d-kube-api-access-vr5q2\") pod \"keystone-db-sync-6cdsd\" (UID: \"70226cc0-a6ae-4454-8e20-f85b06e2ee2d\") " pod="openstack/keystone-db-sync-6cdsd" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.449201 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70226cc0-a6ae-4454-8e20-f85b06e2ee2d-combined-ca-bundle\") pod \"keystone-db-sync-6cdsd\" (UID: \"70226cc0-a6ae-4454-8e20-f85b06e2ee2d\") " pod="openstack/keystone-db-sync-6cdsd" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.478621 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dadbc2dd-325c-4390-9ea6-bc827cec049d-operator-scripts\") pod \"neutron-db-create-54nhz\" (UID: \"dadbc2dd-325c-4390-9ea6-bc827cec049d\") " pod="openstack/neutron-db-create-54nhz" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.478720 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7fvg\" (UniqueName: \"kubernetes.io/projected/0f77a167-549a-441f-b185-977ccb2195ab-kube-api-access-f7fvg\") pod \"cloudkitty-da83-account-create-update-xc7wt\" (UID: \"0f77a167-549a-441f-b185-977ccb2195ab\") " pod="openstack/cloudkitty-da83-account-create-update-xc7wt" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.478783 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfjbw\" (UniqueName: \"kubernetes.io/projected/dadbc2dd-325c-4390-9ea6-bc827cec049d-kube-api-access-gfjbw\") 
pod \"neutron-db-create-54nhz\" (UID: \"dadbc2dd-325c-4390-9ea6-bc827cec049d\") " pod="openstack/neutron-db-create-54nhz" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.478923 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f77a167-549a-441f-b185-977ccb2195ab-operator-scripts\") pod \"cloudkitty-da83-account-create-update-xc7wt\" (UID: \"0f77a167-549a-441f-b185-977ccb2195ab\") " pod="openstack/cloudkitty-da83-account-create-update-xc7wt" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.480823 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f77a167-549a-441f-b185-977ccb2195ab-operator-scripts\") pod \"cloudkitty-da83-account-create-update-xc7wt\" (UID: \"0f77a167-549a-441f-b185-977ccb2195ab\") " pod="openstack/cloudkitty-da83-account-create-update-xc7wt" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.481379 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70226cc0-a6ae-4454-8e20-f85b06e2ee2d-config-data\") pod \"keystone-db-sync-6cdsd\" (UID: \"70226cc0-a6ae-4454-8e20-f85b06e2ee2d\") " pod="openstack/keystone-db-sync-6cdsd" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.524940 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7fvg\" (UniqueName: \"kubernetes.io/projected/0f77a167-549a-441f-b185-977ccb2195ab-kube-api-access-f7fvg\") pod \"cloudkitty-da83-account-create-update-xc7wt\" (UID: \"0f77a167-549a-441f-b185-977ccb2195ab\") " pod="openstack/cloudkitty-da83-account-create-update-xc7wt" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.573627 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-e0e3-account-create-update-gpntj"] Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.574689 4918 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-e0e3-account-create-update-gpntj" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.580301 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.582910 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9f94\" (UniqueName: \"kubernetes.io/projected/e9ab1e31-bea9-4b23-899c-0b818c121f65-kube-api-access-z9f94\") pod \"barbican-e0e3-account-create-update-gpntj\" (UID: \"e9ab1e31-bea9-4b23-899c-0b818c121f65\") " pod="openstack/barbican-e0e3-account-create-update-gpntj" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.582973 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dadbc2dd-325c-4390-9ea6-bc827cec049d-operator-scripts\") pod \"neutron-db-create-54nhz\" (UID: \"dadbc2dd-325c-4390-9ea6-bc827cec049d\") " pod="openstack/neutron-db-create-54nhz" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.583006 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ab1e31-bea9-4b23-899c-0b818c121f65-operator-scripts\") pod \"barbican-e0e3-account-create-update-gpntj\" (UID: \"e9ab1e31-bea9-4b23-899c-0b818c121f65\") " pod="openstack/barbican-e0e3-account-create-update-gpntj" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.583028 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfjbw\" (UniqueName: \"kubernetes.io/projected/dadbc2dd-325c-4390-9ea6-bc827cec049d-kube-api-access-gfjbw\") pod \"neutron-db-create-54nhz\" (UID: \"dadbc2dd-325c-4390-9ea6-bc827cec049d\") " pod="openstack/neutron-db-create-54nhz" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 
16:59:40.584654 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dadbc2dd-325c-4390-9ea6-bc827cec049d-operator-scripts\") pod \"neutron-db-create-54nhz\" (UID: \"dadbc2dd-325c-4390-9ea6-bc827cec049d\") " pod="openstack/neutron-db-create-54nhz" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.606202 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-54nhz"] Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.606808 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-da83-account-create-update-xc7wt" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.620374 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfjbw\" (UniqueName: \"kubernetes.io/projected/dadbc2dd-325c-4390-9ea6-bc827cec049d-kube-api-access-gfjbw\") pod \"neutron-db-create-54nhz\" (UID: \"dadbc2dd-325c-4390-9ea6-bc827cec049d\") " pod="openstack/neutron-db-create-54nhz" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.630371 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-e0e3-account-create-update-gpntj"] Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.635078 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-54nhz" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.650930 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-zk5wz"] Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.652202 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-zk5wz" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.681382 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-ea0b-account-create-update-l66jx"] Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.683305 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ea0b-account-create-update-l66jx" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.686095 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9f94\" (UniqueName: \"kubernetes.io/projected/e9ab1e31-bea9-4b23-899c-0b818c121f65-kube-api-access-z9f94\") pod \"barbican-e0e3-account-create-update-gpntj\" (UID: \"e9ab1e31-bea9-4b23-899c-0b818c121f65\") " pod="openstack/barbican-e0e3-account-create-update-gpntj" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.686194 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ab1e31-bea9-4b23-899c-0b818c121f65-operator-scripts\") pod \"barbican-e0e3-account-create-update-gpntj\" (UID: \"e9ab1e31-bea9-4b23-899c-0b818c121f65\") " pod="openstack/barbican-e0e3-account-create-update-gpntj" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.686949 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ab1e31-bea9-4b23-899c-0b818c121f65-operator-scripts\") pod \"barbican-e0e3-account-create-update-gpntj\" (UID: \"e9ab1e31-bea9-4b23-899c-0b818c121f65\") " pod="openstack/barbican-e0e3-account-create-update-gpntj" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.687160 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.710098 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-db-create-zk5wz"] Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.714639 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ea0b-account-create-update-l66jx"] Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.719984 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-6cdsd" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.732842 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9f94\" (UniqueName: \"kubernetes.io/projected/e9ab1e31-bea9-4b23-899c-0b818c121f65-kube-api-access-z9f94\") pod \"barbican-e0e3-account-create-update-gpntj\" (UID: \"e9ab1e31-bea9-4b23-899c-0b818c121f65\") " pod="openstack/barbican-e0e3-account-create-update-gpntj" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.787668 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxqzc\" (UniqueName: \"kubernetes.io/projected/86c8c8a7-6bd3-4169-88ac-9a5838c526c2-kube-api-access-jxqzc\") pod \"neutron-ea0b-account-create-update-l66jx\" (UID: \"86c8c8a7-6bd3-4169-88ac-9a5838c526c2\") " pod="openstack/neutron-ea0b-account-create-update-l66jx" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.787720 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86c8c8a7-6bd3-4169-88ac-9a5838c526c2-operator-scripts\") pod \"neutron-ea0b-account-create-update-l66jx\" (UID: \"86c8c8a7-6bd3-4169-88ac-9a5838c526c2\") " pod="openstack/neutron-ea0b-account-create-update-l66jx" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.787799 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25tk7\" (UniqueName: \"kubernetes.io/projected/ad8cc411-f838-439a-9993-e53b431dcd28-kube-api-access-25tk7\") pod 
\"barbican-db-create-zk5wz\" (UID: \"ad8cc411-f838-439a-9993-e53b431dcd28\") " pod="openstack/barbican-db-create-zk5wz" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.787822 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad8cc411-f838-439a-9993-e53b431dcd28-operator-scripts\") pod \"barbican-db-create-zk5wz\" (UID: \"ad8cc411-f838-439a-9993-e53b431dcd28\") " pod="openstack/barbican-db-create-zk5wz" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.889233 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxqzc\" (UniqueName: \"kubernetes.io/projected/86c8c8a7-6bd3-4169-88ac-9a5838c526c2-kube-api-access-jxqzc\") pod \"neutron-ea0b-account-create-update-l66jx\" (UID: \"86c8c8a7-6bd3-4169-88ac-9a5838c526c2\") " pod="openstack/neutron-ea0b-account-create-update-l66jx" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.889575 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86c8c8a7-6bd3-4169-88ac-9a5838c526c2-operator-scripts\") pod \"neutron-ea0b-account-create-update-l66jx\" (UID: \"86c8c8a7-6bd3-4169-88ac-9a5838c526c2\") " pod="openstack/neutron-ea0b-account-create-update-l66jx" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.889682 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25tk7\" (UniqueName: \"kubernetes.io/projected/ad8cc411-f838-439a-9993-e53b431dcd28-kube-api-access-25tk7\") pod \"barbican-db-create-zk5wz\" (UID: \"ad8cc411-f838-439a-9993-e53b431dcd28\") " pod="openstack/barbican-db-create-zk5wz" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.889721 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ad8cc411-f838-439a-9993-e53b431dcd28-operator-scripts\") pod \"barbican-db-create-zk5wz\" (UID: \"ad8cc411-f838-439a-9993-e53b431dcd28\") " pod="openstack/barbican-db-create-zk5wz" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.890493 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad8cc411-f838-439a-9993-e53b431dcd28-operator-scripts\") pod \"barbican-db-create-zk5wz\" (UID: \"ad8cc411-f838-439a-9993-e53b431dcd28\") " pod="openstack/barbican-db-create-zk5wz" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.890679 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86c8c8a7-6bd3-4169-88ac-9a5838c526c2-operator-scripts\") pod \"neutron-ea0b-account-create-update-l66jx\" (UID: \"86c8c8a7-6bd3-4169-88ac-9a5838c526c2\") " pod="openstack/neutron-ea0b-account-create-update-l66jx" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.913378 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25tk7\" (UniqueName: \"kubernetes.io/projected/ad8cc411-f838-439a-9993-e53b431dcd28-kube-api-access-25tk7\") pod \"barbican-db-create-zk5wz\" (UID: \"ad8cc411-f838-439a-9993-e53b431dcd28\") " pod="openstack/barbican-db-create-zk5wz" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.929172 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxqzc\" (UniqueName: \"kubernetes.io/projected/86c8c8a7-6bd3-4169-88ac-9a5838c526c2-kube-api-access-jxqzc\") pod \"neutron-ea0b-account-create-update-l66jx\" (UID: \"86c8c8a7-6bd3-4169-88ac-9a5838c526c2\") " pod="openstack/neutron-ea0b-account-create-update-l66jx" Mar 19 16:59:40 crc kubenswrapper[4918]: I0319 16:59:40.953333 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-e0e3-account-create-update-gpntj" Mar 19 16:59:41 crc kubenswrapper[4918]: I0319 16:59:41.039917 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-zk5wz" Mar 19 16:59:41 crc kubenswrapper[4918]: I0319 16:59:41.056516 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ea0b-account-create-update-l66jx" Mar 19 16:59:41 crc kubenswrapper[4918]: I0319 16:59:41.106681 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6x9dg"] Mar 19 16:59:41 crc kubenswrapper[4918]: W0319 16:59:41.125584 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc909d8df_c118_4ff6_8c06_c0d3f71be4cf.slice/crio-6ec62d0e2d6d439e756d5a69ce44602d655fb2f6e15cc14de32c6aead61b5814 WatchSource:0}: Error finding container 6ec62d0e2d6d439e756d5a69ce44602d655fb2f6e15cc14de32c6aead61b5814: Status 404 returned error can't find the container with id 6ec62d0e2d6d439e756d5a69ce44602d655fb2f6e15cc14de32c6aead61b5814 Mar 19 16:59:41 crc kubenswrapper[4918]: I0319 16:59:41.158383 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6x9dg" event={"ID":"c909d8df-c118-4ff6-8c06-c0d3f71be4cf","Type":"ContainerStarted","Data":"6ec62d0e2d6d439e756d5a69ce44602d655fb2f6e15cc14de32c6aead61b5814"} Mar 19 16:59:41 crc kubenswrapper[4918]: I0319 16:59:41.262268 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-g2mz7"] Mar 19 16:59:41 crc kubenswrapper[4918]: I0319 16:59:41.271370 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-5103-account-create-update-t64cp"] Mar 19 16:59:41 crc kubenswrapper[4918]: I0319 16:59:41.386547 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-54nhz"] Mar 19 16:59:41 crc 
kubenswrapper[4918]: I0319 16:59:41.501270 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-ingester-0" Mar 19 16:59:41 crc kubenswrapper[4918]: I0319 16:59:41.558628 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-da83-account-create-update-xc7wt"] Mar 19 16:59:41 crc kubenswrapper[4918]: I0319 16:59:41.568405 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-6cdsd"] Mar 19 16:59:41 crc kubenswrapper[4918]: I0319 16:59:41.708217 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-e0e3-account-create-update-gpntj"] Mar 19 16:59:41 crc kubenswrapper[4918]: I0319 16:59:41.836888 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-zk5wz"] Mar 19 16:59:41 crc kubenswrapper[4918]: W0319 16:59:41.873671 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70226cc0_a6ae_4454_8e20_f85b06e2ee2d.slice/crio-a8443c565be899627ac5ae2704ec94f7f1fb8c571d209713c890c8e891921ee2 WatchSource:0}: Error finding container a8443c565be899627ac5ae2704ec94f7f1fb8c571d209713c890c8e891921ee2: Status 404 returned error can't find the container with id a8443c565be899627ac5ae2704ec94f7f1fb8c571d209713c890c8e891921ee2 Mar 19 16:59:41 crc kubenswrapper[4918]: W0319 16:59:41.875931 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9ab1e31_bea9_4b23_899c_0b818c121f65.slice/crio-ca30aff1a4c1d8968175cff119c4e34b76ce9f723f7e428142b02bd619359277 WatchSource:0}: Error finding container ca30aff1a4c1d8968175cff119c4e34b76ce9f723f7e428142b02bd619359277: Status 404 returned error can't find the container with id ca30aff1a4c1d8968175cff119c4e34b76ce9f723f7e428142b02bd619359277 Mar 19 16:59:41 crc kubenswrapper[4918]: W0319 16:59:41.881008 4918 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfda7a707_ab32_455f_8d42_bd371c95e9d2.slice/crio-e52cb3bd47696542068deac6ce92eb03b0a7fdf3510aeed027cc08511583b024 WatchSource:0}: Error finding container e52cb3bd47696542068deac6ce92eb03b0a7fdf3510aeed027cc08511583b024: Status 404 returned error can't find the container with id e52cb3bd47696542068deac6ce92eb03b0a7fdf3510aeed027cc08511583b024 Mar 19 16:59:41 crc kubenswrapper[4918]: W0319 16:59:41.884397 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddadbc2dd_325c_4390_9ea6_bc827cec049d.slice/crio-bba8e74590413a891c4037d64a1af79acf87244e8504184a91b5a207fae19bdb WatchSource:0}: Error finding container bba8e74590413a891c4037d64a1af79acf87244e8504184a91b5a207fae19bdb: Status 404 returned error can't find the container with id bba8e74590413a891c4037d64a1af79acf87244e8504184a91b5a207fae19bdb Mar 19 16:59:41 crc kubenswrapper[4918]: W0319 16:59:41.889122 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f77a167_549a_441f_b185_977ccb2195ab.slice/crio-a6500f832f9bf3404482ba66a81f1cefaccc5af3f1df18e484d871a0b393b8dd WatchSource:0}: Error finding container a6500f832f9bf3404482ba66a81f1cefaccc5af3f1df18e484d871a0b393b8dd: Status 404 returned error can't find the container with id a6500f832f9bf3404482ba66a81f1cefaccc5af3f1df18e484d871a0b393b8dd Mar 19 16:59:41 crc kubenswrapper[4918]: W0319 16:59:41.897776 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod543581b9_3f29_4718_bf4b_a4eaa3fb4b39.slice/crio-5dcab61106bbcf7f888f8ccac38e37043a8ebe79a705cc42cfc1a1de707d5c6d WatchSource:0}: Error finding container 5dcab61106bbcf7f888f8ccac38e37043a8ebe79a705cc42cfc1a1de707d5c6d: Status 404 returned error 
can't find the container with id 5dcab61106bbcf7f888f8ccac38e37043a8ebe79a705cc42cfc1a1de707d5c6d Mar 19 16:59:42 crc kubenswrapper[4918]: I0319 16:59:42.175602 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-54nhz" event={"ID":"dadbc2dd-325c-4390-9ea6-bc827cec049d","Type":"ContainerStarted","Data":"bba8e74590413a891c4037d64a1af79acf87244e8504184a91b5a207fae19bdb"} Mar 19 16:59:42 crc kubenswrapper[4918]: I0319 16:59:42.179566 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6cdsd" event={"ID":"70226cc0-a6ae-4454-8e20-f85b06e2ee2d","Type":"ContainerStarted","Data":"a8443c565be899627ac5ae2704ec94f7f1fb8c571d209713c890c8e891921ee2"} Mar 19 16:59:42 crc kubenswrapper[4918]: I0319 16:59:42.181205 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-da83-account-create-update-xc7wt" event={"ID":"0f77a167-549a-441f-b185-977ccb2195ab","Type":"ContainerStarted","Data":"a6500f832f9bf3404482ba66a81f1cefaccc5af3f1df18e484d871a0b393b8dd"} Mar 19 16:59:42 crc kubenswrapper[4918]: I0319 16:59:42.184866 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-zk5wz" event={"ID":"ad8cc411-f838-439a-9993-e53b431dcd28","Type":"ContainerStarted","Data":"39c19c099c7c62c057255e1d1688b4a4bdca921feac48a75c9b195f8fa87f7be"} Mar 19 16:59:42 crc kubenswrapper[4918]: I0319 16:59:42.188746 4918 generic.go:334] "Generic (PLEG): container finished" podID="ce548ee3-59a9-46f9-8b00-06d380b17566" containerID="82241a68ca495fe72e8395dcfea9401a73687f64a12b43a2a131dd80dd7a0e9c" exitCode=0 Mar 19 16:59:42 crc kubenswrapper[4918]: I0319 16:59:42.188796 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ce548ee3-59a9-46f9-8b00-06d380b17566","Type":"ContainerDied","Data":"82241a68ca495fe72e8395dcfea9401a73687f64a12b43a2a131dd80dd7a0e9c"} Mar 19 16:59:42 crc kubenswrapper[4918]: I0319 16:59:42.197688 4918 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-g2mz7" event={"ID":"fda7a707-ab32-455f-8d42-bd371c95e9d2","Type":"ContainerStarted","Data":"e52cb3bd47696542068deac6ce92eb03b0a7fdf3510aeed027cc08511583b024"} Mar 19 16:59:42 crc kubenswrapper[4918]: I0319 16:59:42.200845 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e0e3-account-create-update-gpntj" event={"ID":"e9ab1e31-bea9-4b23-899c-0b818c121f65","Type":"ContainerStarted","Data":"ca30aff1a4c1d8968175cff119c4e34b76ce9f723f7e428142b02bd619359277"} Mar 19 16:59:42 crc kubenswrapper[4918]: I0319 16:59:42.221449 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5103-account-create-update-t64cp" event={"ID":"543581b9-3f29-4718-bf4b-a4eaa3fb4b39","Type":"ContainerStarted","Data":"5dcab61106bbcf7f888f8ccac38e37043a8ebe79a705cc42cfc1a1de707d5c6d"} Mar 19 16:59:42 crc kubenswrapper[4918]: I0319 16:59:42.418776 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ea0b-account-create-update-l66jx"] Mar 19 16:59:42 crc kubenswrapper[4918]: W0319 16:59:42.487081 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86c8c8a7_6bd3_4169_88ac_9a5838c526c2.slice/crio-2f0afe17fd38271fa7312c6eef9941d197b86c37e35352dab96baf772912cee6 WatchSource:0}: Error finding container 2f0afe17fd38271fa7312c6eef9941d197b86c37e35352dab96baf772912cee6: Status 404 returned error can't find the container with id 2f0afe17fd38271fa7312c6eef9941d197b86c37e35352dab96baf772912cee6 Mar 19 16:59:43 crc kubenswrapper[4918]: I0319 16:59:43.241959 4918 generic.go:334] "Generic (PLEG): container finished" podID="ad8cc411-f838-439a-9993-e53b431dcd28" containerID="8f6340081e4f0336fe36ff9d5be0e865325ece97c27d83ac5c359ccf718d70cd" exitCode=0 Mar 19 16:59:43 crc kubenswrapper[4918]: I0319 16:59:43.242820 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-db-create-zk5wz" event={"ID":"ad8cc411-f838-439a-9993-e53b431dcd28","Type":"ContainerDied","Data":"8f6340081e4f0336fe36ff9d5be0e865325ece97c27d83ac5c359ccf718d70cd"} Mar 19 16:59:43 crc kubenswrapper[4918]: I0319 16:59:43.259882 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-g2mz7" event={"ID":"fda7a707-ab32-455f-8d42-bd371c95e9d2","Type":"ContainerStarted","Data":"97a21f9941aa6f2460a365553ecdf95b72e8ad797cc1f1b82c7872361bdd136c"} Mar 19 16:59:43 crc kubenswrapper[4918]: I0319 16:59:43.275543 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e0e3-account-create-update-gpntj" event={"ID":"e9ab1e31-bea9-4b23-899c-0b818c121f65","Type":"ContainerStarted","Data":"32cdd41f81602138845c90016f85b37554af18208fd8596a57f201cfd38bddd8"} Mar 19 16:59:43 crc kubenswrapper[4918]: I0319 16:59:43.277588 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ea0b-account-create-update-l66jx" event={"ID":"86c8c8a7-6bd3-4169-88ac-9a5838c526c2","Type":"ContainerStarted","Data":"692d053ee950319107e100f82a93633073c285df427d10ad3d062c7e8cb668b6"} Mar 19 16:59:43 crc kubenswrapper[4918]: I0319 16:59:43.277636 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ea0b-account-create-update-l66jx" event={"ID":"86c8c8a7-6bd3-4169-88ac-9a5838c526c2","Type":"ContainerStarted","Data":"2f0afe17fd38271fa7312c6eef9941d197b86c37e35352dab96baf772912cee6"} Mar 19 16:59:43 crc kubenswrapper[4918]: I0319 16:59:43.281261 4918 generic.go:334] "Generic (PLEG): container finished" podID="c909d8df-c118-4ff6-8c06-c0d3f71be4cf" containerID="85c9d4b2b89e7d30dcb5701c311a1b8e2e9c90ade226b376f9ee69116c2f9711" exitCode=0 Mar 19 16:59:43 crc kubenswrapper[4918]: I0319 16:59:43.281332 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6x9dg" 
event={"ID":"c909d8df-c118-4ff6-8c06-c0d3f71be4cf","Type":"ContainerDied","Data":"85c9d4b2b89e7d30dcb5701c311a1b8e2e9c90ade226b376f9ee69116c2f9711"} Mar 19 16:59:43 crc kubenswrapper[4918]: I0319 16:59:43.294500 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e4118384-38ad-465d-a81e-62bf39cc6cec","Type":"ContainerStarted","Data":"ae9493b3d3f0f4b9617130b4b704b03af828573e0450e5b6dbc8ba7773cc701e"} Mar 19 16:59:43 crc kubenswrapper[4918]: I0319 16:59:43.297534 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-create-g2mz7" podStartSLOduration=4.297495937 podStartE2EDuration="4.297495937s" podCreationTimestamp="2026-03-19 16:59:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:59:43.287186904 +0000 UTC m=+1195.409386152" watchObservedRunningTime="2026-03-19 16:59:43.297495937 +0000 UTC m=+1195.419695185" Mar 19 16:59:43 crc kubenswrapper[4918]: I0319 16:59:43.298708 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ce548ee3-59a9-46f9-8b00-06d380b17566","Type":"ContainerStarted","Data":"398f289eca152840b87e0cd31c63773a1794651434bc0c4878969281c2087acd"} Mar 19 16:59:43 crc kubenswrapper[4918]: I0319 16:59:43.300469 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5103-account-create-update-t64cp" event={"ID":"543581b9-3f29-4718-bf4b-a4eaa3fb4b39","Type":"ContainerStarted","Data":"0e116879497271fbf86f1c3d662d1b8e1ea280943fe7c54b7ecb303deb4589cc"} Mar 19 16:59:43 crc kubenswrapper[4918]: I0319 16:59:43.301900 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-54nhz" event={"ID":"dadbc2dd-325c-4390-9ea6-bc827cec049d","Type":"ContainerStarted","Data":"d45b235408e96a70149dddbfea3d61222f32c14de4bb00da8bf32715a980d435"} Mar 19 16:59:43 crc 
kubenswrapper[4918]: I0319 16:59:43.303921 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-da83-account-create-update-xc7wt" event={"ID":"0f77a167-549a-441f-b185-977ccb2195ab","Type":"ContainerStarted","Data":"704b834d69a79dd7b2c6a1926510ceaca88d7bbf10e95f1b61e7a4c2f69b7589"} Mar 19 16:59:43 crc kubenswrapper[4918]: I0319 16:59:43.375939 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-da83-account-create-update-xc7wt" podStartSLOduration=3.3759163660000002 podStartE2EDuration="3.375916366s" podCreationTimestamp="2026-03-19 16:59:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:59:43.355833953 +0000 UTC m=+1195.478033201" watchObservedRunningTime="2026-03-19 16:59:43.375916366 +0000 UTC m=+1195.498115614" Mar 19 16:59:43 crc kubenswrapper[4918]: I0319 16:59:43.394887 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-ea0b-account-create-update-l66jx" podStartSLOduration=3.394867908 podStartE2EDuration="3.394867908s" podCreationTimestamp="2026-03-19 16:59:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:59:43.373912941 +0000 UTC m=+1195.496112189" watchObservedRunningTime="2026-03-19 16:59:43.394867908 +0000 UTC m=+1195.517067156" Mar 19 16:59:43 crc kubenswrapper[4918]: I0319 16:59:43.400224 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-54nhz" podStartSLOduration=3.400202414 podStartE2EDuration="3.400202414s" podCreationTimestamp="2026-03-19 16:59:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:59:43.391268698 +0000 UTC m=+1195.513467946" 
watchObservedRunningTime="2026-03-19 16:59:43.400202414 +0000 UTC m=+1195.522401662" Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.314558 4918 generic.go:334] "Generic (PLEG): container finished" podID="bc559952-1f04-4a21-8415-c9c613c5b4d4" containerID="4a48933e0e27353fa09de1f9da9ded8f0beab69610b686d40595fe0cd849630c" exitCode=0 Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.314689 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-v6c2z" event={"ID":"bc559952-1f04-4a21-8415-c9c613c5b4d4","Type":"ContainerDied","Data":"4a48933e0e27353fa09de1f9da9ded8f0beab69610b686d40595fe0cd849630c"} Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.317828 4918 generic.go:334] "Generic (PLEG): container finished" podID="dadbc2dd-325c-4390-9ea6-bc827cec049d" containerID="d45b235408e96a70149dddbfea3d61222f32c14de4bb00da8bf32715a980d435" exitCode=0 Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.317993 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-54nhz" event={"ID":"dadbc2dd-325c-4390-9ea6-bc827cec049d","Type":"ContainerDied","Data":"d45b235408e96a70149dddbfea3d61222f32c14de4bb00da8bf32715a980d435"} Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.321127 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e4118384-38ad-465d-a81e-62bf39cc6cec","Type":"ContainerStarted","Data":"63e16fbbea26070fbd8186eb3f3accdcbdb8746331221981b08abd14f5b64680"} Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.321159 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e4118384-38ad-465d-a81e-62bf39cc6cec","Type":"ContainerStarted","Data":"ea22cb1561b475cde946a9377fca34acdf85150ebcfde4b6c61ed69448eb6817"} Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.321167 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"e4118384-38ad-465d-a81e-62bf39cc6cec","Type":"ContainerStarted","Data":"4083b91cbcb0193ddcd036352fdcb681fe5bd1d1da64f94b6c53fe5b858fb9f1"} Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.322433 4918 generic.go:334] "Generic (PLEG): container finished" podID="0f77a167-549a-441f-b185-977ccb2195ab" containerID="704b834d69a79dd7b2c6a1926510ceaca88d7bbf10e95f1b61e7a4c2f69b7589" exitCode=0 Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.322477 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-da83-account-create-update-xc7wt" event={"ID":"0f77a167-549a-441f-b185-977ccb2195ab","Type":"ContainerDied","Data":"704b834d69a79dd7b2c6a1926510ceaca88d7bbf10e95f1b61e7a4c2f69b7589"} Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.323698 4918 generic.go:334] "Generic (PLEG): container finished" podID="fda7a707-ab32-455f-8d42-bd371c95e9d2" containerID="97a21f9941aa6f2460a365553ecdf95b72e8ad797cc1f1b82c7872361bdd136c" exitCode=0 Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.323741 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-g2mz7" event={"ID":"fda7a707-ab32-455f-8d42-bd371c95e9d2","Type":"ContainerDied","Data":"97a21f9941aa6f2460a365553ecdf95b72e8ad797cc1f1b82c7872361bdd136c"} Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.329146 4918 generic.go:334] "Generic (PLEG): container finished" podID="e9ab1e31-bea9-4b23-899c-0b818c121f65" containerID="32cdd41f81602138845c90016f85b37554af18208fd8596a57f201cfd38bddd8" exitCode=0 Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.329241 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e0e3-account-create-update-gpntj" event={"ID":"e9ab1e31-bea9-4b23-899c-0b818c121f65","Type":"ContainerDied","Data":"32cdd41f81602138845c90016f85b37554af18208fd8596a57f201cfd38bddd8"} Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.331512 4918 generic.go:334] "Generic (PLEG): container 
finished" podID="543581b9-3f29-4718-bf4b-a4eaa3fb4b39" containerID="0e116879497271fbf86f1c3d662d1b8e1ea280943fe7c54b7ecb303deb4589cc" exitCode=0 Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.331675 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5103-account-create-update-t64cp" event={"ID":"543581b9-3f29-4718-bf4b-a4eaa3fb4b39","Type":"ContainerDied","Data":"0e116879497271fbf86f1c3d662d1b8e1ea280943fe7c54b7ecb303deb4589cc"} Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.333664 4918 generic.go:334] "Generic (PLEG): container finished" podID="86c8c8a7-6bd3-4169-88ac-9a5838c526c2" containerID="692d053ee950319107e100f82a93633073c285df427d10ad3d062c7e8cb668b6" exitCode=0 Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.333826 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ea0b-account-create-update-l66jx" event={"ID":"86c8c8a7-6bd3-4169-88ac-9a5838c526c2","Type":"ContainerDied","Data":"692d053ee950319107e100f82a93633073c285df427d10ad3d062c7e8cb668b6"} Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.665899 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-5103-account-create-update-t64cp" Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.788093 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/543581b9-3f29-4718-bf4b-a4eaa3fb4b39-operator-scripts\") pod \"543581b9-3f29-4718-bf4b-a4eaa3fb4b39\" (UID: \"543581b9-3f29-4718-bf4b-a4eaa3fb4b39\") " Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.788253 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zj9k5\" (UniqueName: \"kubernetes.io/projected/543581b9-3f29-4718-bf4b-a4eaa3fb4b39-kube-api-access-zj9k5\") pod \"543581b9-3f29-4718-bf4b-a4eaa3fb4b39\" (UID: \"543581b9-3f29-4718-bf4b-a4eaa3fb4b39\") " Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.789462 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/543581b9-3f29-4718-bf4b-a4eaa3fb4b39-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "543581b9-3f29-4718-bf4b-a4eaa3fb4b39" (UID: "543581b9-3f29-4718-bf4b-a4eaa3fb4b39"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.790285 4918 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/543581b9-3f29-4718-bf4b-a4eaa3fb4b39-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.804974 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/543581b9-3f29-4718-bf4b-a4eaa3fb4b39-kube-api-access-zj9k5" (OuterVolumeSpecName: "kube-api-access-zj9k5") pod "543581b9-3f29-4718-bf4b-a4eaa3fb4b39" (UID: "543581b9-3f29-4718-bf4b-a4eaa3fb4b39"). InnerVolumeSpecName "kube-api-access-zj9k5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.892127 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zj9k5\" (UniqueName: \"kubernetes.io/projected/543581b9-3f29-4718-bf4b-a4eaa3fb4b39-kube-api-access-zj9k5\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.914824 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-e0e3-account-create-update-gpntj" Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.924973 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-zk5wz" Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.993458 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9f94\" (UniqueName: \"kubernetes.io/projected/e9ab1e31-bea9-4b23-899c-0b818c121f65-kube-api-access-z9f94\") pod \"e9ab1e31-bea9-4b23-899c-0b818c121f65\" (UID: \"e9ab1e31-bea9-4b23-899c-0b818c121f65\") " Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.993632 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ab1e31-bea9-4b23-899c-0b818c121f65-operator-scripts\") pod \"e9ab1e31-bea9-4b23-899c-0b818c121f65\" (UID: \"e9ab1e31-bea9-4b23-899c-0b818c121f65\") " Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.994686 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9ab1e31-bea9-4b23-899c-0b818c121f65-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e9ab1e31-bea9-4b23-899c-0b818c121f65" (UID: "e9ab1e31-bea9-4b23-899c-0b818c121f65"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:44 crc kubenswrapper[4918]: I0319 16:59:44.998795 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9ab1e31-bea9-4b23-899c-0b818c121f65-kube-api-access-z9f94" (OuterVolumeSpecName: "kube-api-access-z9f94") pod "e9ab1e31-bea9-4b23-899c-0b818c121f65" (UID: "e9ab1e31-bea9-4b23-899c-0b818c121f65"). InnerVolumeSpecName "kube-api-access-z9f94". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:59:45 crc kubenswrapper[4918]: I0319 16:59:45.095476 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25tk7\" (UniqueName: \"kubernetes.io/projected/ad8cc411-f838-439a-9993-e53b431dcd28-kube-api-access-25tk7\") pod \"ad8cc411-f838-439a-9993-e53b431dcd28\" (UID: \"ad8cc411-f838-439a-9993-e53b431dcd28\") " Mar 19 16:59:45 crc kubenswrapper[4918]: I0319 16:59:45.095698 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad8cc411-f838-439a-9993-e53b431dcd28-operator-scripts\") pod \"ad8cc411-f838-439a-9993-e53b431dcd28\" (UID: \"ad8cc411-f838-439a-9993-e53b431dcd28\") " Mar 19 16:59:45 crc kubenswrapper[4918]: I0319 16:59:45.096124 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9f94\" (UniqueName: \"kubernetes.io/projected/e9ab1e31-bea9-4b23-899c-0b818c121f65-kube-api-access-z9f94\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:45 crc kubenswrapper[4918]: I0319 16:59:45.096146 4918 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ab1e31-bea9-4b23-899c-0b818c121f65-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:45 crc kubenswrapper[4918]: I0319 16:59:45.096644 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/ad8cc411-f838-439a-9993-e53b431dcd28-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ad8cc411-f838-439a-9993-e53b431dcd28" (UID: "ad8cc411-f838-439a-9993-e53b431dcd28"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:45 crc kubenswrapper[4918]: I0319 16:59:45.100046 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad8cc411-f838-439a-9993-e53b431dcd28-kube-api-access-25tk7" (OuterVolumeSpecName: "kube-api-access-25tk7") pod "ad8cc411-f838-439a-9993-e53b431dcd28" (UID: "ad8cc411-f838-439a-9993-e53b431dcd28"). InnerVolumeSpecName "kube-api-access-25tk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:59:45 crc kubenswrapper[4918]: I0319 16:59:45.197800 4918 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad8cc411-f838-439a-9993-e53b431dcd28-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:45 crc kubenswrapper[4918]: I0319 16:59:45.197847 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25tk7\" (UniqueName: \"kubernetes.io/projected/ad8cc411-f838-439a-9993-e53b431dcd28-kube-api-access-25tk7\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:45 crc kubenswrapper[4918]: I0319 16:59:45.345873 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-zk5wz" event={"ID":"ad8cc411-f838-439a-9993-e53b431dcd28","Type":"ContainerDied","Data":"39c19c099c7c62c057255e1d1688b4a4bdca921feac48a75c9b195f8fa87f7be"} Mar 19 16:59:45 crc kubenswrapper[4918]: I0319 16:59:45.345923 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39c19c099c7c62c057255e1d1688b4a4bdca921feac48a75c9b195f8fa87f7be" Mar 19 16:59:45 crc kubenswrapper[4918]: I0319 16:59:45.345989 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-zk5wz" Mar 19 16:59:45 crc kubenswrapper[4918]: I0319 16:59:45.367564 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e0e3-account-create-update-gpntj" event={"ID":"e9ab1e31-bea9-4b23-899c-0b818c121f65","Type":"ContainerDied","Data":"ca30aff1a4c1d8968175cff119c4e34b76ce9f723f7e428142b02bd619359277"} Mar 19 16:59:45 crc kubenswrapper[4918]: I0319 16:59:45.367601 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-e0e3-account-create-update-gpntj" Mar 19 16:59:45 crc kubenswrapper[4918]: I0319 16:59:45.367610 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca30aff1a4c1d8968175cff119c4e34b76ce9f723f7e428142b02bd619359277" Mar 19 16:59:45 crc kubenswrapper[4918]: I0319 16:59:45.382098 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5103-account-create-update-t64cp" Mar 19 16:59:45 crc kubenswrapper[4918]: I0319 16:59:45.382311 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5103-account-create-update-t64cp" event={"ID":"543581b9-3f29-4718-bf4b-a4eaa3fb4b39","Type":"ContainerDied","Data":"5dcab61106bbcf7f888f8ccac38e37043a8ebe79a705cc42cfc1a1de707d5c6d"} Mar 19 16:59:45 crc kubenswrapper[4918]: I0319 16:59:45.382357 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dcab61106bbcf7f888f8ccac38e37043a8ebe79a705cc42cfc1a1de707d5c6d" Mar 19 16:59:47 crc kubenswrapper[4918]: I0319 16:59:47.401187 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ce548ee3-59a9-46f9-8b00-06d380b17566","Type":"ContainerStarted","Data":"3d8466279f6c36b8fa743cf0f4679f277cbbb3944df440719ca49e6ec95da74e"} Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.241181 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-54nhz" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.260015 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6x9dg" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.284343 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ea0b-account-create-update-l66jx" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.356108 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfjbw\" (UniqueName: \"kubernetes.io/projected/dadbc2dd-325c-4390-9ea6-bc827cec049d-kube-api-access-gfjbw\") pod \"dadbc2dd-325c-4390-9ea6-bc827cec049d\" (UID: \"dadbc2dd-325c-4390-9ea6-bc827cec049d\") " Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.356247 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c909d8df-c118-4ff6-8c06-c0d3f71be4cf-operator-scripts\") pod \"c909d8df-c118-4ff6-8c06-c0d3f71be4cf\" (UID: \"c909d8df-c118-4ff6-8c06-c0d3f71be4cf\") " Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.357173 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c909d8df-c118-4ff6-8c06-c0d3f71be4cf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c909d8df-c118-4ff6-8c06-c0d3f71be4cf" (UID: "c909d8df-c118-4ff6-8c06-c0d3f71be4cf"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.357244 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86c8c8a7-6bd3-4169-88ac-9a5838c526c2-operator-scripts\") pod \"86c8c8a7-6bd3-4169-88ac-9a5838c526c2\" (UID: \"86c8c8a7-6bd3-4169-88ac-9a5838c526c2\") " Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.358358 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86c8c8a7-6bd3-4169-88ac-9a5838c526c2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "86c8c8a7-6bd3-4169-88ac-9a5838c526c2" (UID: "86c8c8a7-6bd3-4169-88ac-9a5838c526c2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.358418 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dadbc2dd-325c-4390-9ea6-bc827cec049d-operator-scripts\") pod \"dadbc2dd-325c-4390-9ea6-bc827cec049d\" (UID: \"dadbc2dd-325c-4390-9ea6-bc827cec049d\") " Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.358448 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szz8f\" (UniqueName: \"kubernetes.io/projected/c909d8df-c118-4ff6-8c06-c0d3f71be4cf-kube-api-access-szz8f\") pod \"c909d8df-c118-4ff6-8c06-c0d3f71be4cf\" (UID: \"c909d8df-c118-4ff6-8c06-c0d3f71be4cf\") " Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.359092 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dadbc2dd-325c-4390-9ea6-bc827cec049d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dadbc2dd-325c-4390-9ea6-bc827cec049d" (UID: "dadbc2dd-325c-4390-9ea6-bc827cec049d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.359190 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxqzc\" (UniqueName: \"kubernetes.io/projected/86c8c8a7-6bd3-4169-88ac-9a5838c526c2-kube-api-access-jxqzc\") pod \"86c8c8a7-6bd3-4169-88ac-9a5838c526c2\" (UID: \"86c8c8a7-6bd3-4169-88ac-9a5838c526c2\") " Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.359913 4918 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c909d8df-c118-4ff6-8c06-c0d3f71be4cf-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.359933 4918 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86c8c8a7-6bd3-4169-88ac-9a5838c526c2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.359942 4918 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dadbc2dd-325c-4390-9ea6-bc827cec049d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.362898 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86c8c8a7-6bd3-4169-88ac-9a5838c526c2-kube-api-access-jxqzc" (OuterVolumeSpecName: "kube-api-access-jxqzc") pod "86c8c8a7-6bd3-4169-88ac-9a5838c526c2" (UID: "86c8c8a7-6bd3-4169-88ac-9a5838c526c2"). InnerVolumeSpecName "kube-api-access-jxqzc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.363303 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dadbc2dd-325c-4390-9ea6-bc827cec049d-kube-api-access-gfjbw" (OuterVolumeSpecName: "kube-api-access-gfjbw") pod "dadbc2dd-325c-4390-9ea6-bc827cec049d" (UID: "dadbc2dd-325c-4390-9ea6-bc827cec049d"). InnerVolumeSpecName "kube-api-access-gfjbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.363423 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c909d8df-c118-4ff6-8c06-c0d3f71be4cf-kube-api-access-szz8f" (OuterVolumeSpecName: "kube-api-access-szz8f") pod "c909d8df-c118-4ff6-8c06-c0d3f71be4cf" (UID: "c909d8df-c118-4ff6-8c06-c0d3f71be4cf"). InnerVolumeSpecName "kube-api-access-szz8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.415795 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ea0b-account-create-update-l66jx" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.415806 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ea0b-account-create-update-l66jx" event={"ID":"86c8c8a7-6bd3-4169-88ac-9a5838c526c2","Type":"ContainerDied","Data":"2f0afe17fd38271fa7312c6eef9941d197b86c37e35352dab96baf772912cee6"} Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.415852 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f0afe17fd38271fa7312c6eef9941d197b86c37e35352dab96baf772912cee6" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.418653 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-54nhz" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.418655 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-54nhz" event={"ID":"dadbc2dd-325c-4390-9ea6-bc827cec049d","Type":"ContainerDied","Data":"bba8e74590413a891c4037d64a1af79acf87244e8504184a91b5a207fae19bdb"} Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.418766 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bba8e74590413a891c4037d64a1af79acf87244e8504184a91b5a207fae19bdb" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.420628 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6x9dg" event={"ID":"c909d8df-c118-4ff6-8c06-c0d3f71be4cf","Type":"ContainerDied","Data":"6ec62d0e2d6d439e756d5a69ce44602d655fb2f6e15cc14de32c6aead61b5814"} Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.420661 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ec62d0e2d6d439e756d5a69ce44602d655fb2f6e15cc14de32c6aead61b5814" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.420694 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-6x9dg" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.462114 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szz8f\" (UniqueName: \"kubernetes.io/projected/c909d8df-c118-4ff6-8c06-c0d3f71be4cf-kube-api-access-szz8f\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.462146 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxqzc\" (UniqueName: \"kubernetes.io/projected/86c8c8a7-6bd3-4169-88ac-9a5838c526c2-kube-api-access-jxqzc\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.462174 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfjbw\" (UniqueName: \"kubernetes.io/projected/dadbc2dd-325c-4390-9ea6-bc827cec049d-kube-api-access-gfjbw\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.647562 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-g2mz7" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.656319 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-da83-account-create-update-xc7wt" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.666395 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-v6c2z" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.767563 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fda7a707-ab32-455f-8d42-bd371c95e9d2-operator-scripts\") pod \"fda7a707-ab32-455f-8d42-bd371c95e9d2\" (UID: \"fda7a707-ab32-455f-8d42-bd371c95e9d2\") " Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.767868 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc559952-1f04-4a21-8415-c9c613c5b4d4-config-data\") pod \"bc559952-1f04-4a21-8415-c9c613c5b4d4\" (UID: \"bc559952-1f04-4a21-8415-c9c613c5b4d4\") " Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.767922 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7fvg\" (UniqueName: \"kubernetes.io/projected/0f77a167-549a-441f-b185-977ccb2195ab-kube-api-access-f7fvg\") pod \"0f77a167-549a-441f-b185-977ccb2195ab\" (UID: \"0f77a167-549a-441f-b185-977ccb2195ab\") " Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.767960 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpg2m\" (UniqueName: \"kubernetes.io/projected/bc559952-1f04-4a21-8415-c9c613c5b4d4-kube-api-access-rpg2m\") pod \"bc559952-1f04-4a21-8415-c9c613c5b4d4\" (UID: \"bc559952-1f04-4a21-8415-c9c613c5b4d4\") " Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.767984 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f77a167-549a-441f-b185-977ccb2195ab-operator-scripts\") pod \"0f77a167-549a-441f-b185-977ccb2195ab\" (UID: \"0f77a167-549a-441f-b185-977ccb2195ab\") " Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.768034 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-lbd67\" (UniqueName: \"kubernetes.io/projected/fda7a707-ab32-455f-8d42-bd371c95e9d2-kube-api-access-lbd67\") pod \"fda7a707-ab32-455f-8d42-bd371c95e9d2\" (UID: \"fda7a707-ab32-455f-8d42-bd371c95e9d2\") " Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.768106 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bc559952-1f04-4a21-8415-c9c613c5b4d4-db-sync-config-data\") pod \"bc559952-1f04-4a21-8415-c9c613c5b4d4\" (UID: \"bc559952-1f04-4a21-8415-c9c613c5b4d4\") " Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.768195 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc559952-1f04-4a21-8415-c9c613c5b4d4-combined-ca-bundle\") pod \"bc559952-1f04-4a21-8415-c9c613c5b4d4\" (UID: \"bc559952-1f04-4a21-8415-c9c613c5b4d4\") " Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.768418 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda7a707-ab32-455f-8d42-bd371c95e9d2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fda7a707-ab32-455f-8d42-bd371c95e9d2" (UID: "fda7a707-ab32-455f-8d42-bd371c95e9d2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.768746 4918 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fda7a707-ab32-455f-8d42-bd371c95e9d2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.772172 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda7a707-ab32-455f-8d42-bd371c95e9d2-kube-api-access-lbd67" (OuterVolumeSpecName: "kube-api-access-lbd67") pod "fda7a707-ab32-455f-8d42-bd371c95e9d2" (UID: "fda7a707-ab32-455f-8d42-bd371c95e9d2"). InnerVolumeSpecName "kube-api-access-lbd67". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.772233 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f77a167-549a-441f-b185-977ccb2195ab-kube-api-access-f7fvg" (OuterVolumeSpecName: "kube-api-access-f7fvg") pod "0f77a167-549a-441f-b185-977ccb2195ab" (UID: "0f77a167-549a-441f-b185-977ccb2195ab"). InnerVolumeSpecName "kube-api-access-f7fvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.773039 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f77a167-549a-441f-b185-977ccb2195ab-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0f77a167-549a-441f-b185-977ccb2195ab" (UID: "0f77a167-549a-441f-b185-977ccb2195ab"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.773690 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc559952-1f04-4a21-8415-c9c613c5b4d4-kube-api-access-rpg2m" (OuterVolumeSpecName: "kube-api-access-rpg2m") pod "bc559952-1f04-4a21-8415-c9c613c5b4d4" (UID: "bc559952-1f04-4a21-8415-c9c613c5b4d4"). InnerVolumeSpecName "kube-api-access-rpg2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.775334 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc559952-1f04-4a21-8415-c9c613c5b4d4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "bc559952-1f04-4a21-8415-c9c613c5b4d4" (UID: "bc559952-1f04-4a21-8415-c9c613c5b4d4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.803298 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc559952-1f04-4a21-8415-c9c613c5b4d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc559952-1f04-4a21-8415-c9c613c5b4d4" (UID: "bc559952-1f04-4a21-8415-c9c613c5b4d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.837431 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc559952-1f04-4a21-8415-c9c613c5b4d4-config-data" (OuterVolumeSpecName: "config-data") pod "bc559952-1f04-4a21-8415-c9c613c5b4d4" (UID: "bc559952-1f04-4a21-8415-c9c613c5b4d4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.870765 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpg2m\" (UniqueName: \"kubernetes.io/projected/bc559952-1f04-4a21-8415-c9c613c5b4d4-kube-api-access-rpg2m\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.870791 4918 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f77a167-549a-441f-b185-977ccb2195ab-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.870800 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbd67\" (UniqueName: \"kubernetes.io/projected/fda7a707-ab32-455f-8d42-bd371c95e9d2-kube-api-access-lbd67\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.870808 4918 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bc559952-1f04-4a21-8415-c9c613c5b4d4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.870816 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc559952-1f04-4a21-8415-c9c613c5b4d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.870824 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc559952-1f04-4a21-8415-c9c613c5b4d4-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:48 crc kubenswrapper[4918]: I0319 16:59:48.870832 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7fvg\" (UniqueName: \"kubernetes.io/projected/0f77a167-549a-441f-b185-977ccb2195ab-kube-api-access-f7fvg\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:49 crc 
kubenswrapper[4918]: I0319 16:59:49.453815 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-da83-account-create-update-xc7wt" event={"ID":"0f77a167-549a-441f-b185-977ccb2195ab","Type":"ContainerDied","Data":"a6500f832f9bf3404482ba66a81f1cefaccc5af3f1df18e484d871a0b393b8dd"} Mar 19 16:59:49 crc kubenswrapper[4918]: I0319 16:59:49.454103 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6500f832f9bf3404482ba66a81f1cefaccc5af3f1df18e484d871a0b393b8dd" Mar 19 16:59:49 crc kubenswrapper[4918]: I0319 16:59:49.454157 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-da83-account-create-update-xc7wt" Mar 19 16:59:49 crc kubenswrapper[4918]: I0319 16:59:49.457998 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ce548ee3-59a9-46f9-8b00-06d380b17566","Type":"ContainerStarted","Data":"d0161de22eb962797d62ea84d25a50da5e904f0f2a67e340e95e89359150c839"} Mar 19 16:59:49 crc kubenswrapper[4918]: I0319 16:59:49.460701 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-g2mz7" event={"ID":"fda7a707-ab32-455f-8d42-bd371c95e9d2","Type":"ContainerDied","Data":"e52cb3bd47696542068deac6ce92eb03b0a7fdf3510aeed027cc08511583b024"} Mar 19 16:59:49 crc kubenswrapper[4918]: I0319 16:59:49.460729 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e52cb3bd47696542068deac6ce92eb03b0a7fdf3510aeed027cc08511583b024" Mar 19 16:59:49 crc kubenswrapper[4918]: I0319 16:59:49.460792 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-create-g2mz7" Mar 19 16:59:49 crc kubenswrapper[4918]: I0319 16:59:49.470442 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-v6c2z" event={"ID":"bc559952-1f04-4a21-8415-c9c613c5b4d4","Type":"ContainerDied","Data":"589062f570b9ecda37eae83a531e5b6ddaa67bb4834946beb92a92f8875f15c8"} Mar 19 16:59:49 crc kubenswrapper[4918]: I0319 16:59:49.470484 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="589062f570b9ecda37eae83a531e5b6ddaa67bb4834946beb92a92f8875f15c8" Mar 19 16:59:49 crc kubenswrapper[4918]: I0319 16:59:49.470489 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-v6c2z" Mar 19 16:59:49 crc kubenswrapper[4918]: I0319 16:59:49.475290 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e4118384-38ad-465d-a81e-62bf39cc6cec","Type":"ContainerStarted","Data":"d6c2bf351ebcf0e5651aee236df17edacc265ce2bca4daf1df8d169d382930a3"} Mar 19 16:59:49 crc kubenswrapper[4918]: I0319 16:59:49.475332 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e4118384-38ad-465d-a81e-62bf39cc6cec","Type":"ContainerStarted","Data":"8b3c3f6edd60e18bb78b4b4d06b37b840191bbf2b54cbeb2c5e63031cd079865"} Mar 19 16:59:49 crc kubenswrapper[4918]: I0319 16:59:49.477511 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6cdsd" event={"ID":"70226cc0-a6ae-4454-8e20-f85b06e2ee2d","Type":"ContainerStarted","Data":"4a68e2045d7ffc6b18fe5c2a035a1e8fb46086735192b921ce34ba35bbd807ed"} Mar 19 16:59:49 crc kubenswrapper[4918]: I0319 16:59:49.518305 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=16.518283473 podStartE2EDuration="16.518283473s" podCreationTimestamp="2026-03-19 16:59:33 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:59:49.512431742 +0000 UTC m=+1201.634630990" watchObservedRunningTime="2026-03-19 16:59:49.518283473 +0000 UTC m=+1201.640482731" Mar 19 16:59:49 crc kubenswrapper[4918]: I0319 16:59:49.557729 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-6cdsd" podStartSLOduration=2.757158145 podStartE2EDuration="9.557707038s" podCreationTimestamp="2026-03-19 16:59:40 +0000 UTC" firstStartedPulling="2026-03-19 16:59:41.875500437 +0000 UTC m=+1193.997699685" lastFinishedPulling="2026-03-19 16:59:48.67604932 +0000 UTC m=+1200.798248578" observedRunningTime="2026-03-19 16:59:49.537761969 +0000 UTC m=+1201.659961207" watchObservedRunningTime="2026-03-19 16:59:49.557707038 +0000 UTC m=+1201.679906286" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.124191 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-scqmt"] Mar 19 16:59:50 crc kubenswrapper[4918]: E0319 16:59:50.124640 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c909d8df-c118-4ff6-8c06-c0d3f71be4cf" containerName="mariadb-database-create" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.124656 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="c909d8df-c118-4ff6-8c06-c0d3f71be4cf" containerName="mariadb-database-create" Mar 19 16:59:50 crc kubenswrapper[4918]: E0319 16:59:50.124678 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad8cc411-f838-439a-9993-e53b431dcd28" containerName="mariadb-database-create" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.124687 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad8cc411-f838-439a-9993-e53b431dcd28" containerName="mariadb-database-create" Mar 19 16:59:50 crc kubenswrapper[4918]: E0319 16:59:50.124702 4918 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="e9ab1e31-bea9-4b23-899c-0b818c121f65" containerName="mariadb-account-create-update" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.124710 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ab1e31-bea9-4b23-899c-0b818c121f65" containerName="mariadb-account-create-update" Mar 19 16:59:50 crc kubenswrapper[4918]: E0319 16:59:50.124719 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="543581b9-3f29-4718-bf4b-a4eaa3fb4b39" containerName="mariadb-account-create-update" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.124726 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="543581b9-3f29-4718-bf4b-a4eaa3fb4b39" containerName="mariadb-account-create-update" Mar 19 16:59:50 crc kubenswrapper[4918]: E0319 16:59:50.124741 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f77a167-549a-441f-b185-977ccb2195ab" containerName="mariadb-account-create-update" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.124747 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f77a167-549a-441f-b185-977ccb2195ab" containerName="mariadb-account-create-update" Mar 19 16:59:50 crc kubenswrapper[4918]: E0319 16:59:50.124762 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c8c8a7-6bd3-4169-88ac-9a5838c526c2" containerName="mariadb-account-create-update" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.124770 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c8c8a7-6bd3-4169-88ac-9a5838c526c2" containerName="mariadb-account-create-update" Mar 19 16:59:50 crc kubenswrapper[4918]: E0319 16:59:50.124780 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dadbc2dd-325c-4390-9ea6-bc827cec049d" containerName="mariadb-database-create" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.124787 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="dadbc2dd-325c-4390-9ea6-bc827cec049d" containerName="mariadb-database-create" 
Mar 19 16:59:50 crc kubenswrapper[4918]: E0319 16:59:50.124812 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fda7a707-ab32-455f-8d42-bd371c95e9d2" containerName="mariadb-database-create" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.124820 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="fda7a707-ab32-455f-8d42-bd371c95e9d2" containerName="mariadb-database-create" Mar 19 16:59:50 crc kubenswrapper[4918]: E0319 16:59:50.124836 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc559952-1f04-4a21-8415-c9c613c5b4d4" containerName="glance-db-sync" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.124843 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc559952-1f04-4a21-8415-c9c613c5b4d4" containerName="glance-db-sync" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.125090 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f77a167-549a-441f-b185-977ccb2195ab" containerName="mariadb-account-create-update" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.125106 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="fda7a707-ab32-455f-8d42-bd371c95e9d2" containerName="mariadb-database-create" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.125130 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc559952-1f04-4a21-8415-c9c613c5b4d4" containerName="glance-db-sync" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.125142 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="dadbc2dd-325c-4390-9ea6-bc827cec049d" containerName="mariadb-database-create" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.125154 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad8cc411-f838-439a-9993-e53b431dcd28" containerName="mariadb-database-create" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.125174 4918 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="86c8c8a7-6bd3-4169-88ac-9a5838c526c2" containerName="mariadb-account-create-update" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.125190 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="c909d8df-c118-4ff6-8c06-c0d3f71be4cf" containerName="mariadb-database-create" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.125199 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="543581b9-3f29-4718-bf4b-a4eaa3fb4b39" containerName="mariadb-account-create-update" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.125214 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9ab1e31-bea9-4b23-899c-0b818c121f65" containerName="mariadb-account-create-update" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.126322 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-scqmt" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.143038 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-scqmt"] Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.216561 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gckl5\" (UniqueName: \"kubernetes.io/projected/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-kube-api-access-gckl5\") pod \"dnsmasq-dns-5b946c75cc-scqmt\" (UID: \"9b310bb0-a7b6-422f-824c-5ce7cf1e5630\") " pod="openstack/dnsmasq-dns-5b946c75cc-scqmt" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.216655 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-scqmt\" (UID: \"9b310bb0-a7b6-422f-824c-5ce7cf1e5630\") " pod="openstack/dnsmasq-dns-5b946c75cc-scqmt" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.216686 4918 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-scqmt\" (UID: \"9b310bb0-a7b6-422f-824c-5ce7cf1e5630\") " pod="openstack/dnsmasq-dns-5b946c75cc-scqmt" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.216729 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-scqmt\" (UID: \"9b310bb0-a7b6-422f-824c-5ce7cf1e5630\") " pod="openstack/dnsmasq-dns-5b946c75cc-scqmt" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.216798 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-config\") pod \"dnsmasq-dns-5b946c75cc-scqmt\" (UID: \"9b310bb0-a7b6-422f-824c-5ce7cf1e5630\") " pod="openstack/dnsmasq-dns-5b946c75cc-scqmt" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.318741 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gckl5\" (UniqueName: \"kubernetes.io/projected/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-kube-api-access-gckl5\") pod \"dnsmasq-dns-5b946c75cc-scqmt\" (UID: \"9b310bb0-a7b6-422f-824c-5ce7cf1e5630\") " pod="openstack/dnsmasq-dns-5b946c75cc-scqmt" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.318838 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-scqmt\" (UID: \"9b310bb0-a7b6-422f-824c-5ce7cf1e5630\") " pod="openstack/dnsmasq-dns-5b946c75cc-scqmt" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.318866 4918 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-scqmt\" (UID: \"9b310bb0-a7b6-422f-824c-5ce7cf1e5630\") " pod="openstack/dnsmasq-dns-5b946c75cc-scqmt" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.318890 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-scqmt\" (UID: \"9b310bb0-a7b6-422f-824c-5ce7cf1e5630\") " pod="openstack/dnsmasq-dns-5b946c75cc-scqmt" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.318956 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-config\") pod \"dnsmasq-dns-5b946c75cc-scqmt\" (UID: \"9b310bb0-a7b6-422f-824c-5ce7cf1e5630\") " pod="openstack/dnsmasq-dns-5b946c75cc-scqmt" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.320053 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-scqmt\" (UID: \"9b310bb0-a7b6-422f-824c-5ce7cf1e5630\") " pod="openstack/dnsmasq-dns-5b946c75cc-scqmt" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.320190 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-scqmt\" (UID: \"9b310bb0-a7b6-422f-824c-5ce7cf1e5630\") " pod="openstack/dnsmasq-dns-5b946c75cc-scqmt" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.320213 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-config\") pod \"dnsmasq-dns-5b946c75cc-scqmt\" (UID: \"9b310bb0-a7b6-422f-824c-5ce7cf1e5630\") " pod="openstack/dnsmasq-dns-5b946c75cc-scqmt" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.320259 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-scqmt\" (UID: \"9b310bb0-a7b6-422f-824c-5ce7cf1e5630\") " pod="openstack/dnsmasq-dns-5b946c75cc-scqmt" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.341045 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gckl5\" (UniqueName: \"kubernetes.io/projected/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-kube-api-access-gckl5\") pod \"dnsmasq-dns-5b946c75cc-scqmt\" (UID: \"9b310bb0-a7b6-422f-824c-5ce7cf1e5630\") " pod="openstack/dnsmasq-dns-5b946c75cc-scqmt" Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.497837 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e4118384-38ad-465d-a81e-62bf39cc6cec","Type":"ContainerStarted","Data":"3c850a1f34673c173ae6bc7f859e4a0d77d92986d15a815769962bacae82505c"} Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.497883 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e4118384-38ad-465d-a81e-62bf39cc6cec","Type":"ContainerStarted","Data":"90386805f80e72400b0277696e19e11574b5f2f0736d641c145966e3230c77ce"} Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.497894 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e4118384-38ad-465d-a81e-62bf39cc6cec","Type":"ContainerStarted","Data":"2526f7643528b35220c559f0ee7b0582703f626d30a4391a255dc32eaf5f08d4"} Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.497903 4918 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/swift-storage-0" event={"ID":"e4118384-38ad-465d-a81e-62bf39cc6cec","Type":"ContainerStarted","Data":"e121c5f6e9805874277b253548235d6a4fd4ea86ba4a2555abaf4420c9b100a2"} Mar 19 16:59:50 crc kubenswrapper[4918]: I0319 16:59:50.530060 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-scqmt" Mar 19 16:59:51 crc kubenswrapper[4918]: I0319 16:59:51.001321 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-scqmt"] Mar 19 16:59:51 crc kubenswrapper[4918]: W0319 16:59:51.015062 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b310bb0_a7b6_422f_824c_5ce7cf1e5630.slice/crio-69839a077d08139399d74ec71e36290d0f45069ade39768f5a89f52b92c9d5cc WatchSource:0}: Error finding container 69839a077d08139399d74ec71e36290d0f45069ade39768f5a89f52b92c9d5cc: Status 404 returned error can't find the container with id 69839a077d08139399d74ec71e36290d0f45069ade39768f5a89f52b92c9d5cc Mar 19 16:59:51 crc kubenswrapper[4918]: I0319 16:59:51.508331 4918 generic.go:334] "Generic (PLEG): container finished" podID="9b310bb0-a7b6-422f-824c-5ce7cf1e5630" containerID="8c0e768adcc478fa8526d57ac1206aace0f0a38106e6a25ffa290204b56060c0" exitCode=0 Mar 19 16:59:51 crc kubenswrapper[4918]: I0319 16:59:51.508412 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-scqmt" event={"ID":"9b310bb0-a7b6-422f-824c-5ce7cf1e5630","Type":"ContainerDied","Data":"8c0e768adcc478fa8526d57ac1206aace0f0a38106e6a25ffa290204b56060c0"} Mar 19 16:59:51 crc kubenswrapper[4918]: I0319 16:59:51.508464 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-scqmt" event={"ID":"9b310bb0-a7b6-422f-824c-5ce7cf1e5630","Type":"ContainerStarted","Data":"69839a077d08139399d74ec71e36290d0f45069ade39768f5a89f52b92c9d5cc"} Mar 19 16:59:51 crc 
kubenswrapper[4918]: I0319 16:59:51.513880 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e4118384-38ad-465d-a81e-62bf39cc6cec","Type":"ContainerStarted","Data":"6c1eecf453f454644c2f9f23ef21ddbd53b82f168ea9f22d92196a7a1ccce300"} Mar 19 16:59:51 crc kubenswrapper[4918]: I0319 16:59:51.556474 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.821271786 podStartE2EDuration="48.556428943s" podCreationTimestamp="2026-03-19 16:59:03 +0000 UTC" firstStartedPulling="2026-03-19 16:59:36.971141146 +0000 UTC m=+1189.093340394" lastFinishedPulling="2026-03-19 16:59:48.706298303 +0000 UTC m=+1200.828497551" observedRunningTime="2026-03-19 16:59:51.547774395 +0000 UTC m=+1203.669973653" watchObservedRunningTime="2026-03-19 16:59:51.556428943 +0000 UTC m=+1203.678628201" Mar 19 16:59:51 crc kubenswrapper[4918]: I0319 16:59:51.994447 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-scqmt"] Mar 19 16:59:52 crc kubenswrapper[4918]: I0319 16:59:52.020107 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-2jnlk"] Mar 19 16:59:52 crc kubenswrapper[4918]: I0319 16:59:52.021787 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" Mar 19 16:59:52 crc kubenswrapper[4918]: I0319 16:59:52.027976 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 19 16:59:52 crc kubenswrapper[4918]: I0319 16:59:52.036802 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-2jnlk"] Mar 19 16:59:52 crc kubenswrapper[4918]: I0319 16:59:52.153418 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-2jnlk\" (UID: \"32139bd7-916e-4cfc-a6c3-4de222246896\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" Mar 19 16:59:52 crc kubenswrapper[4918]: I0319 16:59:52.153537 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-2jnlk\" (UID: \"32139bd7-916e-4cfc-a6c3-4de222246896\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" Mar 19 16:59:52 crc kubenswrapper[4918]: I0319 16:59:52.153618 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkc4j\" (UniqueName: \"kubernetes.io/projected/32139bd7-916e-4cfc-a6c3-4de222246896-kube-api-access-tkc4j\") pod \"dnsmasq-dns-74f6bcbc87-2jnlk\" (UID: \"32139bd7-916e-4cfc-a6c3-4de222246896\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" Mar 19 16:59:52 crc kubenswrapper[4918]: I0319 16:59:52.153653 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-config\") pod \"dnsmasq-dns-74f6bcbc87-2jnlk\" (UID: \"32139bd7-916e-4cfc-a6c3-4de222246896\") " 
pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" Mar 19 16:59:52 crc kubenswrapper[4918]: I0319 16:59:52.153710 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-2jnlk\" (UID: \"32139bd7-916e-4cfc-a6c3-4de222246896\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" Mar 19 16:59:52 crc kubenswrapper[4918]: I0319 16:59:52.153736 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-2jnlk\" (UID: \"32139bd7-916e-4cfc-a6c3-4de222246896\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" Mar 19 16:59:52 crc kubenswrapper[4918]: I0319 16:59:52.255905 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-2jnlk\" (UID: \"32139bd7-916e-4cfc-a6c3-4de222246896\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" Mar 19 16:59:52 crc kubenswrapper[4918]: I0319 16:59:52.255977 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkc4j\" (UniqueName: \"kubernetes.io/projected/32139bd7-916e-4cfc-a6c3-4de222246896-kube-api-access-tkc4j\") pod \"dnsmasq-dns-74f6bcbc87-2jnlk\" (UID: \"32139bd7-916e-4cfc-a6c3-4de222246896\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" Mar 19 16:59:52 crc kubenswrapper[4918]: I0319 16:59:52.256013 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-config\") pod \"dnsmasq-dns-74f6bcbc87-2jnlk\" (UID: \"32139bd7-916e-4cfc-a6c3-4de222246896\") " 
pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" Mar 19 16:59:52 crc kubenswrapper[4918]: I0319 16:59:52.256068 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-2jnlk\" (UID: \"32139bd7-916e-4cfc-a6c3-4de222246896\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" Mar 19 16:59:52 crc kubenswrapper[4918]: I0319 16:59:52.256095 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-2jnlk\" (UID: \"32139bd7-916e-4cfc-a6c3-4de222246896\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" Mar 19 16:59:52 crc kubenswrapper[4918]: I0319 16:59:52.256181 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-2jnlk\" (UID: \"32139bd7-916e-4cfc-a6c3-4de222246896\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" Mar 19 16:59:52 crc kubenswrapper[4918]: I0319 16:59:52.257087 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-2jnlk\" (UID: \"32139bd7-916e-4cfc-a6c3-4de222246896\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" Mar 19 16:59:52 crc kubenswrapper[4918]: I0319 16:59:52.257185 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-config\") pod \"dnsmasq-dns-74f6bcbc87-2jnlk\" (UID: \"32139bd7-916e-4cfc-a6c3-4de222246896\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" Mar 19 16:59:52 crc kubenswrapper[4918]: 
I0319 16:59:52.257201 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-2jnlk\" (UID: \"32139bd7-916e-4cfc-a6c3-4de222246896\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" Mar 19 16:59:52 crc kubenswrapper[4918]: I0319 16:59:52.257458 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-2jnlk\" (UID: \"32139bd7-916e-4cfc-a6c3-4de222246896\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" Mar 19 16:59:52 crc kubenswrapper[4918]: I0319 16:59:52.257474 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-2jnlk\" (UID: \"32139bd7-916e-4cfc-a6c3-4de222246896\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" Mar 19 16:59:52 crc kubenswrapper[4918]: I0319 16:59:52.288597 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkc4j\" (UniqueName: \"kubernetes.io/projected/32139bd7-916e-4cfc-a6c3-4de222246896-kube-api-access-tkc4j\") pod \"dnsmasq-dns-74f6bcbc87-2jnlk\" (UID: \"32139bd7-916e-4cfc-a6c3-4de222246896\") " pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" Mar 19 16:59:52 crc kubenswrapper[4918]: I0319 16:59:52.340735 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" Mar 19 16:59:52 crc kubenswrapper[4918]: W0319 16:59:52.858691 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32139bd7_916e_4cfc_a6c3_4de222246896.slice/crio-6eb478ba0798a5860798563940ba10864cea0a44cbe4f13f8999a9f25a614712 WatchSource:0}: Error finding container 6eb478ba0798a5860798563940ba10864cea0a44cbe4f13f8999a9f25a614712: Status 404 returned error can't find the container with id 6eb478ba0798a5860798563940ba10864cea0a44cbe4f13f8999a9f25a614712 Mar 19 16:59:52 crc kubenswrapper[4918]: I0319 16:59:52.859096 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-2jnlk"] Mar 19 16:59:53 crc kubenswrapper[4918]: I0319 16:59:53.445776 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 19 16:59:53 crc kubenswrapper[4918]: I0319 16:59:53.557635 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-scqmt" event={"ID":"9b310bb0-a7b6-422f-824c-5ce7cf1e5630","Type":"ContainerStarted","Data":"0094f6f7fd2e192cf551b51559d2d3aa64f2e1429fcc7af3712ed8b247375c61"} Mar 19 16:59:53 crc kubenswrapper[4918]: I0319 16:59:53.557817 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b946c75cc-scqmt" podUID="9b310bb0-a7b6-422f-824c-5ce7cf1e5630" containerName="dnsmasq-dns" containerID="cri-o://0094f6f7fd2e192cf551b51559d2d3aa64f2e1429fcc7af3712ed8b247375c61" gracePeriod=10 Mar 19 16:59:53 crc kubenswrapper[4918]: I0319 16:59:53.557876 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b946c75cc-scqmt" Mar 19 16:59:53 crc kubenswrapper[4918]: I0319 16:59:53.565035 4918 generic.go:334] "Generic (PLEG): container finished" podID="32139bd7-916e-4cfc-a6c3-4de222246896" 
containerID="79eefb244158bfb361cc096b4d0135b36dcf78c0c6d0888034ac8ce2b1d3ac98" exitCode=0 Mar 19 16:59:53 crc kubenswrapper[4918]: I0319 16:59:53.565132 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" event={"ID":"32139bd7-916e-4cfc-a6c3-4de222246896","Type":"ContainerDied","Data":"79eefb244158bfb361cc096b4d0135b36dcf78c0c6d0888034ac8ce2b1d3ac98"} Mar 19 16:59:53 crc kubenswrapper[4918]: I0319 16:59:53.565185 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" event={"ID":"32139bd7-916e-4cfc-a6c3-4de222246896","Type":"ContainerStarted","Data":"6eb478ba0798a5860798563940ba10864cea0a44cbe4f13f8999a9f25a614712"} Mar 19 16:59:53 crc kubenswrapper[4918]: I0319 16:59:53.568780 4918 generic.go:334] "Generic (PLEG): container finished" podID="70226cc0-a6ae-4454-8e20-f85b06e2ee2d" containerID="4a68e2045d7ffc6b18fe5c2a035a1e8fb46086735192b921ce34ba35bbd807ed" exitCode=0 Mar 19 16:59:53 crc kubenswrapper[4918]: I0319 16:59:53.568843 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6cdsd" event={"ID":"70226cc0-a6ae-4454-8e20-f85b06e2ee2d","Type":"ContainerDied","Data":"4a68e2045d7ffc6b18fe5c2a035a1e8fb46086735192b921ce34ba35bbd807ed"} Mar 19 16:59:53 crc kubenswrapper[4918]: I0319 16:59:53.601108 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b946c75cc-scqmt" podStartSLOduration=3.601057881 podStartE2EDuration="3.601057881s" podCreationTimestamp="2026-03-19 16:59:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:59:53.590767388 +0000 UTC m=+1205.712966636" watchObservedRunningTime="2026-03-19 16:59:53.601057881 +0000 UTC m=+1205.723257129" Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.031861 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-scqmt" Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.109083 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-ovsdbserver-sb\") pod \"9b310bb0-a7b6-422f-824c-5ce7cf1e5630\" (UID: \"9b310bb0-a7b6-422f-824c-5ce7cf1e5630\") " Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.109247 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-dns-svc\") pod \"9b310bb0-a7b6-422f-824c-5ce7cf1e5630\" (UID: \"9b310bb0-a7b6-422f-824c-5ce7cf1e5630\") " Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.109358 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gckl5\" (UniqueName: \"kubernetes.io/projected/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-kube-api-access-gckl5\") pod \"9b310bb0-a7b6-422f-824c-5ce7cf1e5630\" (UID: \"9b310bb0-a7b6-422f-824c-5ce7cf1e5630\") " Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.109504 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-config\") pod \"9b310bb0-a7b6-422f-824c-5ce7cf1e5630\" (UID: \"9b310bb0-a7b6-422f-824c-5ce7cf1e5630\") " Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.109628 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-ovsdbserver-nb\") pod \"9b310bb0-a7b6-422f-824c-5ce7cf1e5630\" (UID: \"9b310bb0-a7b6-422f-824c-5ce7cf1e5630\") " Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.115809 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-kube-api-access-gckl5" (OuterVolumeSpecName: "kube-api-access-gckl5") pod "9b310bb0-a7b6-422f-824c-5ce7cf1e5630" (UID: "9b310bb0-a7b6-422f-824c-5ce7cf1e5630"). InnerVolumeSpecName "kube-api-access-gckl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.172823 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-config" (OuterVolumeSpecName: "config") pod "9b310bb0-a7b6-422f-824c-5ce7cf1e5630" (UID: "9b310bb0-a7b6-422f-824c-5ce7cf1e5630"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.213782 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gckl5\" (UniqueName: \"kubernetes.io/projected/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-kube-api-access-gckl5\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.214542 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.214541 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9b310bb0-a7b6-422f-824c-5ce7cf1e5630" (UID: "9b310bb0-a7b6-422f-824c-5ce7cf1e5630"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.215866 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9b310bb0-a7b6-422f-824c-5ce7cf1e5630" (UID: "9b310bb0-a7b6-422f-824c-5ce7cf1e5630"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.253327 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9b310bb0-a7b6-422f-824c-5ce7cf1e5630" (UID: "9b310bb0-a7b6-422f-824c-5ce7cf1e5630"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.316209 4918 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.316244 4918 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.316254 4918 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b310bb0-a7b6-422f-824c-5ce7cf1e5630-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.577645 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" 
event={"ID":"32139bd7-916e-4cfc-a6c3-4de222246896","Type":"ContainerStarted","Data":"f6bb3323a03f9f99f47fe65c023f57cde0eea0868ef1731ea9cb23c370f9cfa8"} Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.578149 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.580130 4918 generic.go:334] "Generic (PLEG): container finished" podID="9b310bb0-a7b6-422f-824c-5ce7cf1e5630" containerID="0094f6f7fd2e192cf551b51559d2d3aa64f2e1429fcc7af3712ed8b247375c61" exitCode=0 Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.580178 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-scqmt" Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.580190 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-scqmt" event={"ID":"9b310bb0-a7b6-422f-824c-5ce7cf1e5630","Type":"ContainerDied","Data":"0094f6f7fd2e192cf551b51559d2d3aa64f2e1429fcc7af3712ed8b247375c61"} Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.580380 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-scqmt" event={"ID":"9b310bb0-a7b6-422f-824c-5ce7cf1e5630","Type":"ContainerDied","Data":"69839a077d08139399d74ec71e36290d0f45069ade39768f5a89f52b92c9d5cc"} Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.580406 4918 scope.go:117] "RemoveContainer" containerID="0094f6f7fd2e192cf551b51559d2d3aa64f2e1429fcc7af3712ed8b247375c61" Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.604925 4918 scope.go:117] "RemoveContainer" containerID="8c0e768adcc478fa8526d57ac1206aace0f0a38106e6a25ffa290204b56060c0" Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.612614 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" podStartSLOduration=3.612593842 
podStartE2EDuration="3.612593842s" podCreationTimestamp="2026-03-19 16:59:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:59:54.602964968 +0000 UTC m=+1206.725164216" watchObservedRunningTime="2026-03-19 16:59:54.612593842 +0000 UTC m=+1206.734793090" Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.634921 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-scqmt"] Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.636336 4918 scope.go:117] "RemoveContainer" containerID="0094f6f7fd2e192cf551b51559d2d3aa64f2e1429fcc7af3712ed8b247375c61" Mar 19 16:59:54 crc kubenswrapper[4918]: E0319 16:59:54.636962 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0094f6f7fd2e192cf551b51559d2d3aa64f2e1429fcc7af3712ed8b247375c61\": container with ID starting with 0094f6f7fd2e192cf551b51559d2d3aa64f2e1429fcc7af3712ed8b247375c61 not found: ID does not exist" containerID="0094f6f7fd2e192cf551b51559d2d3aa64f2e1429fcc7af3712ed8b247375c61" Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.637003 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0094f6f7fd2e192cf551b51559d2d3aa64f2e1429fcc7af3712ed8b247375c61"} err="failed to get container status \"0094f6f7fd2e192cf551b51559d2d3aa64f2e1429fcc7af3712ed8b247375c61\": rpc error: code = NotFound desc = could not find container \"0094f6f7fd2e192cf551b51559d2d3aa64f2e1429fcc7af3712ed8b247375c61\": container with ID starting with 0094f6f7fd2e192cf551b51559d2d3aa64f2e1429fcc7af3712ed8b247375c61 not found: ID does not exist" Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.637024 4918 scope.go:117] "RemoveContainer" containerID="8c0e768adcc478fa8526d57ac1206aace0f0a38106e6a25ffa290204b56060c0" Mar 19 16:59:54 crc kubenswrapper[4918]: E0319 16:59:54.637294 
4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c0e768adcc478fa8526d57ac1206aace0f0a38106e6a25ffa290204b56060c0\": container with ID starting with 8c0e768adcc478fa8526d57ac1206aace0f0a38106e6a25ffa290204b56060c0 not found: ID does not exist" containerID="8c0e768adcc478fa8526d57ac1206aace0f0a38106e6a25ffa290204b56060c0" Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.637324 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c0e768adcc478fa8526d57ac1206aace0f0a38106e6a25ffa290204b56060c0"} err="failed to get container status \"8c0e768adcc478fa8526d57ac1206aace0f0a38106e6a25ffa290204b56060c0\": rpc error: code = NotFound desc = could not find container \"8c0e768adcc478fa8526d57ac1206aace0f0a38106e6a25ffa290204b56060c0\": container with ID starting with 8c0e768adcc478fa8526d57ac1206aace0f0a38106e6a25ffa290204b56060c0 not found: ID does not exist" Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.646419 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-scqmt"] Mar 19 16:59:54 crc kubenswrapper[4918]: I0319 16:59:54.994760 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-6cdsd" Mar 19 16:59:55 crc kubenswrapper[4918]: I0319 16:59:55.131193 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70226cc0-a6ae-4454-8e20-f85b06e2ee2d-combined-ca-bundle\") pod \"70226cc0-a6ae-4454-8e20-f85b06e2ee2d\" (UID: \"70226cc0-a6ae-4454-8e20-f85b06e2ee2d\") " Mar 19 16:59:55 crc kubenswrapper[4918]: I0319 16:59:55.131384 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr5q2\" (UniqueName: \"kubernetes.io/projected/70226cc0-a6ae-4454-8e20-f85b06e2ee2d-kube-api-access-vr5q2\") pod \"70226cc0-a6ae-4454-8e20-f85b06e2ee2d\" (UID: \"70226cc0-a6ae-4454-8e20-f85b06e2ee2d\") " Mar 19 16:59:55 crc kubenswrapper[4918]: I0319 16:59:55.131425 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70226cc0-a6ae-4454-8e20-f85b06e2ee2d-config-data\") pod \"70226cc0-a6ae-4454-8e20-f85b06e2ee2d\" (UID: \"70226cc0-a6ae-4454-8e20-f85b06e2ee2d\") " Mar 19 16:59:55 crc kubenswrapper[4918]: I0319 16:59:55.138131 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70226cc0-a6ae-4454-8e20-f85b06e2ee2d-kube-api-access-vr5q2" (OuterVolumeSpecName: "kube-api-access-vr5q2") pod "70226cc0-a6ae-4454-8e20-f85b06e2ee2d" (UID: "70226cc0-a6ae-4454-8e20-f85b06e2ee2d"). InnerVolumeSpecName "kube-api-access-vr5q2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:59:55 crc kubenswrapper[4918]: I0319 16:59:55.176963 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70226cc0-a6ae-4454-8e20-f85b06e2ee2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70226cc0-a6ae-4454-8e20-f85b06e2ee2d" (UID: "70226cc0-a6ae-4454-8e20-f85b06e2ee2d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:59:55 crc kubenswrapper[4918]: I0319 16:59:55.179378 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70226cc0-a6ae-4454-8e20-f85b06e2ee2d-config-data" (OuterVolumeSpecName: "config-data") pod "70226cc0-a6ae-4454-8e20-f85b06e2ee2d" (UID: "70226cc0-a6ae-4454-8e20-f85b06e2ee2d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 16:59:55 crc kubenswrapper[4918]: I0319 16:59:55.233265 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr5q2\" (UniqueName: \"kubernetes.io/projected/70226cc0-a6ae-4454-8e20-f85b06e2ee2d-kube-api-access-vr5q2\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:55 crc kubenswrapper[4918]: I0319 16:59:55.233292 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70226cc0-a6ae-4454-8e20-f85b06e2ee2d-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:55 crc kubenswrapper[4918]: I0319 16:59:55.233302 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70226cc0-a6ae-4454-8e20-f85b06e2ee2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:55 crc kubenswrapper[4918]: I0319 16:59:55.593476 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6cdsd" event={"ID":"70226cc0-a6ae-4454-8e20-f85b06e2ee2d","Type":"ContainerDied","Data":"a8443c565be899627ac5ae2704ec94f7f1fb8c571d209713c890c8e891921ee2"} Mar 19 16:59:55 crc kubenswrapper[4918]: I0319 16:59:55.593565 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8443c565be899627ac5ae2704ec94f7f1fb8c571d209713c890c8e891921ee2" Mar 19 16:59:55 crc kubenswrapper[4918]: I0319 16:59:55.593668 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-6cdsd" Mar 19 16:59:55 crc kubenswrapper[4918]: I0319 16:59:55.936637 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-4qpsq"] Mar 19 16:59:55 crc kubenswrapper[4918]: E0319 16:59:55.937035 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b310bb0-a7b6-422f-824c-5ce7cf1e5630" containerName="init" Mar 19 16:59:55 crc kubenswrapper[4918]: I0319 16:59:55.937050 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b310bb0-a7b6-422f-824c-5ce7cf1e5630" containerName="init" Mar 19 16:59:55 crc kubenswrapper[4918]: E0319 16:59:55.937062 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70226cc0-a6ae-4454-8e20-f85b06e2ee2d" containerName="keystone-db-sync" Mar 19 16:59:55 crc kubenswrapper[4918]: I0319 16:59:55.937069 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="70226cc0-a6ae-4454-8e20-f85b06e2ee2d" containerName="keystone-db-sync" Mar 19 16:59:55 crc kubenswrapper[4918]: E0319 16:59:55.937097 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b310bb0-a7b6-422f-824c-5ce7cf1e5630" containerName="dnsmasq-dns" Mar 19 16:59:55 crc kubenswrapper[4918]: I0319 16:59:55.937105 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b310bb0-a7b6-422f-824c-5ce7cf1e5630" containerName="dnsmasq-dns" Mar 19 16:59:55 crc kubenswrapper[4918]: I0319 16:59:55.937316 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b310bb0-a7b6-422f-824c-5ce7cf1e5630" containerName="dnsmasq-dns" Mar 19 16:59:55 crc kubenswrapper[4918]: I0319 16:59:55.937348 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="70226cc0-a6ae-4454-8e20-f85b06e2ee2d" containerName="keystone-db-sync" Mar 19 16:59:55 crc kubenswrapper[4918]: I0319 16:59:55.938067 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4qpsq" Mar 19 16:59:55 crc kubenswrapper[4918]: I0319 16:59:55.940213 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-d72v8" Mar 19 16:59:55 crc kubenswrapper[4918]: I0319 16:59:55.941446 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 19 16:59:55 crc kubenswrapper[4918]: I0319 16:59:55.941639 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 19 16:59:55 crc kubenswrapper[4918]: I0319 16:59:55.941790 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 19 16:59:55 crc kubenswrapper[4918]: I0319 16:59:55.941987 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 19 16:59:55 crc kubenswrapper[4918]: I0319 16:59:55.950581 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4qpsq"] Mar 19 16:59:55 crc kubenswrapper[4918]: I0319 16:59:55.963882 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-2jnlk"] Mar 19 16:59:55 crc kubenswrapper[4918]: I0319 16:59:55.999353 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-zbhl6"] Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.000829 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-zbhl6" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.026782 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-zbhl6"] Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.057204 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-zbhl6\" (UID: \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\") " pod="openstack/dnsmasq-dns-847c4cc679-zbhl6" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.057257 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt5k7\" (UniqueName: \"kubernetes.io/projected/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-kube-api-access-zt5k7\") pod \"dnsmasq-dns-847c4cc679-zbhl6\" (UID: \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\") " pod="openstack/dnsmasq-dns-847c4cc679-zbhl6" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.057287 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-zbhl6\" (UID: \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\") " pod="openstack/dnsmasq-dns-847c4cc679-zbhl6" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.057307 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-config-data\") pod \"keystone-bootstrap-4qpsq\" (UID: \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\") " pod="openstack/keystone-bootstrap-4qpsq" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.057334 4918 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-scripts\") pod \"keystone-bootstrap-4qpsq\" (UID: \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\") " pod="openstack/keystone-bootstrap-4qpsq" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.057350 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-zbhl6\" (UID: \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\") " pod="openstack/dnsmasq-dns-847c4cc679-zbhl6" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.057384 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-dns-svc\") pod \"dnsmasq-dns-847c4cc679-zbhl6\" (UID: \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\") " pod="openstack/dnsmasq-dns-847c4cc679-zbhl6" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.057409 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-fernet-keys\") pod \"keystone-bootstrap-4qpsq\" (UID: \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\") " pod="openstack/keystone-bootstrap-4qpsq" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.057427 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-config\") pod \"dnsmasq-dns-847c4cc679-zbhl6\" (UID: \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\") " pod="openstack/dnsmasq-dns-847c4cc679-zbhl6" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.057447 4918 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd9j9\" (UniqueName: \"kubernetes.io/projected/ca0f97bf-56d6-4dec-9727-b1d406e048c7-kube-api-access-cd9j9\") pod \"keystone-bootstrap-4qpsq\" (UID: \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\") " pod="openstack/keystone-bootstrap-4qpsq" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.057463 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-credential-keys\") pod \"keystone-bootstrap-4qpsq\" (UID: \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\") " pod="openstack/keystone-bootstrap-4qpsq" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.057534 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-combined-ca-bundle\") pod \"keystone-bootstrap-4qpsq\" (UID: \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\") " pod="openstack/keystone-bootstrap-4qpsq" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.159183 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-zbhl6\" (UID: \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\") " pod="openstack/dnsmasq-dns-847c4cc679-zbhl6" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.159229 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-config-data\") pod \"keystone-bootstrap-4qpsq\" (UID: \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\") " pod="openstack/keystone-bootstrap-4qpsq" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.159257 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-scripts\") pod \"keystone-bootstrap-4qpsq\" (UID: \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\") " pod="openstack/keystone-bootstrap-4qpsq" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.159277 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-zbhl6\" (UID: \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\") " pod="openstack/dnsmasq-dns-847c4cc679-zbhl6" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.159319 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-dns-svc\") pod \"dnsmasq-dns-847c4cc679-zbhl6\" (UID: \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\") " pod="openstack/dnsmasq-dns-847c4cc679-zbhl6" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.159339 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-fernet-keys\") pod \"keystone-bootstrap-4qpsq\" (UID: \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\") " pod="openstack/keystone-bootstrap-4qpsq" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.159355 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-config\") pod \"dnsmasq-dns-847c4cc679-zbhl6\" (UID: \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\") " pod="openstack/dnsmasq-dns-847c4cc679-zbhl6" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.159378 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd9j9\" (UniqueName: 
\"kubernetes.io/projected/ca0f97bf-56d6-4dec-9727-b1d406e048c7-kube-api-access-cd9j9\") pod \"keystone-bootstrap-4qpsq\" (UID: \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\") " pod="openstack/keystone-bootstrap-4qpsq" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.159396 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-credential-keys\") pod \"keystone-bootstrap-4qpsq\" (UID: \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\") " pod="openstack/keystone-bootstrap-4qpsq" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.159458 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-combined-ca-bundle\") pod \"keystone-bootstrap-4qpsq\" (UID: \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\") " pod="openstack/keystone-bootstrap-4qpsq" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.159494 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-zbhl6\" (UID: \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\") " pod="openstack/dnsmasq-dns-847c4cc679-zbhl6" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.159534 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt5k7\" (UniqueName: \"kubernetes.io/projected/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-kube-api-access-zt5k7\") pod \"dnsmasq-dns-847c4cc679-zbhl6\" (UID: \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\") " pod="openstack/dnsmasq-dns-847c4cc679-zbhl6" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.159967 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-zbhl6\" (UID: \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\") " pod="openstack/dnsmasq-dns-847c4cc679-zbhl6" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.160557 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-config\") pod \"dnsmasq-dns-847c4cc679-zbhl6\" (UID: \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\") " pod="openstack/dnsmasq-dns-847c4cc679-zbhl6" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.161073 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-zbhl6\" (UID: \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\") " pod="openstack/dnsmasq-dns-847c4cc679-zbhl6" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.161399 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-dns-svc\") pod \"dnsmasq-dns-847c4cc679-zbhl6\" (UID: \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\") " pod="openstack/dnsmasq-dns-847c4cc679-zbhl6" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.165797 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-zbhl6\" (UID: \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\") " pod="openstack/dnsmasq-dns-847c4cc679-zbhl6" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.166114 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-credential-keys\") pod \"keystone-bootstrap-4qpsq\" 
(UID: \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\") " pod="openstack/keystone-bootstrap-4qpsq" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.166218 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-config-data\") pod \"keystone-bootstrap-4qpsq\" (UID: \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\") " pod="openstack/keystone-bootstrap-4qpsq" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.166748 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-combined-ca-bundle\") pod \"keystone-bootstrap-4qpsq\" (UID: \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\") " pod="openstack/keystone-bootstrap-4qpsq" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.169746 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-scripts\") pod \"keystone-bootstrap-4qpsq\" (UID: \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\") " pod="openstack/keystone-bootstrap-4qpsq" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.178154 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-fernet-keys\") pod \"keystone-bootstrap-4qpsq\" (UID: \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\") " pod="openstack/keystone-bootstrap-4qpsq" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.204194 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt5k7\" (UniqueName: \"kubernetes.io/projected/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-kube-api-access-zt5k7\") pod \"dnsmasq-dns-847c4cc679-zbhl6\" (UID: \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\") " pod="openstack/dnsmasq-dns-847c4cc679-zbhl6" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 
16:59:56.215142 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd9j9\" (UniqueName: \"kubernetes.io/projected/ca0f97bf-56d6-4dec-9727-b1d406e048c7-kube-api-access-cd9j9\") pod \"keystone-bootstrap-4qpsq\" (UID: \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\") " pod="openstack/keystone-bootstrap-4qpsq" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.259452 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4qpsq" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.271903 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-99gbh"] Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.275423 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-99gbh" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.277394 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-rw857" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.278767 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.282923 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.299342 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-d9qg5"] Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.303157 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-d9qg5" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.308172 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-99gbh"] Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.314145 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.314341 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.314450 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-kxbs6" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.314935 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.328500 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-zbhl6" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.349613 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-b5btd"] Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.350854 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-b5btd" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.355912 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.356206 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.356319 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-h4t22" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.362766 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btqj4\" (UniqueName: \"kubernetes.io/projected/33152bb1-e526-420f-8dec-7ef80c68b47c-kube-api-access-btqj4\") pod \"cinder-db-sync-b5btd\" (UID: \"33152bb1-e526-420f-8dec-7ef80c68b47c\") " pod="openstack/cinder-db-sync-b5btd" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.362801 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33152bb1-e526-420f-8dec-7ef80c68b47c-config-data\") pod \"cinder-db-sync-b5btd\" (UID: \"33152bb1-e526-420f-8dec-7ef80c68b47c\") " pod="openstack/cinder-db-sync-b5btd" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.362825 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-config-data\") pod \"cloudkitty-db-sync-d9qg5\" (UID: \"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25\") " pod="openstack/cloudkitty-db-sync-d9qg5" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.362841 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e24181b4-a2be-4ea7-9602-e3e16b8862c1-logs\") pod 
\"placement-db-sync-99gbh\" (UID: \"e24181b4-a2be-4ea7-9602-e3e16b8862c1\") " pod="openstack/placement-db-sync-99gbh" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.362880 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33152bb1-e526-420f-8dec-7ef80c68b47c-scripts\") pod \"cinder-db-sync-b5btd\" (UID: \"33152bb1-e526-420f-8dec-7ef80c68b47c\") " pod="openstack/cinder-db-sync-b5btd" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.362905 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/33152bb1-e526-420f-8dec-7ef80c68b47c-db-sync-config-data\") pod \"cinder-db-sync-b5btd\" (UID: \"33152bb1-e526-420f-8dec-7ef80c68b47c\") " pod="openstack/cinder-db-sync-b5btd" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.362923 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-scripts\") pod \"cloudkitty-db-sync-d9qg5\" (UID: \"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25\") " pod="openstack/cloudkitty-db-sync-d9qg5" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.362941 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e24181b4-a2be-4ea7-9602-e3e16b8862c1-scripts\") pod \"placement-db-sync-99gbh\" (UID: \"e24181b4-a2be-4ea7-9602-e3e16b8862c1\") " pod="openstack/placement-db-sync-99gbh" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.362962 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33152bb1-e526-420f-8dec-7ef80c68b47c-combined-ca-bundle\") pod \"cinder-db-sync-b5btd\" (UID: 
\"33152bb1-e526-420f-8dec-7ef80c68b47c\") " pod="openstack/cinder-db-sync-b5btd" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.363004 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e24181b4-a2be-4ea7-9602-e3e16b8862c1-combined-ca-bundle\") pod \"placement-db-sync-99gbh\" (UID: \"e24181b4-a2be-4ea7-9602-e3e16b8862c1\") " pod="openstack/placement-db-sync-99gbh" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.363037 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsx7l\" (UniqueName: \"kubernetes.io/projected/e24181b4-a2be-4ea7-9602-e3e16b8862c1-kube-api-access-hsx7l\") pod \"placement-db-sync-99gbh\" (UID: \"e24181b4-a2be-4ea7-9602-e3e16b8862c1\") " pod="openstack/placement-db-sync-99gbh" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.363083 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-combined-ca-bundle\") pod \"cloudkitty-db-sync-d9qg5\" (UID: \"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25\") " pod="openstack/cloudkitty-db-sync-d9qg5" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.363099 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt5j7\" (UniqueName: \"kubernetes.io/projected/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-kube-api-access-lt5j7\") pod \"cloudkitty-db-sync-d9qg5\" (UID: \"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25\") " pod="openstack/cloudkitty-db-sync-d9qg5" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.363115 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e24181b4-a2be-4ea7-9602-e3e16b8862c1-config-data\") pod 
\"placement-db-sync-99gbh\" (UID: \"e24181b4-a2be-4ea7-9602-e3e16b8862c1\") " pod="openstack/placement-db-sync-99gbh" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.363144 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-certs\") pod \"cloudkitty-db-sync-d9qg5\" (UID: \"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25\") " pod="openstack/cloudkitty-db-sync-d9qg5" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.363163 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/33152bb1-e526-420f-8dec-7ef80c68b47c-etc-machine-id\") pod \"cinder-db-sync-b5btd\" (UID: \"33152bb1-e526-420f-8dec-7ef80c68b47c\") " pod="openstack/cinder-db-sync-b5btd" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.371854 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-zbhl6"] Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.418789 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-d9qg5"] Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.465867 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-b5btd"] Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.467597 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33152bb1-e526-420f-8dec-7ef80c68b47c-scripts\") pod \"cinder-db-sync-b5btd\" (UID: \"33152bb1-e526-420f-8dec-7ef80c68b47c\") " pod="openstack/cinder-db-sync-b5btd" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.467637 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/33152bb1-e526-420f-8dec-7ef80c68b47c-db-sync-config-data\") pod 
\"cinder-db-sync-b5btd\" (UID: \"33152bb1-e526-420f-8dec-7ef80c68b47c\") " pod="openstack/cinder-db-sync-b5btd" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.467657 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-scripts\") pod \"cloudkitty-db-sync-d9qg5\" (UID: \"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25\") " pod="openstack/cloudkitty-db-sync-d9qg5" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.467678 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e24181b4-a2be-4ea7-9602-e3e16b8862c1-scripts\") pod \"placement-db-sync-99gbh\" (UID: \"e24181b4-a2be-4ea7-9602-e3e16b8862c1\") " pod="openstack/placement-db-sync-99gbh" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.467707 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33152bb1-e526-420f-8dec-7ef80c68b47c-combined-ca-bundle\") pod \"cinder-db-sync-b5btd\" (UID: \"33152bb1-e526-420f-8dec-7ef80c68b47c\") " pod="openstack/cinder-db-sync-b5btd" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.467725 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e24181b4-a2be-4ea7-9602-e3e16b8862c1-combined-ca-bundle\") pod \"placement-db-sync-99gbh\" (UID: \"e24181b4-a2be-4ea7-9602-e3e16b8862c1\") " pod="openstack/placement-db-sync-99gbh" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.467758 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsx7l\" (UniqueName: \"kubernetes.io/projected/e24181b4-a2be-4ea7-9602-e3e16b8862c1-kube-api-access-hsx7l\") pod \"placement-db-sync-99gbh\" (UID: \"e24181b4-a2be-4ea7-9602-e3e16b8862c1\") " pod="openstack/placement-db-sync-99gbh" Mar 19 
16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.467808 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-combined-ca-bundle\") pod \"cloudkitty-db-sync-d9qg5\" (UID: \"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25\") " pod="openstack/cloudkitty-db-sync-d9qg5" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.467823 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt5j7\" (UniqueName: \"kubernetes.io/projected/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-kube-api-access-lt5j7\") pod \"cloudkitty-db-sync-d9qg5\" (UID: \"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25\") " pod="openstack/cloudkitty-db-sync-d9qg5" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.467841 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e24181b4-a2be-4ea7-9602-e3e16b8862c1-config-data\") pod \"placement-db-sync-99gbh\" (UID: \"e24181b4-a2be-4ea7-9602-e3e16b8862c1\") " pod="openstack/placement-db-sync-99gbh" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.467870 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-certs\") pod \"cloudkitty-db-sync-d9qg5\" (UID: \"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25\") " pod="openstack/cloudkitty-db-sync-d9qg5" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.467894 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/33152bb1-e526-420f-8dec-7ef80c68b47c-etc-machine-id\") pod \"cinder-db-sync-b5btd\" (UID: \"33152bb1-e526-420f-8dec-7ef80c68b47c\") " pod="openstack/cinder-db-sync-b5btd" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.467930 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-btqj4\" (UniqueName: \"kubernetes.io/projected/33152bb1-e526-420f-8dec-7ef80c68b47c-kube-api-access-btqj4\") pod \"cinder-db-sync-b5btd\" (UID: \"33152bb1-e526-420f-8dec-7ef80c68b47c\") " pod="openstack/cinder-db-sync-b5btd" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.467945 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33152bb1-e526-420f-8dec-7ef80c68b47c-config-data\") pod \"cinder-db-sync-b5btd\" (UID: \"33152bb1-e526-420f-8dec-7ef80c68b47c\") " pod="openstack/cinder-db-sync-b5btd" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.467966 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-config-data\") pod \"cloudkitty-db-sync-d9qg5\" (UID: \"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25\") " pod="openstack/cloudkitty-db-sync-d9qg5" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.467983 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e24181b4-a2be-4ea7-9602-e3e16b8862c1-logs\") pod \"placement-db-sync-99gbh\" (UID: \"e24181b4-a2be-4ea7-9602-e3e16b8862c1\") " pod="openstack/placement-db-sync-99gbh" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.468504 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e24181b4-a2be-4ea7-9602-e3e16b8862c1-logs\") pod \"placement-db-sync-99gbh\" (UID: \"e24181b4-a2be-4ea7-9602-e3e16b8862c1\") " pod="openstack/placement-db-sync-99gbh" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.473809 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-combined-ca-bundle\") pod 
\"cloudkitty-db-sync-d9qg5\" (UID: \"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25\") " pod="openstack/cloudkitty-db-sync-d9qg5" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.485369 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/33152bb1-e526-420f-8dec-7ef80c68b47c-etc-machine-id\") pod \"cinder-db-sync-b5btd\" (UID: \"33152bb1-e526-420f-8dec-7ef80c68b47c\") " pod="openstack/cinder-db-sync-b5btd" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.498182 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-certs\") pod \"cloudkitty-db-sync-d9qg5\" (UID: \"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25\") " pod="openstack/cloudkitty-db-sync-d9qg5" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.501847 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33152bb1-e526-420f-8dec-7ef80c68b47c-combined-ca-bundle\") pod \"cinder-db-sync-b5btd\" (UID: \"33152bb1-e526-420f-8dec-7ef80c68b47c\") " pod="openstack/cinder-db-sync-b5btd" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.502067 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33152bb1-e526-420f-8dec-7ef80c68b47c-scripts\") pod \"cinder-db-sync-b5btd\" (UID: \"33152bb1-e526-420f-8dec-7ef80c68b47c\") " pod="openstack/cinder-db-sync-b5btd" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.504271 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-config-data\") pod \"cloudkitty-db-sync-d9qg5\" (UID: \"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25\") " pod="openstack/cloudkitty-db-sync-d9qg5" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.506950 4918 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e24181b4-a2be-4ea7-9602-e3e16b8862c1-scripts\") pod \"placement-db-sync-99gbh\" (UID: \"e24181b4-a2be-4ea7-9602-e3e16b8862c1\") " pod="openstack/placement-db-sync-99gbh" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.507648 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/33152bb1-e526-420f-8dec-7ef80c68b47c-db-sync-config-data\") pod \"cinder-db-sync-b5btd\" (UID: \"33152bb1-e526-420f-8dec-7ef80c68b47c\") " pod="openstack/cinder-db-sync-b5btd" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.510261 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-scripts\") pod \"cloudkitty-db-sync-d9qg5\" (UID: \"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25\") " pod="openstack/cloudkitty-db-sync-d9qg5" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.526629 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.529383 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btqj4\" (UniqueName: \"kubernetes.io/projected/33152bb1-e526-420f-8dec-7ef80c68b47c-kube-api-access-btqj4\") pod \"cinder-db-sync-b5btd\" (UID: \"33152bb1-e526-420f-8dec-7ef80c68b47c\") " pod="openstack/cinder-db-sync-b5btd" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.529968 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsx7l\" (UniqueName: \"kubernetes.io/projected/e24181b4-a2be-4ea7-9602-e3e16b8862c1-kube-api-access-hsx7l\") pod \"placement-db-sync-99gbh\" (UID: \"e24181b4-a2be-4ea7-9602-e3e16b8862c1\") " pod="openstack/placement-db-sync-99gbh" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.530098 4918 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.532812 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e24181b4-a2be-4ea7-9602-e3e16b8862c1-combined-ca-bundle\") pod \"placement-db-sync-99gbh\" (UID: \"e24181b4-a2be-4ea7-9602-e3e16b8862c1\") " pod="openstack/placement-db-sync-99gbh" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.543389 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33152bb1-e526-420f-8dec-7ef80c68b47c-config-data\") pod \"cinder-db-sync-b5btd\" (UID: \"33152bb1-e526-420f-8dec-7ef80c68b47c\") " pod="openstack/cinder-db-sync-b5btd" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.562174 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e24181b4-a2be-4ea7-9602-e3e16b8862c1-config-data\") pod \"placement-db-sync-99gbh\" (UID: \"e24181b4-a2be-4ea7-9602-e3e16b8862c1\") " pod="openstack/placement-db-sync-99gbh" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.563719 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.570397 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.580782 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spf6p\" (UniqueName: \"kubernetes.io/projected/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-kube-api-access-spf6p\") pod \"ceilometer-0\" (UID: \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\") " pod="openstack/ceilometer-0" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.582171 4918 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-config-data\") pod \"ceilometer-0\" (UID: \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\") " pod="openstack/ceilometer-0" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.582296 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-run-httpd\") pod \"ceilometer-0\" (UID: \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\") " pod="openstack/ceilometer-0" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.582356 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-log-httpd\") pod \"ceilometer-0\" (UID: \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\") " pod="openstack/ceilometer-0" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.582434 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-scripts\") pod \"ceilometer-0\" (UID: \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\") " pod="openstack/ceilometer-0" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.582558 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\") " pod="openstack/ceilometer-0" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.582659 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\") " pod="openstack/ceilometer-0" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.640434 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" podUID="32139bd7-916e-4cfc-a6c3-4de222246896" containerName="dnsmasq-dns" containerID="cri-o://f6bb3323a03f9f99f47fe65c023f57cde0eea0868ef1731ea9cb23c370f9cfa8" gracePeriod=10 Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.647876 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt5j7\" (UniqueName: \"kubernetes.io/projected/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-kube-api-access-lt5j7\") pod \"cloudkitty-db-sync-d9qg5\" (UID: \"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25\") " pod="openstack/cloudkitty-db-sync-d9qg5" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.656859 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b310bb0-a7b6-422f-824c-5ce7cf1e5630" path="/var/lib/kubelet/pods/9b310bb0-a7b6-422f-824c-5ce7cf1e5630/volumes" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.657420 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-lnhw2"] Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.684265 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\") " pod="openstack/ceilometer-0" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.687797 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.687893 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.687998 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\") " pod="openstack/ceilometer-0" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.689076 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spf6p\" (UniqueName: \"kubernetes.io/projected/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-kube-api-access-spf6p\") pod \"ceilometer-0\" (UID: \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\") " pod="openstack/ceilometer-0" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.689308 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-config-data\") pod \"ceilometer-0\" (UID: \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\") " pod="openstack/ceilometer-0" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.689437 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-run-httpd\") pod \"ceilometer-0\" (UID: \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\") " pod="openstack/ceilometer-0" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.689499 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-log-httpd\") pod \"ceilometer-0\" (UID: \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\") " pod="openstack/ceilometer-0" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.690472 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-scripts\") pod \"ceilometer-0\" (UID: \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\") " pod="openstack/ceilometer-0" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.693225 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-run-httpd\") pod \"ceilometer-0\" (UID: \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\") " pod="openstack/ceilometer-0" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.693469 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-log-httpd\") pod \"ceilometer-0\" (UID: \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\") " pod="openstack/ceilometer-0" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.694792 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-scripts\") pod \"ceilometer-0\" (UID: \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\") " pod="openstack/ceilometer-0" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.709230 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\") " pod="openstack/ceilometer-0" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.712321 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-d9qg5" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.712477 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\") " pod="openstack/ceilometer-0" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.715287 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-config-data\") pod \"ceilometer-0\" (UID: \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\") " pod="openstack/ceilometer-0" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.732027 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-b5btd" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.734299 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spf6p\" (UniqueName: \"kubernetes.io/projected/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-kube-api-access-spf6p\") pod \"ceilometer-0\" (UID: \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\") " pod="openstack/ceilometer-0" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.755858 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-99gbh" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.775508 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.796610 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-s9rgt"] Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.798424 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-s9rgt" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.801096 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-lnhw2\" (UID: \"1fdb44cf-8ddb-4561-8749-702ccf333279\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.801153 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-lnhw2\" (UID: \"1fdb44cf-8ddb-4561-8749-702ccf333279\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.801264 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz6ln\" (UniqueName: \"kubernetes.io/projected/1fdb44cf-8ddb-4561-8749-702ccf333279-kube-api-access-mz6ln\") pod \"dnsmasq-dns-785d8bcb8c-lnhw2\" (UID: \"1fdb44cf-8ddb-4561-8749-702ccf333279\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.801310 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-lnhw2\" (UID: \"1fdb44cf-8ddb-4561-8749-702ccf333279\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.801335 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-config\") pod \"dnsmasq-dns-785d8bcb8c-lnhw2\" (UID: 
\"1fdb44cf-8ddb-4561-8749-702ccf333279\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.801417 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-lnhw2\" (UID: \"1fdb44cf-8ddb-4561-8749-702ccf333279\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.805938 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.806309 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-6zfw8" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.806439 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.810456 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-wtk47"] Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.811728 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-wtk47" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.813281 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.814375 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qbhlj" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.829280 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-lnhw2"] Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.967974 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cb250eb-e1c3-4a48-bf07-8cf4504466fb-combined-ca-bundle\") pod \"neutron-db-sync-s9rgt\" (UID: \"3cb250eb-e1c3-4a48-bf07-8cf4504466fb\") " pod="openstack/neutron-db-sync-s9rgt" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.970345 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-lnhw2\" (UID: \"1fdb44cf-8ddb-4561-8749-702ccf333279\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.970694 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-lnhw2\" (UID: \"1fdb44cf-8ddb-4561-8749-702ccf333279\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.970955 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3cb250eb-e1c3-4a48-bf07-8cf4504466fb-config\") pod 
\"neutron-db-sync-s9rgt\" (UID: \"3cb250eb-e1c3-4a48-bf07-8cf4504466fb\") " pod="openstack/neutron-db-sync-s9rgt" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.980636 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-lnhw2\" (UID: \"1fdb44cf-8ddb-4561-8749-702ccf333279\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.985259 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-lnhw2\" (UID: \"1fdb44cf-8ddb-4561-8749-702ccf333279\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.985335 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wtk47"] Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.985741 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k96q5\" (UniqueName: \"kubernetes.io/projected/ec2f9e01-6e64-4c5d-93d4-8428ae776a4e-kube-api-access-k96q5\") pod \"barbican-db-sync-wtk47\" (UID: \"ec2f9e01-6e64-4c5d-93d4-8428ae776a4e\") " pod="openstack/barbican-db-sync-wtk47" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.985838 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz6ln\" (UniqueName: \"kubernetes.io/projected/1fdb44cf-8ddb-4561-8749-702ccf333279-kube-api-access-mz6ln\") pod \"dnsmasq-dns-785d8bcb8c-lnhw2\" (UID: \"1fdb44cf-8ddb-4561-8749-702ccf333279\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.985906 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-lnhw2\" (UID: \"1fdb44cf-8ddb-4561-8749-702ccf333279\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.985967 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-config\") pod \"dnsmasq-dns-785d8bcb8c-lnhw2\" (UID: \"1fdb44cf-8ddb-4561-8749-702ccf333279\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.986073 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec2f9e01-6e64-4c5d-93d4-8428ae776a4e-combined-ca-bundle\") pod \"barbican-db-sync-wtk47\" (UID: \"ec2f9e01-6e64-4c5d-93d4-8428ae776a4e\") " pod="openstack/barbican-db-sync-wtk47" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.986187 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzn8b\" (UniqueName: \"kubernetes.io/projected/3cb250eb-e1c3-4a48-bf07-8cf4504466fb-kube-api-access-gzn8b\") pod \"neutron-db-sync-s9rgt\" (UID: \"3cb250eb-e1c3-4a48-bf07-8cf4504466fb\") " pod="openstack/neutron-db-sync-s9rgt" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.986244 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-lnhw2\" (UID: \"1fdb44cf-8ddb-4561-8749-702ccf333279\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" Mar 19 16:59:56 crc kubenswrapper[4918]: I0319 16:59:56.986401 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/ec2f9e01-6e64-4c5d-93d4-8428ae776a4e-db-sync-config-data\") pod \"barbican-db-sync-wtk47\" (UID: \"ec2f9e01-6e64-4c5d-93d4-8428ae776a4e\") " pod="openstack/barbican-db-sync-wtk47" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.019830 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-config\") pod \"dnsmasq-dns-785d8bcb8c-lnhw2\" (UID: \"1fdb44cf-8ddb-4561-8749-702ccf333279\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.033543 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-lnhw2\" (UID: \"1fdb44cf-8ddb-4561-8749-702ccf333279\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.036199 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-lnhw2\" (UID: \"1fdb44cf-8ddb-4561-8749-702ccf333279\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.065714 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz6ln\" (UniqueName: \"kubernetes.io/projected/1fdb44cf-8ddb-4561-8749-702ccf333279-kube-api-access-mz6ln\") pod \"dnsmasq-dns-785d8bcb8c-lnhw2\" (UID: \"1fdb44cf-8ddb-4561-8749-702ccf333279\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.065827 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-s9rgt"] Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.108306 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cb250eb-e1c3-4a48-bf07-8cf4504466fb-combined-ca-bundle\") pod \"neutron-db-sync-s9rgt\" (UID: \"3cb250eb-e1c3-4a48-bf07-8cf4504466fb\") " pod="openstack/neutron-db-sync-s9rgt" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.108416 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3cb250eb-e1c3-4a48-bf07-8cf4504466fb-config\") pod \"neutron-db-sync-s9rgt\" (UID: \"3cb250eb-e1c3-4a48-bf07-8cf4504466fb\") " pod="openstack/neutron-db-sync-s9rgt" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.108462 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k96q5\" (UniqueName: \"kubernetes.io/projected/ec2f9e01-6e64-4c5d-93d4-8428ae776a4e-kube-api-access-k96q5\") pod \"barbican-db-sync-wtk47\" (UID: \"ec2f9e01-6e64-4c5d-93d4-8428ae776a4e\") " pod="openstack/barbican-db-sync-wtk47" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.108562 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec2f9e01-6e64-4c5d-93d4-8428ae776a4e-combined-ca-bundle\") pod \"barbican-db-sync-wtk47\" (UID: \"ec2f9e01-6e64-4c5d-93d4-8428ae776a4e\") " pod="openstack/barbican-db-sync-wtk47" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.108602 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzn8b\" (UniqueName: \"kubernetes.io/projected/3cb250eb-e1c3-4a48-bf07-8cf4504466fb-kube-api-access-gzn8b\") pod \"neutron-db-sync-s9rgt\" (UID: \"3cb250eb-e1c3-4a48-bf07-8cf4504466fb\") " pod="openstack/neutron-db-sync-s9rgt" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.108649 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/ec2f9e01-6e64-4c5d-93d4-8428ae776a4e-db-sync-config-data\") pod \"barbican-db-sync-wtk47\" (UID: \"ec2f9e01-6e64-4c5d-93d4-8428ae776a4e\") " pod="openstack/barbican-db-sync-wtk47" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.123870 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cb250eb-e1c3-4a48-bf07-8cf4504466fb-combined-ca-bundle\") pod \"neutron-db-sync-s9rgt\" (UID: \"3cb250eb-e1c3-4a48-bf07-8cf4504466fb\") " pod="openstack/neutron-db-sync-s9rgt" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.165098 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.169997 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k96q5\" (UniqueName: \"kubernetes.io/projected/ec2f9e01-6e64-4c5d-93d4-8428ae776a4e-kube-api-access-k96q5\") pod \"barbican-db-sync-wtk47\" (UID: \"ec2f9e01-6e64-4c5d-93d4-8428ae776a4e\") " pod="openstack/barbican-db-sync-wtk47" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.171088 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3cb250eb-e1c3-4a48-bf07-8cf4504466fb-config\") pod \"neutron-db-sync-s9rgt\" (UID: \"3cb250eb-e1c3-4a48-bf07-8cf4504466fb\") " pod="openstack/neutron-db-sync-s9rgt" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.171220 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzn8b\" (UniqueName: \"kubernetes.io/projected/3cb250eb-e1c3-4a48-bf07-8cf4504466fb-kube-api-access-gzn8b\") pod \"neutron-db-sync-s9rgt\" (UID: \"3cb250eb-e1c3-4a48-bf07-8cf4504466fb\") " pod="openstack/neutron-db-sync-s9rgt" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.171541 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec2f9e01-6e64-4c5d-93d4-8428ae776a4e-combined-ca-bundle\") pod \"barbican-db-sync-wtk47\" (UID: \"ec2f9e01-6e64-4c5d-93d4-8428ae776a4e\") " pod="openstack/barbican-db-sync-wtk47" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.173994 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ec2f9e01-6e64-4c5d-93d4-8428ae776a4e-db-sync-config-data\") pod \"barbican-db-sync-wtk47\" (UID: \"ec2f9e01-6e64-4c5d-93d4-8428ae776a4e\") " pod="openstack/barbican-db-sync-wtk47" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.194906 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.197180 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.198715 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-s9rgt" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.203224 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4zksx" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.203360 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.203458 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.203726 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.210103 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.229000 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wtk47" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.242895 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.245094 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.250462 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.250935 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.266671 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.323435 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b04ffe9-4130-4428-ba7e-e41bed199d74-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") " pod="openstack/glance-default-internal-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.323503 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3b04ffe9-4130-4428-ba7e-e41bed199d74-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") " pod="openstack/glance-default-internal-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.323566 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/383704eb-7b74-4878-ae73-eb5a1f85a49f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " pod="openstack/glance-default-external-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.323623 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/383704eb-7b74-4878-ae73-eb5a1f85a49f-scripts\") pod \"glance-default-external-api-0\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " pod="openstack/glance-default-external-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.323692 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/383704eb-7b74-4878-ae73-eb5a1f85a49f-logs\") pod \"glance-default-external-api-0\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " pod="openstack/glance-default-external-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.323707 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/383704eb-7b74-4878-ae73-eb5a1f85a49f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " pod="openstack/glance-default-external-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.323756 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6szj\" (UniqueName: \"kubernetes.io/projected/383704eb-7b74-4878-ae73-eb5a1f85a49f-kube-api-access-t6szj\") pod \"glance-default-external-api-0\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " pod="openstack/glance-default-external-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.323844 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\") pod \"glance-default-internal-api-0\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") " pod="openstack/glance-default-internal-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.323862 4918 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b04ffe9-4130-4428-ba7e-e41bed199d74-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") " pod="openstack/glance-default-internal-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.323889 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/383704eb-7b74-4878-ae73-eb5a1f85a49f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " pod="openstack/glance-default-external-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.323924 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b04ffe9-4130-4428-ba7e-e41bed199d74-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") " pod="openstack/glance-default-internal-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.323963 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b04ffe9-4130-4428-ba7e-e41bed199d74-logs\") pod \"glance-default-internal-api-0\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") " pod="openstack/glance-default-internal-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.324025 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\") pod \"glance-default-external-api-0\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " pod="openstack/glance-default-external-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: 
I0319 16:59:57.324046 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b04ffe9-4130-4428-ba7e-e41bed199d74-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") " pod="openstack/glance-default-internal-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.324070 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg2cp\" (UniqueName: \"kubernetes.io/projected/3b04ffe9-4130-4428-ba7e-e41bed199d74-kube-api-access-rg2cp\") pod \"glance-default-internal-api-0\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") " pod="openstack/glance-default-internal-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.324083 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/383704eb-7b74-4878-ae73-eb5a1f85a49f-config-data\") pod \"glance-default-external-api-0\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " pod="openstack/glance-default-external-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.399140 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4qpsq"] Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.426717 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\") pod \"glance-default-internal-api-0\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") " pod="openstack/glance-default-internal-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.427096 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3b04ffe9-4130-4428-ba7e-e41bed199d74-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") " pod="openstack/glance-default-internal-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.427129 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/383704eb-7b74-4878-ae73-eb5a1f85a49f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " pod="openstack/glance-default-external-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.427156 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b04ffe9-4130-4428-ba7e-e41bed199d74-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") " pod="openstack/glance-default-internal-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.427183 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b04ffe9-4130-4428-ba7e-e41bed199d74-logs\") pod \"glance-default-internal-api-0\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") " pod="openstack/glance-default-internal-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.427219 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\") pod \"glance-default-external-api-0\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " pod="openstack/glance-default-external-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.427239 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3b04ffe9-4130-4428-ba7e-e41bed199d74-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") " pod="openstack/glance-default-internal-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.427257 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg2cp\" (UniqueName: \"kubernetes.io/projected/3b04ffe9-4130-4428-ba7e-e41bed199d74-kube-api-access-rg2cp\") pod \"glance-default-internal-api-0\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") " pod="openstack/glance-default-internal-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.427272 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/383704eb-7b74-4878-ae73-eb5a1f85a49f-config-data\") pod \"glance-default-external-api-0\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " pod="openstack/glance-default-external-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.427308 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b04ffe9-4130-4428-ba7e-e41bed199d74-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") " pod="openstack/glance-default-internal-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.427332 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3b04ffe9-4130-4428-ba7e-e41bed199d74-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") " pod="openstack/glance-default-internal-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.427354 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/383704eb-7b74-4878-ae73-eb5a1f85a49f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " pod="openstack/glance-default-external-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.427381 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/383704eb-7b74-4878-ae73-eb5a1f85a49f-scripts\") pod \"glance-default-external-api-0\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " pod="openstack/glance-default-external-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.427413 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/383704eb-7b74-4878-ae73-eb5a1f85a49f-logs\") pod \"glance-default-external-api-0\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " pod="openstack/glance-default-external-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.427429 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/383704eb-7b74-4878-ae73-eb5a1f85a49f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " pod="openstack/glance-default-external-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.427463 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6szj\" (UniqueName: \"kubernetes.io/projected/383704eb-7b74-4878-ae73-eb5a1f85a49f-kube-api-access-t6szj\") pod \"glance-default-external-api-0\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " pod="openstack/glance-default-external-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.429772 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/383704eb-7b74-4878-ae73-eb5a1f85a49f-httpd-run\") 
pod \"glance-default-external-api-0\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " pod="openstack/glance-default-external-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.439621 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/383704eb-7b74-4878-ae73-eb5a1f85a49f-logs\") pod \"glance-default-external-api-0\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " pod="openstack/glance-default-external-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.442344 4918 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.442376 4918 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\") pod \"glance-default-internal-api-0\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ea238a48943c00d3b8fe8315d317d6aa508a60b77f6685b492c061941b28c63f/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.442484 4918 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.442552 4918 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\") pod \"glance-default-external-api-0\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/342db21bb7c2e49b22a24134653a3d87a173d64abb89a0070323b0a8e0ff9956/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.447591 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/383704eb-7b74-4878-ae73-eb5a1f85a49f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " pod="openstack/glance-default-external-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.449586 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b04ffe9-4130-4428-ba7e-e41bed199d74-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") " pod="openstack/glance-default-internal-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.449730 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b04ffe9-4130-4428-ba7e-e41bed199d74-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") " pod="openstack/glance-default-internal-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.450183 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b04ffe9-4130-4428-ba7e-e41bed199d74-logs\") pod \"glance-default-internal-api-0\" 
(UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") " pod="openstack/glance-default-internal-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.452001 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3b04ffe9-4130-4428-ba7e-e41bed199d74-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") " pod="openstack/glance-default-internal-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.453125 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/383704eb-7b74-4878-ae73-eb5a1f85a49f-scripts\") pod \"glance-default-external-api-0\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " pod="openstack/glance-default-external-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.461965 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg2cp\" (UniqueName: \"kubernetes.io/projected/3b04ffe9-4130-4428-ba7e-e41bed199d74-kube-api-access-rg2cp\") pod \"glance-default-internal-api-0\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") " pod="openstack/glance-default-internal-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.462655 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6szj\" (UniqueName: \"kubernetes.io/projected/383704eb-7b74-4878-ae73-eb5a1f85a49f-kube-api-access-t6szj\") pod \"glance-default-external-api-0\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " pod="openstack/glance-default-external-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.473174 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/383704eb-7b74-4878-ae73-eb5a1f85a49f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " 
pod="openstack/glance-default-external-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.473656 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b04ffe9-4130-4428-ba7e-e41bed199d74-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") " pod="openstack/glance-default-internal-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.477119 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/383704eb-7b74-4878-ae73-eb5a1f85a49f-config-data\") pod \"glance-default-external-api-0\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " pod="openstack/glance-default-external-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.503442 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b04ffe9-4130-4428-ba7e-e41bed199d74-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") " pod="openstack/glance-default-internal-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.526219 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\") pod \"glance-default-external-api-0\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " pod="openstack/glance-default-external-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.547591 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\") pod \"glance-default-internal-api-0\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") " 
pod="openstack/glance-default-internal-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.565468 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.603687 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.699925 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-zbhl6"] Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.700845 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4qpsq" event={"ID":"ca0f97bf-56d6-4dec-9727-b1d406e048c7","Type":"ContainerStarted","Data":"5ba28ac6be3f209e6c54e2b6ef648cfc15b1b1e78e03b1bcb0d109471152c22e"} Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.712812 4918 generic.go:334] "Generic (PLEG): container finished" podID="32139bd7-916e-4cfc-a6c3-4de222246896" containerID="f6bb3323a03f9f99f47fe65c023f57cde0eea0868ef1731ea9cb23c370f9cfa8" exitCode=0 Mar 19 16:59:57 crc kubenswrapper[4918]: I0319 16:59:57.712848 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" event={"ID":"32139bd7-916e-4cfc-a6c3-4de222246896","Type":"ContainerDied","Data":"f6bb3323a03f9f99f47fe65c023f57cde0eea0868ef1731ea9cb23c370f9cfa8"} Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.156926 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.217163 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.217229 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.217280 4918 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.219321 4918 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d175bcf8fa8bff1bfb04d3a219eb7c4c6847a1adae22fbf62149bc4b8894f0f0"} pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.219396 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" containerID="cri-o://d175bcf8fa8bff1bfb04d3a219eb7c4c6847a1adae22fbf62149bc4b8894f0f0" gracePeriod=600 Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.265968 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-ovsdbserver-sb\") pod \"32139bd7-916e-4cfc-a6c3-4de222246896\" (UID: \"32139bd7-916e-4cfc-a6c3-4de222246896\") " Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.266552 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-ovsdbserver-nb\") pod \"32139bd7-916e-4cfc-a6c3-4de222246896\" (UID: \"32139bd7-916e-4cfc-a6c3-4de222246896\") " Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.266721 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-config\") pod \"32139bd7-916e-4cfc-a6c3-4de222246896\" (UID: \"32139bd7-916e-4cfc-a6c3-4de222246896\") " Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.266746 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-dns-swift-storage-0\") pod \"32139bd7-916e-4cfc-a6c3-4de222246896\" (UID: \"32139bd7-916e-4cfc-a6c3-4de222246896\") " Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.266772 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-dns-svc\") pod \"32139bd7-916e-4cfc-a6c3-4de222246896\" (UID: \"32139bd7-916e-4cfc-a6c3-4de222246896\") " Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.266814 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkc4j\" (UniqueName: \"kubernetes.io/projected/32139bd7-916e-4cfc-a6c3-4de222246896-kube-api-access-tkc4j\") pod \"32139bd7-916e-4cfc-a6c3-4de222246896\" (UID: \"32139bd7-916e-4cfc-a6c3-4de222246896\") " Mar 19 16:59:58 crc 
kubenswrapper[4918]: I0319 16:59:58.270094 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-d9qg5"] Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.279160 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32139bd7-916e-4cfc-a6c3-4de222246896-kube-api-access-tkc4j" (OuterVolumeSpecName: "kube-api-access-tkc4j") pod "32139bd7-916e-4cfc-a6c3-4de222246896" (UID: "32139bd7-916e-4cfc-a6c3-4de222246896"). InnerVolumeSpecName "kube-api-access-tkc4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.306979 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-b5btd"] Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.366986 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "32139bd7-916e-4cfc-a6c3-4de222246896" (UID: "32139bd7-916e-4cfc-a6c3-4de222246896"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.373039 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkc4j\" (UniqueName: \"kubernetes.io/projected/32139bd7-916e-4cfc-a6c3-4de222246896-kube-api-access-tkc4j\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.373070 4918 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.376218 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "32139bd7-916e-4cfc-a6c3-4de222246896" (UID: "32139bd7-916e-4cfc-a6c3-4de222246896"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.390113 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-config" (OuterVolumeSpecName: "config") pod "32139bd7-916e-4cfc-a6c3-4de222246896" (UID: "32139bd7-916e-4cfc-a6c3-4de222246896"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.420830 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "32139bd7-916e-4cfc-a6c3-4de222246896" (UID: "32139bd7-916e-4cfc-a6c3-4de222246896"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.425363 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "32139bd7-916e-4cfc-a6c3-4de222246896" (UID: "32139bd7-916e-4cfc-a6c3-4de222246896"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.474836 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-config\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.474871 4918 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.474884 4918 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.474894 4918 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32139bd7-916e-4cfc-a6c3-4de222246896-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.508736 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-lnhw2"] Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.522070 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-99gbh"] Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.627826 4918 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/ceilometer-0"] Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.751455 4918 generic.go:334] "Generic (PLEG): container finished" podID="543cd914-e3df-4dbe-a1ea-dc3adc4dcdda" containerID="35b2acf61ba384f78d1234655cb72c1add3bcb1e328781ff1605f3ca7df67942" exitCode=0 Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.751548 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-zbhl6" event={"ID":"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda","Type":"ContainerDied","Data":"35b2acf61ba384f78d1234655cb72c1add3bcb1e328781ff1605f3ca7df67942"} Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.751577 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-zbhl6" event={"ID":"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda","Type":"ContainerStarted","Data":"7a97150927e7031222ad1fac37f67572adc027cab7bf85f27d764880627c91ed"} Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.763085 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ce9dad6-2fa1-48f8-bd79-b114097ef3be","Type":"ContainerStarted","Data":"e721531ee49299413f095db8fee5d8290b032ebb86a864dda12f6c349aeeb03e"} Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.806058 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-b5btd" event={"ID":"33152bb1-e526-420f-8dec-7ef80c68b47c","Type":"ContainerStarted","Data":"7a5fb1a58f9b5bdf45775c9843b2e4f45ea7157e76ac738a453272c7f5bb514b"} Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.809279 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" event={"ID":"1fdb44cf-8ddb-4561-8749-702ccf333279","Type":"ContainerStarted","Data":"f79fe307868901c8a6d5e0e93ebc8b1d12764f5b89376df96929a983ae1e89ec"} Mar 19 16:59:58 crc kubenswrapper[4918]: W0319 16:59:58.812456 4918 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec2f9e01_6e64_4c5d_93d4_8428ae776a4e.slice/crio-76c6bf5d9173aaf551c687f23f756433602c5db2930083eb2a208d4f32fa5c59 WatchSource:0}: Error finding container 76c6bf5d9173aaf551c687f23f756433602c5db2930083eb2a208d4f32fa5c59: Status 404 returned error can't find the container with id 76c6bf5d9173aaf551c687f23f756433602c5db2930083eb2a208d4f32fa5c59 Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.812448 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4qpsq" event={"ID":"ca0f97bf-56d6-4dec-9727-b1d406e048c7","Type":"ContainerStarted","Data":"1b523cb6cbf897ec1d265fffd48b68c6452297fc90343e2da6d37a9c393593a8"} Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.835709 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wtk47"] Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.860423 4918 generic.go:334] "Generic (PLEG): container finished" podID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerID="d175bcf8fa8bff1bfb04d3a219eb7c4c6847a1adae22fbf62149bc4b8894f0f0" exitCode=0 Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.860514 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerDied","Data":"d175bcf8fa8bff1bfb04d3a219eb7c4c6847a1adae22fbf62149bc4b8894f0f0"} Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.860833 4918 scope.go:117] "RemoveContainer" containerID="ef900c9cacbbbaa6c19a9d710e828147883364fff3c1249c3116a090d326556c" Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.869131 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-99gbh" event={"ID":"e24181b4-a2be-4ea7-9602-e3e16b8862c1","Type":"ContainerStarted","Data":"12fc32b0de6c67270b30670ebe7586b686dc46d45bd0b5a4822050fa566f911e"} Mar 19 16:59:58 crc 
kubenswrapper[4918]: I0319 16:59:58.909793 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-s9rgt"] Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.910982 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-d9qg5" event={"ID":"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25","Type":"ContainerStarted","Data":"263dfb542e1c7c9a0d33fea597df87e4895d3657e667255f196609788d2c1fc8"} Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.913211 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" event={"ID":"32139bd7-916e-4cfc-a6c3-4de222246896","Type":"ContainerDied","Data":"6eb478ba0798a5860798563940ba10864cea0a44cbe4f13f8999a9f25a614712"} Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.913294 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-2jnlk" Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.930359 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-4qpsq" podStartSLOduration=3.930335958 podStartE2EDuration="3.930335958s" podCreationTimestamp="2026-03-19 16:59:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 16:59:58.845452322 +0000 UTC m=+1210.967651570" watchObservedRunningTime="2026-03-19 16:59:58.930335958 +0000 UTC m=+1211.052535206" Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.954745 4918 scope.go:117] "RemoveContainer" containerID="f6bb3323a03f9f99f47fe65c023f57cde0eea0868ef1731ea9cb23c370f9cfa8" Mar 19 16:59:58 crc kubenswrapper[4918]: I0319 16:59:58.971575 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-2jnlk"] Mar 19 16:59:59 crc kubenswrapper[4918]: I0319 16:59:59.029254 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-74f6bcbc87-2jnlk"] Mar 19 16:59:59 crc kubenswrapper[4918]: W0319 16:59:59.030061 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b04ffe9_4130_4428_ba7e_e41bed199d74.slice/crio-1d4e2add4294ce38245a24034cda6c274dee58fd2d1e4f76fef46a47e01840dc WatchSource:0}: Error finding container 1d4e2add4294ce38245a24034cda6c274dee58fd2d1e4f76fef46a47e01840dc: Status 404 returned error can't find the container with id 1d4e2add4294ce38245a24034cda6c274dee58fd2d1e4f76fef46a47e01840dc Mar 19 16:59:59 crc kubenswrapper[4918]: I0319 16:59:59.077150 4918 scope.go:117] "RemoveContainer" containerID="79eefb244158bfb361cc096b4d0135b36dcf78c0c6d0888034ac8ce2b1d3ac98" Mar 19 16:59:59 crc kubenswrapper[4918]: I0319 16:59:59.102482 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 16:59:59 crc kubenswrapper[4918]: I0319 16:59:59.136516 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 16:59:59 crc kubenswrapper[4918]: I0319 16:59:59.733385 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 16:59:59 crc kubenswrapper[4918]: I0319 16:59:59.794187 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 16:59:59 crc kubenswrapper[4918]: I0319 16:59:59.831652 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 16:59:59 crc kubenswrapper[4918]: I0319 16:59:59.968307 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-s9rgt" event={"ID":"3cb250eb-e1c3-4a48-bf07-8cf4504466fb","Type":"ContainerStarted","Data":"3f33ca37b0eae21e86fbfb6b18b226d34922613d3ae125cba2178fe13ae15ae9"} Mar 19 16:59:59 crc kubenswrapper[4918]: I0319 16:59:59.976591 4918 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerStarted","Data":"c06f60b9e3990852ac6fc7b59da3fe3cda8e2a2ae81b8e586f6da8fc956569f8"} Mar 19 16:59:59 crc kubenswrapper[4918]: I0319 16:59:59.981812 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-zbhl6" event={"ID":"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda","Type":"ContainerDied","Data":"7a97150927e7031222ad1fac37f67572adc027cab7bf85f27d764880627c91ed"} Mar 19 16:59:59 crc kubenswrapper[4918]: I0319 16:59:59.981867 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a97150927e7031222ad1fac37f67572adc027cab7bf85f27d764880627c91ed" Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.002845 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3b04ffe9-4130-4428-ba7e-e41bed199d74","Type":"ContainerStarted","Data":"1d4e2add4294ce38245a24034cda6c274dee58fd2d1e4f76fef46a47e01840dc"} Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.006510 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-zbhl6" Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.008008 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wtk47" event={"ID":"ec2f9e01-6e64-4c5d-93d4-8428ae776a4e","Type":"ContainerStarted","Data":"76c6bf5d9173aaf551c687f23f756433602c5db2930083eb2a208d4f32fa5c59"} Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.029010 4918 generic.go:334] "Generic (PLEG): container finished" podID="1fdb44cf-8ddb-4561-8749-702ccf333279" containerID="cefc5198a2e2c2da3134183bed4bf7df475e8c9f3b5156937dec3fce1c866751" exitCode=0 Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.029170 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" event={"ID":"1fdb44cf-8ddb-4561-8749-702ccf333279","Type":"ContainerDied","Data":"cefc5198a2e2c2da3134183bed4bf7df475e8c9f3b5156937dec3fce1c866751"} Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.061657 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"383704eb-7b74-4878-ae73-eb5a1f85a49f","Type":"ContainerStarted","Data":"c70b9d86618c2fb0a56bfcd147a1ab28b87aa2e6072c0bb48b0774d60935afe0"} Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.135377 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zt5k7\" (UniqueName: \"kubernetes.io/projected/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-kube-api-access-zt5k7\") pod \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\" (UID: \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\") " Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.135492 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-ovsdbserver-sb\") pod \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\" (UID: \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\") " Mar 19 
17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.135638 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-dns-svc\") pod \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\" (UID: \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\") " Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.135720 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-dns-swift-storage-0\") pod \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\" (UID: \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\") " Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.135827 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-config\") pod \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\" (UID: \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\") " Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.135906 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-ovsdbserver-nb\") pod \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\" (UID: \"543cd914-e3df-4dbe-a1ea-dc3adc4dcdda\") " Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.178893 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-kube-api-access-zt5k7" (OuterVolumeSpecName: "kube-api-access-zt5k7") pod "543cd914-e3df-4dbe-a1ea-dc3adc4dcdda" (UID: "543cd914-e3df-4dbe-a1ea-dc3adc4dcdda"). InnerVolumeSpecName "kube-api-access-zt5k7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.185738 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565660-72kgl"] Mar 19 17:00:00 crc kubenswrapper[4918]: E0319 17:00:00.186236 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="543cd914-e3df-4dbe-a1ea-dc3adc4dcdda" containerName="init" Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.186263 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="543cd914-e3df-4dbe-a1ea-dc3adc4dcdda" containerName="init" Mar 19 17:00:00 crc kubenswrapper[4918]: E0319 17:00:00.186299 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32139bd7-916e-4cfc-a6c3-4de222246896" containerName="dnsmasq-dns" Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.186309 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="32139bd7-916e-4cfc-a6c3-4de222246896" containerName="dnsmasq-dns" Mar 19 17:00:00 crc kubenswrapper[4918]: E0319 17:00:00.186334 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32139bd7-916e-4cfc-a6c3-4de222246896" containerName="init" Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.186346 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="32139bd7-916e-4cfc-a6c3-4de222246896" containerName="init" Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.186585 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="543cd914-e3df-4dbe-a1ea-dc3adc4dcdda" containerName="init" Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.186618 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="32139bd7-916e-4cfc-a6c3-4de222246896" containerName="dnsmasq-dns" Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.187509 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565660-72kgl" Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.188733 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-config" (OuterVolumeSpecName: "config") pod "543cd914-e3df-4dbe-a1ea-dc3adc4dcdda" (UID: "543cd914-e3df-4dbe-a1ea-dc3adc4dcdda"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.190608 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.190877 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.192024 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.192944 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "543cd914-e3df-4dbe-a1ea-dc3adc4dcdda" (UID: "543cd914-e3df-4dbe-a1ea-dc3adc4dcdda"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.216111 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "543cd914-e3df-4dbe-a1ea-dc3adc4dcdda" (UID: "543cd914-e3df-4dbe-a1ea-dc3adc4dcdda"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.224646 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565660-xghlf"] Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.230193 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565660-72kgl"] Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.230616 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565660-xghlf" Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.234553 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.234887 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.237597 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "543cd914-e3df-4dbe-a1ea-dc3adc4dcdda" (UID: "543cd914-e3df-4dbe-a1ea-dc3adc4dcdda"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.239691 4918 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.239717 4918 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.239730 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-config\") on node \"crc\" DevicePath \"\""
Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.239740 4918 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.239752 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zt5k7\" (UniqueName: \"kubernetes.io/projected/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-kube-api-access-zt5k7\") on node \"crc\" DevicePath \"\""
Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.243280 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565660-xghlf"]
Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.283774 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "543cd914-e3df-4dbe-a1ea-dc3adc4dcdda" (UID: "543cd914-e3df-4dbe-a1ea-dc3adc4dcdda"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.344404 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqdxn\" (UniqueName: \"kubernetes.io/projected/09b359fb-3f74-447c-a755-338a558fc429-kube-api-access-pqdxn\") pod \"auto-csr-approver-29565660-72kgl\" (UID: \"09b359fb-3f74-447c-a755-338a558fc429\") " pod="openshift-infra/auto-csr-approver-29565660-72kgl"
Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.344472 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0-secret-volume\") pod \"collect-profiles-29565660-xghlf\" (UID: \"d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565660-xghlf"
Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.344579 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8l77\" (UniqueName: \"kubernetes.io/projected/d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0-kube-api-access-w8l77\") pod \"collect-profiles-29565660-xghlf\" (UID: \"d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565660-xghlf"
Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.344679 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0-config-volume\") pod \"collect-profiles-29565660-xghlf\" (UID: \"d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565660-xghlf"
Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.344769 4918 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.446852 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0-config-volume\") pod \"collect-profiles-29565660-xghlf\" (UID: \"d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565660-xghlf"
Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.446964 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqdxn\" (UniqueName: \"kubernetes.io/projected/09b359fb-3f74-447c-a755-338a558fc429-kube-api-access-pqdxn\") pod \"auto-csr-approver-29565660-72kgl\" (UID: \"09b359fb-3f74-447c-a755-338a558fc429\") " pod="openshift-infra/auto-csr-approver-29565660-72kgl"
Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.446998 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0-secret-volume\") pod \"collect-profiles-29565660-xghlf\" (UID: \"d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565660-xghlf"
Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.447093 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8l77\" (UniqueName: \"kubernetes.io/projected/d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0-kube-api-access-w8l77\") pod \"collect-profiles-29565660-xghlf\" (UID: \"d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565660-xghlf"
Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.448318 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0-config-volume\") pod \"collect-profiles-29565660-xghlf\" (UID: \"d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565660-xghlf"
Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.456630 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0-secret-volume\") pod \"collect-profiles-29565660-xghlf\" (UID: \"d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565660-xghlf"
Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.476210 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqdxn\" (UniqueName: \"kubernetes.io/projected/09b359fb-3f74-447c-a755-338a558fc429-kube-api-access-pqdxn\") pod \"auto-csr-approver-29565660-72kgl\" (UID: \"09b359fb-3f74-447c-a755-338a558fc429\") " pod="openshift-infra/auto-csr-approver-29565660-72kgl"
Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.477993 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8l77\" (UniqueName: \"kubernetes.io/projected/d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0-kube-api-access-w8l77\") pod \"collect-profiles-29565660-xghlf\" (UID: \"d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565660-xghlf"
Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.575907 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565660-72kgl"
Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.626502 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565660-xghlf"
Mar 19 17:00:00 crc kubenswrapper[4918]: I0319 17:00:00.629823 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32139bd7-916e-4cfc-a6c3-4de222246896" path="/var/lib/kubelet/pods/32139bd7-916e-4cfc-a6c3-4de222246896/volumes"
Mar 19 17:00:01 crc kubenswrapper[4918]: I0319 17:00:01.090916 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-s9rgt" event={"ID":"3cb250eb-e1c3-4a48-bf07-8cf4504466fb","Type":"ContainerStarted","Data":"7093aa8e10a56c9917db6b81465c1586af7d9d5bdaac5bb85d7c30092080d813"}
Mar 19 17:00:01 crc kubenswrapper[4918]: I0319 17:00:01.116642 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-zbhl6"
Mar 19 17:00:01 crc kubenswrapper[4918]: I0319 17:00:01.116694 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" event={"ID":"1fdb44cf-8ddb-4561-8749-702ccf333279","Type":"ContainerStarted","Data":"04294a5366c846d1d2758c9922edf070292f663c14dd49015287db01f707b159"}
Mar 19 17:00:01 crc kubenswrapper[4918]: I0319 17:00:01.122944 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-s9rgt" podStartSLOduration=5.122925578 podStartE2EDuration="5.122925578s" podCreationTimestamp="2026-03-19 16:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:00:01.112243905 +0000 UTC m=+1213.234443153" watchObservedRunningTime="2026-03-19 17:00:01.122925578 +0000 UTC m=+1213.245124826"
Mar 19 17:00:01 crc kubenswrapper[4918]: I0319 17:00:01.264865 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" podStartSLOduration=5.2648419650000005 podStartE2EDuration="5.264841965s" podCreationTimestamp="2026-03-19 16:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:00:01.18800245 +0000 UTC m=+1213.310201708" watchObservedRunningTime="2026-03-19 17:00:01.264841965 +0000 UTC m=+1213.387041213"
Mar 19 17:00:01 crc kubenswrapper[4918]: I0319 17:00:01.358085 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-zbhl6"]
Mar 19 17:00:01 crc kubenswrapper[4918]: I0319 17:00:01.386655 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-zbhl6"]
Mar 19 17:00:01 crc kubenswrapper[4918]: I0319 17:00:01.500148 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565660-72kgl"]
Mar 19 17:00:01 crc kubenswrapper[4918]: W0319 17:00:01.655098 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09b359fb_3f74_447c_a755_338a558fc429.slice/crio-32acbf44079890c23f642a663e3eebc3f20592819da4e482d8efb08cd51261ab WatchSource:0}: Error finding container 32acbf44079890c23f642a663e3eebc3f20592819da4e482d8efb08cd51261ab: Status 404 returned error can't find the container with id 32acbf44079890c23f642a663e3eebc3f20592819da4e482d8efb08cd51261ab
Mar 19 17:00:01 crc kubenswrapper[4918]: I0319 17:00:01.801779 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565660-xghlf"]
Mar 19 17:00:02 crc kubenswrapper[4918]: I0319 17:00:02.136258 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565660-72kgl" event={"ID":"09b359fb-3f74-447c-a755-338a558fc429","Type":"ContainerStarted","Data":"32acbf44079890c23f642a663e3eebc3f20592819da4e482d8efb08cd51261ab"}
Mar 19 17:00:02 crc kubenswrapper[4918]: I0319 17:00:02.139928 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3b04ffe9-4130-4428-ba7e-e41bed199d74","Type":"ContainerStarted","Data":"64380dae0d310b920c13e7f27000b4b0432304fd197382eb0983e89b8ce8707f"}
Mar 19 17:00:02 crc kubenswrapper[4918]: I0319 17:00:02.143037 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"383704eb-7b74-4878-ae73-eb5a1f85a49f","Type":"ContainerStarted","Data":"a33511089ce817e2c72a2b2f0c92182c9e954e778f91a30a6c2dd21b10249ebf"}
Mar 19 17:00:02 crc kubenswrapper[4918]: I0319 17:00:02.145577 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565660-xghlf" event={"ID":"d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0","Type":"ContainerStarted","Data":"1995908b32b5c0438e6b45353ccdbe0e4527fb4605493b6ea9e06fc416347337"}
Mar 19 17:00:02 crc kubenswrapper[4918]: I0319 17:00:02.146042 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2"
Mar 19 17:00:02 crc kubenswrapper[4918]: I0319 17:00:02.629511 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="543cd914-e3df-4dbe-a1ea-dc3adc4dcdda" path="/var/lib/kubelet/pods/543cd914-e3df-4dbe-a1ea-dc3adc4dcdda/volumes"
Mar 19 17:00:03 crc kubenswrapper[4918]: I0319 17:00:03.160076 4918 generic.go:334] "Generic (PLEG): container finished" podID="d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0" containerID="27bbbd91871ec1c6e6115be400bfdbe827f5eff11e7dc43f7a9066f923bfc02e" exitCode=0
Mar 19 17:00:03 crc kubenswrapper[4918]: I0319 17:00:03.160243 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565660-xghlf" event={"ID":"d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0","Type":"ContainerDied","Data":"27bbbd91871ec1c6e6115be400bfdbe827f5eff11e7dc43f7a9066f923bfc02e"}
Mar 19 17:00:03 crc kubenswrapper[4918]: I0319 17:00:03.164374 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3b04ffe9-4130-4428-ba7e-e41bed199d74","Type":"ContainerStarted","Data":"013a53f4be3e710992cbf91e3cd06ff86ea128ca10e3ce4de2d8795b71969104"}
Mar 19 17:00:03 crc kubenswrapper[4918]: I0319 17:00:03.164511 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3b04ffe9-4130-4428-ba7e-e41bed199d74" containerName="glance-log" containerID="cri-o://64380dae0d310b920c13e7f27000b4b0432304fd197382eb0983e89b8ce8707f" gracePeriod=30
Mar 19 17:00:03 crc kubenswrapper[4918]: I0319 17:00:03.164779 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3b04ffe9-4130-4428-ba7e-e41bed199d74" containerName="glance-httpd" containerID="cri-o://013a53f4be3e710992cbf91e3cd06ff86ea128ca10e3ce4de2d8795b71969104" gracePeriod=30
Mar 19 17:00:03 crc kubenswrapper[4918]: I0319 17:00:03.222456 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.222435038 podStartE2EDuration="7.222435038s" podCreationTimestamp="2026-03-19 16:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:00:03.21344645 +0000 UTC m=+1215.335645698" watchObservedRunningTime="2026-03-19 17:00:03.222435038 +0000 UTC m=+1215.344634286"
Mar 19 17:00:03 crc kubenswrapper[4918]: I0319 17:00:03.446574 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Mar 19 17:00:03 crc kubenswrapper[4918]: I0319 17:00:03.463856 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Mar 19 17:00:03 crc kubenswrapper[4918]: I0319 17:00:03.969567 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.109253 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b04ffe9-4130-4428-ba7e-e41bed199d74-scripts\") pod \"3b04ffe9-4130-4428-ba7e-e41bed199d74\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") "
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.109300 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3b04ffe9-4130-4428-ba7e-e41bed199d74-httpd-run\") pod \"3b04ffe9-4130-4428-ba7e-e41bed199d74\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") "
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.109337 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b04ffe9-4130-4428-ba7e-e41bed199d74-config-data\") pod \"3b04ffe9-4130-4428-ba7e-e41bed199d74\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") "
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.109392 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b04ffe9-4130-4428-ba7e-e41bed199d74-logs\") pod \"3b04ffe9-4130-4428-ba7e-e41bed199d74\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") "
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.109510 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\") pod \"3b04ffe9-4130-4428-ba7e-e41bed199d74\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") "
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.109603 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b04ffe9-4130-4428-ba7e-e41bed199d74-internal-tls-certs\") pod \"3b04ffe9-4130-4428-ba7e-e41bed199d74\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") "
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.109642 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b04ffe9-4130-4428-ba7e-e41bed199d74-combined-ca-bundle\") pod \"3b04ffe9-4130-4428-ba7e-e41bed199d74\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") "
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.109670 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg2cp\" (UniqueName: \"kubernetes.io/projected/3b04ffe9-4130-4428-ba7e-e41bed199d74-kube-api-access-rg2cp\") pod \"3b04ffe9-4130-4428-ba7e-e41bed199d74\" (UID: \"3b04ffe9-4130-4428-ba7e-e41bed199d74\") "
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.110100 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b04ffe9-4130-4428-ba7e-e41bed199d74-logs" (OuterVolumeSpecName: "logs") pod "3b04ffe9-4130-4428-ba7e-e41bed199d74" (UID: "3b04ffe9-4130-4428-ba7e-e41bed199d74"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.110488 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b04ffe9-4130-4428-ba7e-e41bed199d74-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3b04ffe9-4130-4428-ba7e-e41bed199d74" (UID: "3b04ffe9-4130-4428-ba7e-e41bed199d74"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.119473 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b04ffe9-4130-4428-ba7e-e41bed199d74-scripts" (OuterVolumeSpecName: "scripts") pod "3b04ffe9-4130-4428-ba7e-e41bed199d74" (UID: "3b04ffe9-4130-4428-ba7e-e41bed199d74"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.126587 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b04ffe9-4130-4428-ba7e-e41bed199d74-kube-api-access-rg2cp" (OuterVolumeSpecName: "kube-api-access-rg2cp") pod "3b04ffe9-4130-4428-ba7e-e41bed199d74" (UID: "3b04ffe9-4130-4428-ba7e-e41bed199d74"). InnerVolumeSpecName "kube-api-access-rg2cp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.142148 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37" (OuterVolumeSpecName: "glance") pod "3b04ffe9-4130-4428-ba7e-e41bed199d74" (UID: "3b04ffe9-4130-4428-ba7e-e41bed199d74"). InnerVolumeSpecName "pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.154072 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b04ffe9-4130-4428-ba7e-e41bed199d74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b04ffe9-4130-4428-ba7e-e41bed199d74" (UID: "3b04ffe9-4130-4428-ba7e-e41bed199d74"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.190165 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b04ffe9-4130-4428-ba7e-e41bed199d74-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3b04ffe9-4130-4428-ba7e-e41bed199d74" (UID: "3b04ffe9-4130-4428-ba7e-e41bed199d74"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.193555 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"383704eb-7b74-4878-ae73-eb5a1f85a49f","Type":"ContainerStarted","Data":"ecb60a6fce068304915b38a48846cb017c9c902a660869039bbe5f2f7c1ae150"}
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.193620 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="383704eb-7b74-4878-ae73-eb5a1f85a49f" containerName="glance-log" containerID="cri-o://a33511089ce817e2c72a2b2f0c92182c9e954e778f91a30a6c2dd21b10249ebf" gracePeriod=30
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.193760 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="383704eb-7b74-4878-ae73-eb5a1f85a49f" containerName="glance-httpd" containerID="cri-o://ecb60a6fce068304915b38a48846cb017c9c902a660869039bbe5f2f7c1ae150" gracePeriod=30
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.198681 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b04ffe9-4130-4428-ba7e-e41bed199d74-config-data" (OuterVolumeSpecName: "config-data") pod "3b04ffe9-4130-4428-ba7e-e41bed199d74" (UID: "3b04ffe9-4130-4428-ba7e-e41bed199d74"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.198951 4918 generic.go:334] "Generic (PLEG): container finished" podID="ca0f97bf-56d6-4dec-9727-b1d406e048c7" containerID="1b523cb6cbf897ec1d265fffd48b68c6452297fc90343e2da6d37a9c393593a8" exitCode=0
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.199033 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4qpsq" event={"ID":"ca0f97bf-56d6-4dec-9727-b1d406e048c7","Type":"ContainerDied","Data":"1b523cb6cbf897ec1d265fffd48b68c6452297fc90343e2da6d37a9c393593a8"}
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.201356 4918 generic.go:334] "Generic (PLEG): container finished" podID="3b04ffe9-4130-4428-ba7e-e41bed199d74" containerID="013a53f4be3e710992cbf91e3cd06ff86ea128ca10e3ce4de2d8795b71969104" exitCode=143
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.201397 4918 generic.go:334] "Generic (PLEG): container finished" podID="3b04ffe9-4130-4428-ba7e-e41bed199d74" containerID="64380dae0d310b920c13e7f27000b4b0432304fd197382eb0983e89b8ce8707f" exitCode=143
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.202580 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3b04ffe9-4130-4428-ba7e-e41bed199d74","Type":"ContainerDied","Data":"013a53f4be3e710992cbf91e3cd06ff86ea128ca10e3ce4de2d8795b71969104"}
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.202618 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3b04ffe9-4130-4428-ba7e-e41bed199d74","Type":"ContainerDied","Data":"64380dae0d310b920c13e7f27000b4b0432304fd197382eb0983e89b8ce8707f"}
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.202631 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3b04ffe9-4130-4428-ba7e-e41bed199d74","Type":"ContainerDied","Data":"1d4e2add4294ce38245a24034cda6c274dee58fd2d1e4f76fef46a47e01840dc"}
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.202648 4918 scope.go:117] "RemoveContainer" containerID="013a53f4be3e710992cbf91e3cd06ff86ea128ca10e3ce4de2d8795b71969104"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.202683 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.210034 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.214544 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.214511874 podStartE2EDuration="8.214511874s" podCreationTimestamp="2026-03-19 16:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:00:04.213108336 +0000 UTC m=+1216.335307584" watchObservedRunningTime="2026-03-19 17:00:04.214511874 +0000 UTC m=+1216.336711122"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.218633 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b04ffe9-4130-4428-ba7e-e41bed199d74-scripts\") on node \"crc\" DevicePath \"\""
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.218673 4918 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3b04ffe9-4130-4428-ba7e-e41bed199d74-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.218684 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b04ffe9-4130-4428-ba7e-e41bed199d74-config-data\") on node \"crc\" DevicePath \"\""
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.218700 4918 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b04ffe9-4130-4428-ba7e-e41bed199d74-logs\") on node \"crc\" DevicePath \"\""
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.218803 4918 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\") on node \"crc\" "
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.218825 4918 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b04ffe9-4130-4428-ba7e-e41bed199d74-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.218839 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b04ffe9-4130-4428-ba7e-e41bed199d74-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.218848 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg2cp\" (UniqueName: \"kubernetes.io/projected/3b04ffe9-4130-4428-ba7e-e41bed199d74-kube-api-access-rg2cp\") on node \"crc\" DevicePath \"\""
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.273952 4918 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.274154 4918 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37") on node "crc"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.324107 4918 reconciler_common.go:293] "Volume detached for volume \"pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\") on node \"crc\" DevicePath \"\""
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.375580 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.394459 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.425300 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 19 17:00:04 crc kubenswrapper[4918]: E0319 17:00:04.440918 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b04ffe9-4130-4428-ba7e-e41bed199d74" containerName="glance-httpd"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.440965 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b04ffe9-4130-4428-ba7e-e41bed199d74" containerName="glance-httpd"
Mar 19 17:00:04 crc kubenswrapper[4918]: E0319 17:00:04.440986 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b04ffe9-4130-4428-ba7e-e41bed199d74" containerName="glance-log"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.440991 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b04ffe9-4130-4428-ba7e-e41bed199d74" containerName="glance-log"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.441158 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b04ffe9-4130-4428-ba7e-e41bed199d74" containerName="glance-log"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.441180 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b04ffe9-4130-4428-ba7e-e41bed199d74" containerName="glance-httpd"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.442187 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.445948 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.446192 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.451814 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.540647 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\") pod \"glance-default-internal-api-0\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " pod="openstack/glance-default-internal-api-0"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.540704 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4m9n\" (UniqueName: \"kubernetes.io/projected/d659099f-7a05-4f1c-a097-67ecce42275d-kube-api-access-r4m9n\") pod \"glance-default-internal-api-0\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " pod="openstack/glance-default-internal-api-0"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.540746 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d659099f-7a05-4f1c-a097-67ecce42275d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " pod="openstack/glance-default-internal-api-0"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.540767 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d659099f-7a05-4f1c-a097-67ecce42275d-logs\") pod \"glance-default-internal-api-0\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " pod="openstack/glance-default-internal-api-0"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.540789 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d659099f-7a05-4f1c-a097-67ecce42275d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " pod="openstack/glance-default-internal-api-0"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.540817 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d659099f-7a05-4f1c-a097-67ecce42275d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " pod="openstack/glance-default-internal-api-0"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.540839 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d659099f-7a05-4f1c-a097-67ecce42275d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " pod="openstack/glance-default-internal-api-0"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.540885 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d659099f-7a05-4f1c-a097-67ecce42275d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " pod="openstack/glance-default-internal-api-0"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.631118 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b04ffe9-4130-4428-ba7e-e41bed199d74" path="/var/lib/kubelet/pods/3b04ffe9-4130-4428-ba7e-e41bed199d74/volumes"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.645578 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d659099f-7a05-4f1c-a097-67ecce42275d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " pod="openstack/glance-default-internal-api-0"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.645714 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\") pod \"glance-default-internal-api-0\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " pod="openstack/glance-default-internal-api-0"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.645752 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4m9n\" (UniqueName: \"kubernetes.io/projected/d659099f-7a05-4f1c-a097-67ecce42275d-kube-api-access-r4m9n\") pod \"glance-default-internal-api-0\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " pod="openstack/glance-default-internal-api-0"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.645798 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d659099f-7a05-4f1c-a097-67ecce42275d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " pod="openstack/glance-default-internal-api-0"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.645817 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d659099f-7a05-4f1c-a097-67ecce42275d-logs\") pod \"glance-default-internal-api-0\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " pod="openstack/glance-default-internal-api-0"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.645837 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d659099f-7a05-4f1c-a097-67ecce42275d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " pod="openstack/glance-default-internal-api-0"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.645863 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d659099f-7a05-4f1c-a097-67ecce42275d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " pod="openstack/glance-default-internal-api-0"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.645889 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d659099f-7a05-4f1c-a097-67ecce42275d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " pod="openstack/glance-default-internal-api-0"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.656761 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d659099f-7a05-4f1c-a097-67ecce42275d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " pod="openstack/glance-default-internal-api-0"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.666703 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d659099f-7a05-4f1c-a097-67ecce42275d-logs\") pod \"glance-default-internal-api-0\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " pod="openstack/glance-default-internal-api-0"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.709574 4918 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.709614 4918 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\") pod \"glance-default-internal-api-0\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ea238a48943c00d3b8fe8315d317d6aa508a60b77f6685b492c061941b28c63f/globalmount\"" pod="openstack/glance-default-internal-api-0"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.872114 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\") pod \"glance-default-internal-api-0\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " pod="openstack/glance-default-internal-api-0"
Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.944122 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d659099f-7a05-4f1c-a097-67ecce42275d-internal-tls-certs\") pod
\"glance-default-internal-api-0\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.944755 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d659099f-7a05-4f1c-a097-67ecce42275d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.953096 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d659099f-7a05-4f1c-a097-67ecce42275d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.955729 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d659099f-7a05-4f1c-a097-67ecce42275d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:00:04 crc kubenswrapper[4918]: I0319 17:00:04.957838 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4m9n\" (UniqueName: \"kubernetes.io/projected/d659099f-7a05-4f1c-a097-67ecce42275d-kube-api-access-r4m9n\") pod \"glance-default-internal-api-0\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:00:05 crc kubenswrapper[4918]: I0319 17:00:05.095101 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 17:00:05 crc kubenswrapper[4918]: I0319 17:00:05.225654 4918 generic.go:334] "Generic (PLEG): container finished" podID="383704eb-7b74-4878-ae73-eb5a1f85a49f" containerID="ecb60a6fce068304915b38a48846cb017c9c902a660869039bbe5f2f7c1ae150" exitCode=0 Mar 19 17:00:05 crc kubenswrapper[4918]: I0319 17:00:05.225701 4918 generic.go:334] "Generic (PLEG): container finished" podID="383704eb-7b74-4878-ae73-eb5a1f85a49f" containerID="a33511089ce817e2c72a2b2f0c92182c9e954e778f91a30a6c2dd21b10249ebf" exitCode=143 Mar 19 17:00:05 crc kubenswrapper[4918]: I0319 17:00:05.225780 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"383704eb-7b74-4878-ae73-eb5a1f85a49f","Type":"ContainerDied","Data":"ecb60a6fce068304915b38a48846cb017c9c902a660869039bbe5f2f7c1ae150"} Mar 19 17:00:05 crc kubenswrapper[4918]: I0319 17:00:05.225832 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"383704eb-7b74-4878-ae73-eb5a1f85a49f","Type":"ContainerDied","Data":"a33511089ce817e2c72a2b2f0c92182c9e954e778f91a30a6c2dd21b10249ebf"} Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.167671 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.212887 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565660-xghlf" Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.227978 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-pbmts"] Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.228199 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-pbmts" podUID="f01a3cda-80e8-4b64-9496-c1ec001e0e9d" containerName="dnsmasq-dns" containerID="cri-o://45c68780a1d7a2b68a09da826e86cebdba1ce0b0d9250d5f3a787e520da606eb" gracePeriod=10 Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.256107 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4qpsq" Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.268024 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565660-xghlf" event={"ID":"d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0","Type":"ContainerDied","Data":"1995908b32b5c0438e6b45353ccdbe0e4527fb4605493b6ea9e06fc416347337"} Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.268064 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1995908b32b5c0438e6b45353ccdbe0e4527fb4605493b6ea9e06fc416347337" Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.268114 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565660-xghlf" Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.286576 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4qpsq" event={"ID":"ca0f97bf-56d6-4dec-9727-b1d406e048c7","Type":"ContainerDied","Data":"5ba28ac6be3f209e6c54e2b6ef648cfc15b1b1e78e03b1bcb0d109471152c22e"} Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.286610 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ba28ac6be3f209e6c54e2b6ef648cfc15b1b1e78e03b1bcb0d109471152c22e" Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.286630 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4qpsq" Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.413266 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-combined-ca-bundle\") pod \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\" (UID: \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\") " Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.413340 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-fernet-keys\") pod \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\" (UID: \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\") " Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.413406 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0-config-volume\") pod \"d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0\" (UID: \"d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0\") " Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.413435 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-scripts\") pod \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\" (UID: \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\") " Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.413468 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8l77\" (UniqueName: \"kubernetes.io/projected/d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0-kube-api-access-w8l77\") pod \"d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0\" (UID: \"d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0\") " Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.413557 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0-secret-volume\") pod \"d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0\" (UID: \"d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0\") " Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.413606 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-config-data\") pod \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\" (UID: \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\") " Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.413658 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-credential-keys\") pod \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\" (UID: \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\") " Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.413693 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd9j9\" (UniqueName: \"kubernetes.io/projected/ca0f97bf-56d6-4dec-9727-b1d406e048c7-kube-api-access-cd9j9\") pod \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\" (UID: \"ca0f97bf-56d6-4dec-9727-b1d406e048c7\") " Mar 19 
17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.424162 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0-config-volume" (OuterVolumeSpecName: "config-volume") pod "d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0" (UID: "d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.425544 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-scripts" (OuterVolumeSpecName: "scripts") pod "ca0f97bf-56d6-4dec-9727-b1d406e048c7" (UID: "ca0f97bf-56d6-4dec-9727-b1d406e048c7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.429294 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca0f97bf-56d6-4dec-9727-b1d406e048c7-kube-api-access-cd9j9" (OuterVolumeSpecName: "kube-api-access-cd9j9") pod "ca0f97bf-56d6-4dec-9727-b1d406e048c7" (UID: "ca0f97bf-56d6-4dec-9727-b1d406e048c7"). InnerVolumeSpecName "kube-api-access-cd9j9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.429392 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0-kube-api-access-w8l77" (OuterVolumeSpecName: "kube-api-access-w8l77") pod "d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0" (UID: "d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0"). InnerVolumeSpecName "kube-api-access-w8l77". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.429885 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ca0f97bf-56d6-4dec-9727-b1d406e048c7" (UID: "ca0f97bf-56d6-4dec-9727-b1d406e048c7"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.429910 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ca0f97bf-56d6-4dec-9727-b1d406e048c7" (UID: "ca0f97bf-56d6-4dec-9727-b1d406e048c7"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.431200 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0" (UID: "d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.464405 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca0f97bf-56d6-4dec-9727-b1d406e048c7" (UID: "ca0f97bf-56d6-4dec-9727-b1d406e048c7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.470430 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-config-data" (OuterVolumeSpecName: "config-data") pod "ca0f97bf-56d6-4dec-9727-b1d406e048c7" (UID: "ca0f97bf-56d6-4dec-9727-b1d406e048c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.519268 4918 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.519617 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.519630 4918 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.519643 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cd9j9\" (UniqueName: \"kubernetes.io/projected/ca0f97bf-56d6-4dec-9727-b1d406e048c7-kube-api-access-cd9j9\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.519658 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.519670 4918 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.519681 4918 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.519692 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca0f97bf-56d6-4dec-9727-b1d406e048c7-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:07 crc kubenswrapper[4918]: I0319 17:00:07.519706 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8l77\" (UniqueName: \"kubernetes.io/projected/d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0-kube-api-access-w8l77\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.318252 4918 generic.go:334] "Generic (PLEG): container finished" podID="f01a3cda-80e8-4b64-9496-c1ec001e0e9d" containerID="45c68780a1d7a2b68a09da826e86cebdba1ce0b0d9250d5f3a787e520da606eb" exitCode=0 Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.318309 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-pbmts" event={"ID":"f01a3cda-80e8-4b64-9496-c1ec001e0e9d","Type":"ContainerDied","Data":"45c68780a1d7a2b68a09da826e86cebdba1ce0b0d9250d5f3a787e520da606eb"} Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.412468 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-4qpsq"] Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.423107 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-4qpsq"] Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.526687 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xbp92"] Mar 19 17:00:08 crc kubenswrapper[4918]: E0319 
17:00:08.527187 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca0f97bf-56d6-4dec-9727-b1d406e048c7" containerName="keystone-bootstrap" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.527209 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca0f97bf-56d6-4dec-9727-b1d406e048c7" containerName="keystone-bootstrap" Mar 19 17:00:08 crc kubenswrapper[4918]: E0319 17:00:08.527230 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0" containerName="collect-profiles" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.527236 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0" containerName="collect-profiles" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.527444 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0" containerName="collect-profiles" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.527471 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca0f97bf-56d6-4dec-9727-b1d406e048c7" containerName="keystone-bootstrap" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.528146 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xbp92" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.528919 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xbp92"] Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.535038 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.535154 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.535304 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.537005 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-d72v8" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.543537 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-pbmts" podUID="f01a3cda-80e8-4b64-9496-c1ec001e0e9d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: connect: connection refused" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.597063 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca0f97bf-56d6-4dec-9727-b1d406e048c7" path="/var/lib/kubelet/pods/ca0f97bf-56d6-4dec-9727-b1d406e048c7/volumes" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.643633 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-config-data\") pod \"keystone-bootstrap-xbp92\" (UID: \"044c141c-5c54-4e8c-a592-497a22f6f4db\") " pod="openstack/keystone-bootstrap-xbp92" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.643750 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-combined-ca-bundle\") pod \"keystone-bootstrap-xbp92\" (UID: \"044c141c-5c54-4e8c-a592-497a22f6f4db\") " pod="openstack/keystone-bootstrap-xbp92" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.643786 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-credential-keys\") pod \"keystone-bootstrap-xbp92\" (UID: \"044c141c-5c54-4e8c-a592-497a22f6f4db\") " pod="openstack/keystone-bootstrap-xbp92" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.643845 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4ssp\" (UniqueName: \"kubernetes.io/projected/044c141c-5c54-4e8c-a592-497a22f6f4db-kube-api-access-z4ssp\") pod \"keystone-bootstrap-xbp92\" (UID: \"044c141c-5c54-4e8c-a592-497a22f6f4db\") " pod="openstack/keystone-bootstrap-xbp92" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.643871 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-fernet-keys\") pod \"keystone-bootstrap-xbp92\" (UID: \"044c141c-5c54-4e8c-a592-497a22f6f4db\") " pod="openstack/keystone-bootstrap-xbp92" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.643927 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-scripts\") pod \"keystone-bootstrap-xbp92\" (UID: \"044c141c-5c54-4e8c-a592-497a22f6f4db\") " pod="openstack/keystone-bootstrap-xbp92" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.746449 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-combined-ca-bundle\") pod \"keystone-bootstrap-xbp92\" (UID: \"044c141c-5c54-4e8c-a592-497a22f6f4db\") " pod="openstack/keystone-bootstrap-xbp92" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.746571 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-credential-keys\") pod \"keystone-bootstrap-xbp92\" (UID: \"044c141c-5c54-4e8c-a592-497a22f6f4db\") " pod="openstack/keystone-bootstrap-xbp92" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.749756 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4ssp\" (UniqueName: \"kubernetes.io/projected/044c141c-5c54-4e8c-a592-497a22f6f4db-kube-api-access-z4ssp\") pod \"keystone-bootstrap-xbp92\" (UID: \"044c141c-5c54-4e8c-a592-497a22f6f4db\") " pod="openstack/keystone-bootstrap-xbp92" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.749806 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-fernet-keys\") pod \"keystone-bootstrap-xbp92\" (UID: \"044c141c-5c54-4e8c-a592-497a22f6f4db\") " pod="openstack/keystone-bootstrap-xbp92" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.749893 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-scripts\") pod \"keystone-bootstrap-xbp92\" (UID: \"044c141c-5c54-4e8c-a592-497a22f6f4db\") " pod="openstack/keystone-bootstrap-xbp92" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.749916 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-config-data\") pod \"keystone-bootstrap-xbp92\" (UID: \"044c141c-5c54-4e8c-a592-497a22f6f4db\") " pod="openstack/keystone-bootstrap-xbp92" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.753464 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-credential-keys\") pod \"keystone-bootstrap-xbp92\" (UID: \"044c141c-5c54-4e8c-a592-497a22f6f4db\") " pod="openstack/keystone-bootstrap-xbp92" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.753749 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-combined-ca-bundle\") pod \"keystone-bootstrap-xbp92\" (UID: \"044c141c-5c54-4e8c-a592-497a22f6f4db\") " pod="openstack/keystone-bootstrap-xbp92" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.754899 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-fernet-keys\") pod \"keystone-bootstrap-xbp92\" (UID: \"044c141c-5c54-4e8c-a592-497a22f6f4db\") " pod="openstack/keystone-bootstrap-xbp92" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.756356 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-scripts\") pod \"keystone-bootstrap-xbp92\" (UID: \"044c141c-5c54-4e8c-a592-497a22f6f4db\") " pod="openstack/keystone-bootstrap-xbp92" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.758287 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-config-data\") pod \"keystone-bootstrap-xbp92\" (UID: \"044c141c-5c54-4e8c-a592-497a22f6f4db\") " 
pod="openstack/keystone-bootstrap-xbp92" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.768406 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4ssp\" (UniqueName: \"kubernetes.io/projected/044c141c-5c54-4e8c-a592-497a22f6f4db-kube-api-access-z4ssp\") pod \"keystone-bootstrap-xbp92\" (UID: \"044c141c-5c54-4e8c-a592-497a22f6f4db\") " pod="openstack/keystone-bootstrap-xbp92" Mar 19 17:00:08 crc kubenswrapper[4918]: I0319 17:00:08.853321 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xbp92" Mar 19 17:00:11 crc kubenswrapper[4918]: I0319 17:00:11.654021 4918 scope.go:117] "RemoveContainer" containerID="64380dae0d310b920c13e7f27000b4b0432304fd197382eb0983e89b8ce8707f" Mar 19 17:00:11 crc kubenswrapper[4918]: I0319 17:00:11.774646 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 17:00:11 crc kubenswrapper[4918]: I0319 17:00:11.910301 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/383704eb-7b74-4878-ae73-eb5a1f85a49f-logs\") pod \"383704eb-7b74-4878-ae73-eb5a1f85a49f\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " Mar 19 17:00:11 crc kubenswrapper[4918]: I0319 17:00:11.910457 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/383704eb-7b74-4878-ae73-eb5a1f85a49f-combined-ca-bundle\") pod \"383704eb-7b74-4878-ae73-eb5a1f85a49f\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " Mar 19 17:00:11 crc kubenswrapper[4918]: I0319 17:00:11.910502 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/383704eb-7b74-4878-ae73-eb5a1f85a49f-logs" (OuterVolumeSpecName: "logs") pod "383704eb-7b74-4878-ae73-eb5a1f85a49f" (UID: "383704eb-7b74-4878-ae73-eb5a1f85a49f"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:00:11 crc kubenswrapper[4918]: I0319 17:00:11.910507 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/383704eb-7b74-4878-ae73-eb5a1f85a49f-httpd-run\") pod \"383704eb-7b74-4878-ae73-eb5a1f85a49f\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " Mar 19 17:00:11 crc kubenswrapper[4918]: I0319 17:00:11.910611 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/383704eb-7b74-4878-ae73-eb5a1f85a49f-scripts\") pod \"383704eb-7b74-4878-ae73-eb5a1f85a49f\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " Mar 19 17:00:11 crc kubenswrapper[4918]: I0319 17:00:11.910688 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/383704eb-7b74-4878-ae73-eb5a1f85a49f-public-tls-certs\") pod \"383704eb-7b74-4878-ae73-eb5a1f85a49f\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " Mar 19 17:00:11 crc kubenswrapper[4918]: I0319 17:00:11.910944 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/383704eb-7b74-4878-ae73-eb5a1f85a49f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "383704eb-7b74-4878-ae73-eb5a1f85a49f" (UID: "383704eb-7b74-4878-ae73-eb5a1f85a49f"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:00:11 crc kubenswrapper[4918]: I0319 17:00:11.911023 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/383704eb-7b74-4878-ae73-eb5a1f85a49f-config-data\") pod \"383704eb-7b74-4878-ae73-eb5a1f85a49f\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " Mar 19 17:00:11 crc kubenswrapper[4918]: I0319 17:00:11.911218 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6szj\" (UniqueName: \"kubernetes.io/projected/383704eb-7b74-4878-ae73-eb5a1f85a49f-kube-api-access-t6szj\") pod \"383704eb-7b74-4878-ae73-eb5a1f85a49f\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " Mar 19 17:00:11 crc kubenswrapper[4918]: I0319 17:00:11.911367 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\") pod \"383704eb-7b74-4878-ae73-eb5a1f85a49f\" (UID: \"383704eb-7b74-4878-ae73-eb5a1f85a49f\") " Mar 19 17:00:11 crc kubenswrapper[4918]: I0319 17:00:11.911864 4918 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/383704eb-7b74-4878-ae73-eb5a1f85a49f-logs\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:11 crc kubenswrapper[4918]: I0319 17:00:11.911882 4918 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/383704eb-7b74-4878-ae73-eb5a1f85a49f-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:11 crc kubenswrapper[4918]: I0319 17:00:11.919364 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/383704eb-7b74-4878-ae73-eb5a1f85a49f-kube-api-access-t6szj" (OuterVolumeSpecName: "kube-api-access-t6szj") pod "383704eb-7b74-4878-ae73-eb5a1f85a49f" (UID: "383704eb-7b74-4878-ae73-eb5a1f85a49f"). 
InnerVolumeSpecName "kube-api-access-t6szj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:00:11 crc kubenswrapper[4918]: I0319 17:00:11.919461 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/383704eb-7b74-4878-ae73-eb5a1f85a49f-scripts" (OuterVolumeSpecName: "scripts") pod "383704eb-7b74-4878-ae73-eb5a1f85a49f" (UID: "383704eb-7b74-4878-ae73-eb5a1f85a49f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:11 crc kubenswrapper[4918]: I0319 17:00:11.964073 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/383704eb-7b74-4878-ae73-eb5a1f85a49f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "383704eb-7b74-4878-ae73-eb5a1f85a49f" (UID: "383704eb-7b74-4878-ae73-eb5a1f85a49f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:11 crc kubenswrapper[4918]: I0319 17:00:11.971939 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d51f6764-e413-494d-9e7a-c5583dec6a22" (OuterVolumeSpecName: "glance") pod "383704eb-7b74-4878-ae73-eb5a1f85a49f" (UID: "383704eb-7b74-4878-ae73-eb5a1f85a49f"). InnerVolumeSpecName "pvc-d51f6764-e413-494d-9e7a-c5583dec6a22". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 17:00:11 crc kubenswrapper[4918]: I0319 17:00:11.993845 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/383704eb-7b74-4878-ae73-eb5a1f85a49f-config-data" (OuterVolumeSpecName: "config-data") pod "383704eb-7b74-4878-ae73-eb5a1f85a49f" (UID: "383704eb-7b74-4878-ae73-eb5a1f85a49f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.013955 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/383704eb-7b74-4878-ae73-eb5a1f85a49f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.013983 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/383704eb-7b74-4878-ae73-eb5a1f85a49f-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.013993 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/383704eb-7b74-4878-ae73-eb5a1f85a49f-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.014001 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6szj\" (UniqueName: \"kubernetes.io/projected/383704eb-7b74-4878-ae73-eb5a1f85a49f-kube-api-access-t6szj\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.014031 4918 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\") on node \"crc\" " Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.042182 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/383704eb-7b74-4878-ae73-eb5a1f85a49f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "383704eb-7b74-4878-ae73-eb5a1f85a49f" (UID: "383704eb-7b74-4878-ae73-eb5a1f85a49f"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.049430 4918 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.049597 4918 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d51f6764-e413-494d-9e7a-c5583dec6a22" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d51f6764-e413-494d-9e7a-c5583dec6a22") on node "crc" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.115874 4918 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/383704eb-7b74-4878-ae73-eb5a1f85a49f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.115904 4918 reconciler_common.go:293] "Volume detached for volume \"pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.361209 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"383704eb-7b74-4878-ae73-eb5a1f85a49f","Type":"ContainerDied","Data":"c70b9d86618c2fb0a56bfcd147a1ab28b87aa2e6072c0bb48b0774d60935afe0"} Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.361242 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.402939 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.415604 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.431914 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 17:00:12 crc kubenswrapper[4918]: E0319 17:00:12.432377 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="383704eb-7b74-4878-ae73-eb5a1f85a49f" containerName="glance-log" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.432391 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="383704eb-7b74-4878-ae73-eb5a1f85a49f" containerName="glance-log" Mar 19 17:00:12 crc kubenswrapper[4918]: E0319 17:00:12.432415 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="383704eb-7b74-4878-ae73-eb5a1f85a49f" containerName="glance-httpd" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.432421 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="383704eb-7b74-4878-ae73-eb5a1f85a49f" containerName="glance-httpd" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.432617 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="383704eb-7b74-4878-ae73-eb5a1f85a49f" containerName="glance-httpd" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.432629 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="383704eb-7b74-4878-ae73-eb5a1f85a49f" containerName="glance-log" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.433714 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.436473 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.437014 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.470906 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.600201 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="383704eb-7b74-4878-ae73-eb5a1f85a49f" path="/var/lib/kubelet/pods/383704eb-7b74-4878-ae73-eb5a1f85a49f/volumes" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.625560 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp6hd\" (UniqueName: \"kubernetes.io/projected/eee99f54-a76f-416d-a14f-cebf9d11548b-kube-api-access-bp6hd\") pod \"glance-default-external-api-0\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") " pod="openstack/glance-default-external-api-0" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.625606 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eee99f54-a76f-416d-a14f-cebf9d11548b-logs\") pod \"glance-default-external-api-0\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") " pod="openstack/glance-default-external-api-0" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.625653 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee99f54-a76f-416d-a14f-cebf9d11548b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"eee99f54-a76f-416d-a14f-cebf9d11548b\") " pod="openstack/glance-default-external-api-0" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.625867 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eee99f54-a76f-416d-a14f-cebf9d11548b-config-data\") pod \"glance-default-external-api-0\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") " pod="openstack/glance-default-external-api-0" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.625919 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eee99f54-a76f-416d-a14f-cebf9d11548b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") " pod="openstack/glance-default-external-api-0" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.626404 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eee99f54-a76f-416d-a14f-cebf9d11548b-scripts\") pod \"glance-default-external-api-0\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") " pod="openstack/glance-default-external-api-0" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.626664 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\") pod \"glance-default-external-api-0\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") " pod="openstack/glance-default-external-api-0" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.626704 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eee99f54-a76f-416d-a14f-cebf9d11548b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") " pod="openstack/glance-default-external-api-0" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.728724 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eee99f54-a76f-416d-a14f-cebf9d11548b-config-data\") pod \"glance-default-external-api-0\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") " pod="openstack/glance-default-external-api-0" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.728773 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eee99f54-a76f-416d-a14f-cebf9d11548b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") " pod="openstack/glance-default-external-api-0" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.728871 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eee99f54-a76f-416d-a14f-cebf9d11548b-scripts\") pod \"glance-default-external-api-0\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") " pod="openstack/glance-default-external-api-0" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.729438 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eee99f54-a76f-416d-a14f-cebf9d11548b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") " pod="openstack/glance-default-external-api-0" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.728936 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\") pod \"glance-default-external-api-0\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") " pod="openstack/glance-default-external-api-0" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.729704 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee99f54-a76f-416d-a14f-cebf9d11548b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") " pod="openstack/glance-default-external-api-0" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.730157 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp6hd\" (UniqueName: \"kubernetes.io/projected/eee99f54-a76f-416d-a14f-cebf9d11548b-kube-api-access-bp6hd\") pod \"glance-default-external-api-0\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") " pod="openstack/glance-default-external-api-0" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.730226 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eee99f54-a76f-416d-a14f-cebf9d11548b-logs\") pod \"glance-default-external-api-0\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") " pod="openstack/glance-default-external-api-0" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.730304 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee99f54-a76f-416d-a14f-cebf9d11548b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") " pod="openstack/glance-default-external-api-0" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.730631 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/eee99f54-a76f-416d-a14f-cebf9d11548b-logs\") pod \"glance-default-external-api-0\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") " pod="openstack/glance-default-external-api-0" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.731861 4918 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.731890 4918 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\") pod \"glance-default-external-api-0\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/342db21bb7c2e49b22a24134653a3d87a173d64abb89a0070323b0a8e0ff9956/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.734504 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eee99f54-a76f-416d-a14f-cebf9d11548b-scripts\") pod \"glance-default-external-api-0\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") " pod="openstack/glance-default-external-api-0" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.735596 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee99f54-a76f-416d-a14f-cebf9d11548b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") " pod="openstack/glance-default-external-api-0" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.739155 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eee99f54-a76f-416d-a14f-cebf9d11548b-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") " pod="openstack/glance-default-external-api-0" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.741648 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee99f54-a76f-416d-a14f-cebf9d11548b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") " pod="openstack/glance-default-external-api-0" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.754087 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp6hd\" (UniqueName: \"kubernetes.io/projected/eee99f54-a76f-416d-a14f-cebf9d11548b-kube-api-access-bp6hd\") pod \"glance-default-external-api-0\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") " pod="openstack/glance-default-external-api-0" Mar 19 17:00:12 crc kubenswrapper[4918]: I0319 17:00:12.774830 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\") pod \"glance-default-external-api-0\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") " pod="openstack/glance-default-external-api-0" Mar 19 17:00:13 crc kubenswrapper[4918]: I0319 17:00:13.051660 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 17:00:18 crc kubenswrapper[4918]: I0319 17:00:18.543660 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-pbmts" podUID="f01a3cda-80e8-4b64-9496-c1ec001e0e9d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: i/o timeout" Mar 19 17:00:22 crc kubenswrapper[4918]: I0319 17:00:22.464985 4918 generic.go:334] "Generic (PLEG): container finished" podID="3cb250eb-e1c3-4a48-bf07-8cf4504466fb" containerID="7093aa8e10a56c9917db6b81465c1586af7d9d5bdaac5bb85d7c30092080d813" exitCode=0 Mar 19 17:00:22 crc kubenswrapper[4918]: I0319 17:00:22.465087 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-s9rgt" event={"ID":"3cb250eb-e1c3-4a48-bf07-8cf4504466fb","Type":"ContainerDied","Data":"7093aa8e10a56c9917db6b81465c1586af7d9d5bdaac5bb85d7c30092080d813"} Mar 19 17:00:23 crc kubenswrapper[4918]: I0319 17:00:23.544832 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-pbmts" podUID="f01a3cda-80e8-4b64-9496-c1ec001e0e9d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: i/o timeout" Mar 19 17:00:23 crc kubenswrapper[4918]: I0319 17:00:23.545419 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-pbmts" Mar 19 17:00:23 crc kubenswrapper[4918]: E0319 17:00:23.837616 4918 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Mar 19 17:00:23 crc kubenswrapper[4918]: E0319 17:00:23.837805 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n58bh57h668h597h655h87hbdhcfh5d4h566h69h54bh589h55bh585h5b4h5fdh5c4h54fhf7h67bh587h65dh5f8h64fh65h56dhf5h5d4h8dhf9h699q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-spf6p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(5ce9dad6-2fa1-48f8-bd79-b114097ef3be): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 17:00:24 crc kubenswrapper[4918]: E0319 17:00:24.307668 4918 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 19 17:00:24 crc kubenswrapper[4918]: E0319 17:00:24.307831 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k96q5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-wtk47_openstack(ec2f9e01-6e64-4c5d-93d4-8428ae776a4e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 17:00:24 crc kubenswrapper[4918]: E0319 17:00:24.313498 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-wtk47" 
podUID="ec2f9e01-6e64-4c5d-93d4-8428ae776a4e" Mar 19 17:00:24 crc kubenswrapper[4918]: I0319 17:00:24.427478 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-pbmts" Mar 19 17:00:24 crc kubenswrapper[4918]: I0319 17:00:24.493791 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-pbmts" Mar 19 17:00:24 crc kubenswrapper[4918]: I0319 17:00:24.493833 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-pbmts" event={"ID":"f01a3cda-80e8-4b64-9496-c1ec001e0e9d","Type":"ContainerDied","Data":"fe1adf8ab10e7380083f67a548df01d3d034d98f84df8f4533f5614d953d6640"} Mar 19 17:00:24 crc kubenswrapper[4918]: E0319 17:00:24.494603 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-wtk47" podUID="ec2f9e01-6e64-4c5d-93d4-8428ae776a4e" Mar 19 17:00:24 crc kubenswrapper[4918]: I0319 17:00:24.566413 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-dns-svc\") pod \"f01a3cda-80e8-4b64-9496-c1ec001e0e9d\" (UID: \"f01a3cda-80e8-4b64-9496-c1ec001e0e9d\") " Mar 19 17:00:24 crc kubenswrapper[4918]: I0319 17:00:24.566565 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhvmh\" (UniqueName: \"kubernetes.io/projected/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-kube-api-access-jhvmh\") pod \"f01a3cda-80e8-4b64-9496-c1ec001e0e9d\" (UID: \"f01a3cda-80e8-4b64-9496-c1ec001e0e9d\") " Mar 19 17:00:24 crc kubenswrapper[4918]: I0319 17:00:24.566735 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-config\") pod \"f01a3cda-80e8-4b64-9496-c1ec001e0e9d\" (UID: \"f01a3cda-80e8-4b64-9496-c1ec001e0e9d\") " Mar 19 17:00:24 crc kubenswrapper[4918]: I0319 17:00:24.566764 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-ovsdbserver-nb\") pod \"f01a3cda-80e8-4b64-9496-c1ec001e0e9d\" (UID: \"f01a3cda-80e8-4b64-9496-c1ec001e0e9d\") " Mar 19 17:00:24 crc kubenswrapper[4918]: I0319 17:00:24.566849 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-ovsdbserver-sb\") pod \"f01a3cda-80e8-4b64-9496-c1ec001e0e9d\" (UID: \"f01a3cda-80e8-4b64-9496-c1ec001e0e9d\") " Mar 19 17:00:24 crc kubenswrapper[4918]: I0319 17:00:24.571948 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-kube-api-access-jhvmh" (OuterVolumeSpecName: "kube-api-access-jhvmh") pod "f01a3cda-80e8-4b64-9496-c1ec001e0e9d" (UID: "f01a3cda-80e8-4b64-9496-c1ec001e0e9d"). InnerVolumeSpecName "kube-api-access-jhvmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:00:24 crc kubenswrapper[4918]: I0319 17:00:24.614744 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-config" (OuterVolumeSpecName: "config") pod "f01a3cda-80e8-4b64-9496-c1ec001e0e9d" (UID: "f01a3cda-80e8-4b64-9496-c1ec001e0e9d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:00:24 crc kubenswrapper[4918]: I0319 17:00:24.618054 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f01a3cda-80e8-4b64-9496-c1ec001e0e9d" (UID: "f01a3cda-80e8-4b64-9496-c1ec001e0e9d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:00:24 crc kubenswrapper[4918]: I0319 17:00:24.625064 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f01a3cda-80e8-4b64-9496-c1ec001e0e9d" (UID: "f01a3cda-80e8-4b64-9496-c1ec001e0e9d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:00:24 crc kubenswrapper[4918]: I0319 17:00:24.630000 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f01a3cda-80e8-4b64-9496-c1ec001e0e9d" (UID: "f01a3cda-80e8-4b64-9496-c1ec001e0e9d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:00:24 crc kubenswrapper[4918]: I0319 17:00:24.669338 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-config\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:24 crc kubenswrapper[4918]: I0319 17:00:24.669368 4918 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:24 crc kubenswrapper[4918]: I0319 17:00:24.669383 4918 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:24 crc kubenswrapper[4918]: I0319 17:00:24.669397 4918 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:24 crc kubenswrapper[4918]: I0319 17:00:24.669410 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhvmh\" (UniqueName: \"kubernetes.io/projected/f01a3cda-80e8-4b64-9496-c1ec001e0e9d-kube-api-access-jhvmh\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:24 crc kubenswrapper[4918]: I0319 17:00:24.798565 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 17:00:24 crc kubenswrapper[4918]: I0319 17:00:24.822769 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-pbmts"] Mar 19 17:00:24 crc kubenswrapper[4918]: I0319 17:00:24.830430 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-pbmts"] Mar 19 17:00:24 crc kubenswrapper[4918]: I0319 17:00:24.849164 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-s9rgt" Mar 19 17:00:24 crc kubenswrapper[4918]: I0319 17:00:24.975025 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cb250eb-e1c3-4a48-bf07-8cf4504466fb-combined-ca-bundle\") pod \"3cb250eb-e1c3-4a48-bf07-8cf4504466fb\" (UID: \"3cb250eb-e1c3-4a48-bf07-8cf4504466fb\") " Mar 19 17:00:24 crc kubenswrapper[4918]: I0319 17:00:24.975107 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3cb250eb-e1c3-4a48-bf07-8cf4504466fb-config\") pod \"3cb250eb-e1c3-4a48-bf07-8cf4504466fb\" (UID: \"3cb250eb-e1c3-4a48-bf07-8cf4504466fb\") " Mar 19 17:00:24 crc kubenswrapper[4918]: I0319 17:00:24.975304 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzn8b\" (UniqueName: \"kubernetes.io/projected/3cb250eb-e1c3-4a48-bf07-8cf4504466fb-kube-api-access-gzn8b\") pod \"3cb250eb-e1c3-4a48-bf07-8cf4504466fb\" (UID: \"3cb250eb-e1c3-4a48-bf07-8cf4504466fb\") " Mar 19 17:00:24 crc kubenswrapper[4918]: I0319 17:00:24.978762 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb250eb-e1c3-4a48-bf07-8cf4504466fb-kube-api-access-gzn8b" (OuterVolumeSpecName: "kube-api-access-gzn8b") pod "3cb250eb-e1c3-4a48-bf07-8cf4504466fb" (UID: "3cb250eb-e1c3-4a48-bf07-8cf4504466fb"). InnerVolumeSpecName "kube-api-access-gzn8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:00:25 crc kubenswrapper[4918]: I0319 17:00:25.001657 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cb250eb-e1c3-4a48-bf07-8cf4504466fb-config" (OuterVolumeSpecName: "config") pod "3cb250eb-e1c3-4a48-bf07-8cf4504466fb" (UID: "3cb250eb-e1c3-4a48-bf07-8cf4504466fb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:25 crc kubenswrapper[4918]: I0319 17:00:25.001699 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cb250eb-e1c3-4a48-bf07-8cf4504466fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3cb250eb-e1c3-4a48-bf07-8cf4504466fb" (UID: "3cb250eb-e1c3-4a48-bf07-8cf4504466fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:25 crc kubenswrapper[4918]: I0319 17:00:25.078709 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzn8b\" (UniqueName: \"kubernetes.io/projected/3cb250eb-e1c3-4a48-bf07-8cf4504466fb-kube-api-access-gzn8b\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:25 crc kubenswrapper[4918]: I0319 17:00:25.078749 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cb250eb-e1c3-4a48-bf07-8cf4504466fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:25 crc kubenswrapper[4918]: I0319 17:00:25.078761 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3cb250eb-e1c3-4a48-bf07-8cf4504466fb-config\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:25 crc kubenswrapper[4918]: I0319 17:00:25.505713 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-s9rgt" event={"ID":"3cb250eb-e1c3-4a48-bf07-8cf4504466fb","Type":"ContainerDied","Data":"3f33ca37b0eae21e86fbfb6b18b226d34922613d3ae125cba2178fe13ae15ae9"} Mar 19 17:00:25 crc kubenswrapper[4918]: I0319 17:00:25.505963 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f33ca37b0eae21e86fbfb6b18b226d34922613d3ae125cba2178fe13ae15ae9" Mar 19 17:00:25 crc kubenswrapper[4918]: I0319 17:00:25.505912 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-s9rgt" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.040228 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-frrgq"] Mar 19 17:00:26 crc kubenswrapper[4918]: E0319 17:00:26.040640 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cb250eb-e1c3-4a48-bf07-8cf4504466fb" containerName="neutron-db-sync" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.040653 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cb250eb-e1c3-4a48-bf07-8cf4504466fb" containerName="neutron-db-sync" Mar 19 17:00:26 crc kubenswrapper[4918]: E0319 17:00:26.040671 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f01a3cda-80e8-4b64-9496-c1ec001e0e9d" containerName="init" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.040677 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="f01a3cda-80e8-4b64-9496-c1ec001e0e9d" containerName="init" Mar 19 17:00:26 crc kubenswrapper[4918]: E0319 17:00:26.040699 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f01a3cda-80e8-4b64-9496-c1ec001e0e9d" containerName="dnsmasq-dns" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.040705 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="f01a3cda-80e8-4b64-9496-c1ec001e0e9d" containerName="dnsmasq-dns" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.040867 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cb250eb-e1c3-4a48-bf07-8cf4504466fb" containerName="neutron-db-sync" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.040880 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="f01a3cda-80e8-4b64-9496-c1ec001e0e9d" containerName="dnsmasq-dns" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.041832 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-frrgq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.054415 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-frrgq"] Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.107862 4918 scope.go:117] "RemoveContainer" containerID="013a53f4be3e710992cbf91e3cd06ff86ea128ca10e3ce4de2d8795b71969104" Mar 19 17:00:26 crc kubenswrapper[4918]: E0319 17:00:26.113240 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"013a53f4be3e710992cbf91e3cd06ff86ea128ca10e3ce4de2d8795b71969104\": container with ID starting with 013a53f4be3e710992cbf91e3cd06ff86ea128ca10e3ce4de2d8795b71969104 not found: ID does not exist" containerID="013a53f4be3e710992cbf91e3cd06ff86ea128ca10e3ce4de2d8795b71969104" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.113300 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"013a53f4be3e710992cbf91e3cd06ff86ea128ca10e3ce4de2d8795b71969104"} err="failed to get container status \"013a53f4be3e710992cbf91e3cd06ff86ea128ca10e3ce4de2d8795b71969104\": rpc error: code = NotFound desc = could not find container \"013a53f4be3e710992cbf91e3cd06ff86ea128ca10e3ce4de2d8795b71969104\": container with ID starting with 013a53f4be3e710992cbf91e3cd06ff86ea128ca10e3ce4de2d8795b71969104 not found: ID does not exist" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.113336 4918 scope.go:117] "RemoveContainer" containerID="64380dae0d310b920c13e7f27000b4b0432304fd197382eb0983e89b8ce8707f" Mar 19 17:00:26 crc kubenswrapper[4918]: E0319 17:00:26.121449 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64380dae0d310b920c13e7f27000b4b0432304fd197382eb0983e89b8ce8707f\": container with ID starting with 
64380dae0d310b920c13e7f27000b4b0432304fd197382eb0983e89b8ce8707f not found: ID does not exist" containerID="64380dae0d310b920c13e7f27000b4b0432304fd197382eb0983e89b8ce8707f" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.121612 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64380dae0d310b920c13e7f27000b4b0432304fd197382eb0983e89b8ce8707f"} err="failed to get container status \"64380dae0d310b920c13e7f27000b4b0432304fd197382eb0983e89b8ce8707f\": rpc error: code = NotFound desc = could not find container \"64380dae0d310b920c13e7f27000b4b0432304fd197382eb0983e89b8ce8707f\": container with ID starting with 64380dae0d310b920c13e7f27000b4b0432304fd197382eb0983e89b8ce8707f not found: ID does not exist" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.121702 4918 scope.go:117] "RemoveContainer" containerID="013a53f4be3e710992cbf91e3cd06ff86ea128ca10e3ce4de2d8795b71969104" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.122135 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"013a53f4be3e710992cbf91e3cd06ff86ea128ca10e3ce4de2d8795b71969104"} err="failed to get container status \"013a53f4be3e710992cbf91e3cd06ff86ea128ca10e3ce4de2d8795b71969104\": rpc error: code = NotFound desc = could not find container \"013a53f4be3e710992cbf91e3cd06ff86ea128ca10e3ce4de2d8795b71969104\": container with ID starting with 013a53f4be3e710992cbf91e3cd06ff86ea128ca10e3ce4de2d8795b71969104 not found: ID does not exist" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.122182 4918 scope.go:117] "RemoveContainer" containerID="64380dae0d310b920c13e7f27000b4b0432304fd197382eb0983e89b8ce8707f" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.122565 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64380dae0d310b920c13e7f27000b4b0432304fd197382eb0983e89b8ce8707f"} err="failed to get container status 
\"64380dae0d310b920c13e7f27000b4b0432304fd197382eb0983e89b8ce8707f\": rpc error: code = NotFound desc = could not find container \"64380dae0d310b920c13e7f27000b4b0432304fd197382eb0983e89b8ce8707f\": container with ID starting with 64380dae0d310b920c13e7f27000b4b0432304fd197382eb0983e89b8ce8707f not found: ID does not exist" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.122587 4918 scope.go:117] "RemoveContainer" containerID="ecb60a6fce068304915b38a48846cb017c9c902a660869039bbe5f2f7c1ae150" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.157217 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5bb7fd774d-vnxdq"] Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.162307 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5bb7fd774d-vnxdq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.166760 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-6zfw8" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.166917 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.167026 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.167325 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.177223 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5bb7fd774d-vnxdq"] Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.204885 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-frrgq\" (UID: 
\"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\") " pod="openstack/dnsmasq-dns-55f844cf75-frrgq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.204978 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-frrgq\" (UID: \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\") " pod="openstack/dnsmasq-dns-55f844cf75-frrgq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.205091 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-dns-svc\") pod \"dnsmasq-dns-55f844cf75-frrgq\" (UID: \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\") " pod="openstack/dnsmasq-dns-55f844cf75-frrgq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.205127 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7dbl\" (UniqueName: \"kubernetes.io/projected/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-kube-api-access-s7dbl\") pod \"dnsmasq-dns-55f844cf75-frrgq\" (UID: \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\") " pod="openstack/dnsmasq-dns-55f844cf75-frrgq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.205171 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-config\") pod \"dnsmasq-dns-55f844cf75-frrgq\" (UID: \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\") " pod="openstack/dnsmasq-dns-55f844cf75-frrgq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.205215 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-ovsdbserver-sb\") pod 
\"dnsmasq-dns-55f844cf75-frrgq\" (UID: \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\") " pod="openstack/dnsmasq-dns-55f844cf75-frrgq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.307217 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-frrgq\" (UID: \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\") " pod="openstack/dnsmasq-dns-55f844cf75-frrgq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.307306 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0825706f-8acf-485c-82dd-d5c672b187e8-combined-ca-bundle\") pod \"neutron-5bb7fd774d-vnxdq\" (UID: \"0825706f-8acf-485c-82dd-d5c672b187e8\") " pod="openstack/neutron-5bb7fd774d-vnxdq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.307329 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98vlc\" (UniqueName: \"kubernetes.io/projected/0825706f-8acf-485c-82dd-d5c672b187e8-kube-api-access-98vlc\") pod \"neutron-5bb7fd774d-vnxdq\" (UID: \"0825706f-8acf-485c-82dd-d5c672b187e8\") " pod="openstack/neutron-5bb7fd774d-vnxdq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.307399 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0825706f-8acf-485c-82dd-d5c672b187e8-httpd-config\") pod \"neutron-5bb7fd774d-vnxdq\" (UID: \"0825706f-8acf-485c-82dd-d5c672b187e8\") " pod="openstack/neutron-5bb7fd774d-vnxdq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.307453 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-dns-svc\") pod 
\"dnsmasq-dns-55f844cf75-frrgq\" (UID: \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\") " pod="openstack/dnsmasq-dns-55f844cf75-frrgq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.307477 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7dbl\" (UniqueName: \"kubernetes.io/projected/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-kube-api-access-s7dbl\") pod \"dnsmasq-dns-55f844cf75-frrgq\" (UID: \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\") " pod="openstack/dnsmasq-dns-55f844cf75-frrgq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.308546 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-dns-svc\") pod \"dnsmasq-dns-55f844cf75-frrgq\" (UID: \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\") " pod="openstack/dnsmasq-dns-55f844cf75-frrgq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.308611 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-config\") pod \"dnsmasq-dns-55f844cf75-frrgq\" (UID: \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\") " pod="openstack/dnsmasq-dns-55f844cf75-frrgq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.308630 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-frrgq\" (UID: \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\") " pod="openstack/dnsmasq-dns-55f844cf75-frrgq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.308645 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0825706f-8acf-485c-82dd-d5c672b187e8-config\") pod \"neutron-5bb7fd774d-vnxdq\" (UID: \"0825706f-8acf-485c-82dd-d5c672b187e8\") " 
pod="openstack/neutron-5bb7fd774d-vnxdq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.308728 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-frrgq\" (UID: \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\") " pod="openstack/dnsmasq-dns-55f844cf75-frrgq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.308833 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0825706f-8acf-485c-82dd-d5c672b187e8-ovndb-tls-certs\") pod \"neutron-5bb7fd774d-vnxdq\" (UID: \"0825706f-8acf-485c-82dd-d5c672b187e8\") " pod="openstack/neutron-5bb7fd774d-vnxdq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.309191 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-frrgq\" (UID: \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\") " pod="openstack/dnsmasq-dns-55f844cf75-frrgq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.309886 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-frrgq\" (UID: \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\") " pod="openstack/dnsmasq-dns-55f844cf75-frrgq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.310318 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-frrgq\" (UID: \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\") " pod="openstack/dnsmasq-dns-55f844cf75-frrgq" 
Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.310439 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-config\") pod \"dnsmasq-dns-55f844cf75-frrgq\" (UID: \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\") " pod="openstack/dnsmasq-dns-55f844cf75-frrgq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.333924 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7dbl\" (UniqueName: \"kubernetes.io/projected/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-kube-api-access-s7dbl\") pod \"dnsmasq-dns-55f844cf75-frrgq\" (UID: \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\") " pod="openstack/dnsmasq-dns-55f844cf75-frrgq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.367516 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-frrgq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.411312 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0825706f-8acf-485c-82dd-d5c672b187e8-combined-ca-bundle\") pod \"neutron-5bb7fd774d-vnxdq\" (UID: \"0825706f-8acf-485c-82dd-d5c672b187e8\") " pod="openstack/neutron-5bb7fd774d-vnxdq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.411355 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98vlc\" (UniqueName: \"kubernetes.io/projected/0825706f-8acf-485c-82dd-d5c672b187e8-kube-api-access-98vlc\") pod \"neutron-5bb7fd774d-vnxdq\" (UID: \"0825706f-8acf-485c-82dd-d5c672b187e8\") " pod="openstack/neutron-5bb7fd774d-vnxdq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.411379 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0825706f-8acf-485c-82dd-d5c672b187e8-httpd-config\") pod 
\"neutron-5bb7fd774d-vnxdq\" (UID: \"0825706f-8acf-485c-82dd-d5c672b187e8\") " pod="openstack/neutron-5bb7fd774d-vnxdq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.411431 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0825706f-8acf-485c-82dd-d5c672b187e8-config\") pod \"neutron-5bb7fd774d-vnxdq\" (UID: \"0825706f-8acf-485c-82dd-d5c672b187e8\") " pod="openstack/neutron-5bb7fd774d-vnxdq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.411461 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0825706f-8acf-485c-82dd-d5c672b187e8-ovndb-tls-certs\") pod \"neutron-5bb7fd774d-vnxdq\" (UID: \"0825706f-8acf-485c-82dd-d5c672b187e8\") " pod="openstack/neutron-5bb7fd774d-vnxdq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.416404 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0825706f-8acf-485c-82dd-d5c672b187e8-config\") pod \"neutron-5bb7fd774d-vnxdq\" (UID: \"0825706f-8acf-485c-82dd-d5c672b187e8\") " pod="openstack/neutron-5bb7fd774d-vnxdq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.417384 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0825706f-8acf-485c-82dd-d5c672b187e8-ovndb-tls-certs\") pod \"neutron-5bb7fd774d-vnxdq\" (UID: \"0825706f-8acf-485c-82dd-d5c672b187e8\") " pod="openstack/neutron-5bb7fd774d-vnxdq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.422479 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0825706f-8acf-485c-82dd-d5c672b187e8-combined-ca-bundle\") pod \"neutron-5bb7fd774d-vnxdq\" (UID: \"0825706f-8acf-485c-82dd-d5c672b187e8\") " pod="openstack/neutron-5bb7fd774d-vnxdq" Mar 19 17:00:26 crc 
kubenswrapper[4918]: I0319 17:00:26.422621 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0825706f-8acf-485c-82dd-d5c672b187e8-httpd-config\") pod \"neutron-5bb7fd774d-vnxdq\" (UID: \"0825706f-8acf-485c-82dd-d5c672b187e8\") " pod="openstack/neutron-5bb7fd774d-vnxdq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.429629 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98vlc\" (UniqueName: \"kubernetes.io/projected/0825706f-8acf-485c-82dd-d5c672b187e8-kube-api-access-98vlc\") pod \"neutron-5bb7fd774d-vnxdq\" (UID: \"0825706f-8acf-485c-82dd-d5c672b187e8\") " pod="openstack/neutron-5bb7fd774d-vnxdq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.505937 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5bb7fd774d-vnxdq" Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.514444 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d659099f-7a05-4f1c-a097-67ecce42275d","Type":"ContainerStarted","Data":"712e807e0e7f36a6a3eaa4df041bd324bcdf655090e173964e51bd72ddcbee0d"} Mar 19 17:00:26 crc kubenswrapper[4918]: I0319 17:00:26.599026 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f01a3cda-80e8-4b64-9496-c1ec001e0e9d" path="/var/lib/kubelet/pods/f01a3cda-80e8-4b64-9496-c1ec001e0e9d/volumes" Mar 19 17:00:28 crc kubenswrapper[4918]: I0319 17:00:28.176136 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-bd699b55c-ncb4d"] Mar 19 17:00:28 crc kubenswrapper[4918]: I0319 17:00:28.177802 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-bd699b55c-ncb4d" Mar 19 17:00:28 crc kubenswrapper[4918]: I0319 17:00:28.181812 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 19 17:00:28 crc kubenswrapper[4918]: I0319 17:00:28.195842 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bd699b55c-ncb4d"] Mar 19 17:00:28 crc kubenswrapper[4918]: I0319 17:00:28.198112 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 19 17:00:28 crc kubenswrapper[4918]: I0319 17:00:28.368016 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-ovndb-tls-certs\") pod \"neutron-bd699b55c-ncb4d\" (UID: \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\") " pod="openstack/neutron-bd699b55c-ncb4d" Mar 19 17:00:28 crc kubenswrapper[4918]: I0319 17:00:28.368082 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-internal-tls-certs\") pod \"neutron-bd699b55c-ncb4d\" (UID: \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\") " pod="openstack/neutron-bd699b55c-ncb4d" Mar 19 17:00:28 crc kubenswrapper[4918]: I0319 17:00:28.368113 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-combined-ca-bundle\") pod \"neutron-bd699b55c-ncb4d\" (UID: \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\") " pod="openstack/neutron-bd699b55c-ncb4d" Mar 19 17:00:28 crc kubenswrapper[4918]: I0319 17:00:28.368448 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-config\") pod \"neutron-bd699b55c-ncb4d\" (UID: \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\") " pod="openstack/neutron-bd699b55c-ncb4d" Mar 19 17:00:28 crc kubenswrapper[4918]: I0319 17:00:28.368670 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-httpd-config\") pod \"neutron-bd699b55c-ncb4d\" (UID: \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\") " pod="openstack/neutron-bd699b55c-ncb4d" Mar 19 17:00:28 crc kubenswrapper[4918]: I0319 17:00:28.368742 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vb8f\" (UniqueName: \"kubernetes.io/projected/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-kube-api-access-4vb8f\") pod \"neutron-bd699b55c-ncb4d\" (UID: \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\") " pod="openstack/neutron-bd699b55c-ncb4d" Mar 19 17:00:28 crc kubenswrapper[4918]: I0319 17:00:28.368963 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-public-tls-certs\") pod \"neutron-bd699b55c-ncb4d\" (UID: \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\") " pod="openstack/neutron-bd699b55c-ncb4d" Mar 19 17:00:28 crc kubenswrapper[4918]: I0319 17:00:28.470390 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-config\") pod \"neutron-bd699b55c-ncb4d\" (UID: \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\") " pod="openstack/neutron-bd699b55c-ncb4d" Mar 19 17:00:28 crc kubenswrapper[4918]: I0319 17:00:28.470458 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-httpd-config\") pod \"neutron-bd699b55c-ncb4d\" (UID: \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\") " pod="openstack/neutron-bd699b55c-ncb4d" Mar 19 17:00:28 crc kubenswrapper[4918]: I0319 17:00:28.470482 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vb8f\" (UniqueName: \"kubernetes.io/projected/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-kube-api-access-4vb8f\") pod \"neutron-bd699b55c-ncb4d\" (UID: \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\") " pod="openstack/neutron-bd699b55c-ncb4d" Mar 19 17:00:28 crc kubenswrapper[4918]: I0319 17:00:28.470579 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-public-tls-certs\") pod \"neutron-bd699b55c-ncb4d\" (UID: \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\") " pod="openstack/neutron-bd699b55c-ncb4d" Mar 19 17:00:28 crc kubenswrapper[4918]: I0319 17:00:28.470632 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-ovndb-tls-certs\") pod \"neutron-bd699b55c-ncb4d\" (UID: \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\") " pod="openstack/neutron-bd699b55c-ncb4d" Mar 19 17:00:28 crc kubenswrapper[4918]: I0319 17:00:28.470661 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-internal-tls-certs\") pod \"neutron-bd699b55c-ncb4d\" (UID: \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\") " pod="openstack/neutron-bd699b55c-ncb4d" Mar 19 17:00:28 crc kubenswrapper[4918]: I0319 17:00:28.470681 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-combined-ca-bundle\") pod 
\"neutron-bd699b55c-ncb4d\" (UID: \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\") " pod="openstack/neutron-bd699b55c-ncb4d" Mar 19 17:00:28 crc kubenswrapper[4918]: I0319 17:00:28.479554 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-httpd-config\") pod \"neutron-bd699b55c-ncb4d\" (UID: \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\") " pod="openstack/neutron-bd699b55c-ncb4d" Mar 19 17:00:28 crc kubenswrapper[4918]: I0319 17:00:28.480748 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-ovndb-tls-certs\") pod \"neutron-bd699b55c-ncb4d\" (UID: \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\") " pod="openstack/neutron-bd699b55c-ncb4d" Mar 19 17:00:28 crc kubenswrapper[4918]: I0319 17:00:28.482137 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-internal-tls-certs\") pod \"neutron-bd699b55c-ncb4d\" (UID: \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\") " pod="openstack/neutron-bd699b55c-ncb4d" Mar 19 17:00:28 crc kubenswrapper[4918]: I0319 17:00:28.495689 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-config\") pod \"neutron-bd699b55c-ncb4d\" (UID: \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\") " pod="openstack/neutron-bd699b55c-ncb4d" Mar 19 17:00:28 crc kubenswrapper[4918]: I0319 17:00:28.511594 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vb8f\" (UniqueName: \"kubernetes.io/projected/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-kube-api-access-4vb8f\") pod \"neutron-bd699b55c-ncb4d\" (UID: \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\") " pod="openstack/neutron-bd699b55c-ncb4d" Mar 19 17:00:28 crc 
kubenswrapper[4918]: I0319 17:00:28.512724 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-combined-ca-bundle\") pod \"neutron-bd699b55c-ncb4d\" (UID: \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\") " pod="openstack/neutron-bd699b55c-ncb4d" Mar 19 17:00:28 crc kubenswrapper[4918]: I0319 17:00:28.514736 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-public-tls-certs\") pod \"neutron-bd699b55c-ncb4d\" (UID: \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\") " pod="openstack/neutron-bd699b55c-ncb4d" Mar 19 17:00:28 crc kubenswrapper[4918]: I0319 17:00:28.546307 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-pbmts" podUID="f01a3cda-80e8-4b64-9496-c1ec001e0e9d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: i/o timeout" Mar 19 17:00:28 crc kubenswrapper[4918]: E0319 17:00:28.638505 4918 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 19 17:00:28 crc kubenswrapper[4918]: E0319 17:00:28.638817 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-btqj4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-b5btd_openstack(33152bb1-e526-420f-8dec-7ef80c68b47c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 17:00:28 crc kubenswrapper[4918]: E0319 17:00:28.640059 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-b5btd" podUID="33152bb1-e526-420f-8dec-7ef80c68b47c" Mar 19 17:00:28 crc kubenswrapper[4918]: I0319 17:00:28.808743 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bd699b55c-ncb4d" Mar 19 17:00:29 crc kubenswrapper[4918]: I0319 17:00:29.384793 4918 scope.go:117] "RemoveContainer" containerID="a33511089ce817e2c72a2b2f0c92182c9e954e778f91a30a6c2dd21b10249ebf" Mar 19 17:00:29 crc kubenswrapper[4918]: E0319 17:00:29.575878 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-b5btd" podUID="33152bb1-e526-420f-8dec-7ef80c68b47c" Mar 19 17:00:29 crc kubenswrapper[4918]: I0319 17:00:29.846448 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 17:00:29 crc kubenswrapper[4918]: I0319 17:00:29.879361 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xbp92"] Mar 19 17:00:32 crc kubenswrapper[4918]: I0319 17:00:32.600803 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"eee99f54-a76f-416d-a14f-cebf9d11548b","Type":"ContainerStarted","Data":"22255a7532df72df20a069895d64f76c18741bf8be18c6ae2872320386792391"} Mar 19 17:00:32 crc kubenswrapper[4918]: I0319 17:00:32.601333 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xbp92" event={"ID":"044c141c-5c54-4e8c-a592-497a22f6f4db","Type":"ContainerStarted","Data":"083be2d423d77403f814126a1da7f9970738f1959b0b98d94e347a989fd5843e"} Mar 19 17:00:33 crc kubenswrapper[4918]: I0319 17:00:33.026822 4918 scope.go:117] "RemoveContainer" containerID="45c68780a1d7a2b68a09da826e86cebdba1ce0b0d9250d5f3a787e520da606eb" Mar 19 17:00:33 crc kubenswrapper[4918]: E0319 17:00:33.653695 4918 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Mar 19 17:00:33 crc kubenswrapper[4918]: E0319 17:00:33.654025 4918 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Mar 19 17:00:33 crc kubenswrapper[4918]: E0319 17:00:33.654152 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lt5j7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:
false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-d9qg5_openstack(a5df5afd-edbf-49fd-b9b8-35aa33fb5d25): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 17:00:33 crc kubenswrapper[4918]: E0319 17:00:33.655766 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cloudkitty-db-sync-d9qg5" podUID="a5df5afd-edbf-49fd-b9b8-35aa33fb5d25" Mar 19 17:00:33 crc kubenswrapper[4918]: I0319 17:00:33.710794 4918 scope.go:117] "RemoveContainer" containerID="d6eaf3940247a852f3fd2522e7139b00f08524536da3ad5d82385069d3b1de45" Mar 19 17:00:34 crc kubenswrapper[4918]: I0319 17:00:34.122289 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-frrgq"] Mar 19 17:00:34 crc kubenswrapper[4918]: I0319 17:00:34.203952 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bd699b55c-ncb4d"] Mar 19 17:00:34 crc kubenswrapper[4918]: W0319 17:00:34.236617 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ab5151a_6a64_47a2_8e0b_47455e4f66b0.slice/crio-86444fbf288072cbc7605672ab34c69a514a468427d17d4440d9af0adb1909ed WatchSource:0}: Error finding container 86444fbf288072cbc7605672ab34c69a514a468427d17d4440d9af0adb1909ed: Status 404 returned error can't find the container with id 86444fbf288072cbc7605672ab34c69a514a468427d17d4440d9af0adb1909ed Mar 19 17:00:34 crc kubenswrapper[4918]: I0319 17:00:34.326735 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5bb7fd774d-vnxdq"] Mar 19 17:00:34 crc kubenswrapper[4918]: 
I0319 17:00:34.676750 4918 generic.go:334] "Generic (PLEG): container finished" podID="09b359fb-3f74-447c-a755-338a558fc429" containerID="9f32ad649e34ae83f5351e036be452e31fbf2e0ad374a247afd59c5c63a436a0" exitCode=0 Mar 19 17:00:34 crc kubenswrapper[4918]: I0319 17:00:34.676836 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565660-72kgl" event={"ID":"09b359fb-3f74-447c-a755-338a558fc429","Type":"ContainerDied","Data":"9f32ad649e34ae83f5351e036be452e31fbf2e0ad374a247afd59c5c63a436a0"} Mar 19 17:00:34 crc kubenswrapper[4918]: I0319 17:00:34.694200 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bb7fd774d-vnxdq" event={"ID":"0825706f-8acf-485c-82dd-d5c672b187e8","Type":"ContainerStarted","Data":"5a1e47eb79aaeb3b5c7b2d0aa6f667ef89dded02f15708716aab3f5c0d6bbb07"} Mar 19 17:00:34 crc kubenswrapper[4918]: I0319 17:00:34.694249 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bb7fd774d-vnxdq" event={"ID":"0825706f-8acf-485c-82dd-d5c672b187e8","Type":"ContainerStarted","Data":"344f07cdb1d9b0655b058619fc27a8847d3dc8e9ba7b6fcbb436d5e7cda0547a"} Mar 19 17:00:34 crc kubenswrapper[4918]: I0319 17:00:34.702773 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eee99f54-a76f-416d-a14f-cebf9d11548b","Type":"ContainerStarted","Data":"0c9edc6b8da1c8f4395f86c0760b11a1d0e50157fc2e91f2b5e58852148640c8"} Mar 19 17:00:34 crc kubenswrapper[4918]: I0319 17:00:34.708180 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ce9dad6-2fa1-48f8-bd79-b114097ef3be","Type":"ContainerStarted","Data":"b9fc8758fc342d9ddd2492536ce75a17bb4e2d46aed7531a7ca328ee9aa7ab44"} Mar 19 17:00:34 crc kubenswrapper[4918]: I0319 17:00:34.718384 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xbp92" 
event={"ID":"044c141c-5c54-4e8c-a592-497a22f6f4db","Type":"ContainerStarted","Data":"a6d303f1b98c98fb5b59b7b20965447a9d940333ef1b4f7fa839403f531657c6"} Mar 19 17:00:34 crc kubenswrapper[4918]: I0319 17:00:34.721495 4918 generic.go:334] "Generic (PLEG): container finished" podID="46b337a2-f3eb-48c0-8e66-bd5ce8bc4927" containerID="04dca1c2f0729c514c69aa2cc8c24d4ed9a77f2d56bbeec392ed324b8420f084" exitCode=0 Mar 19 17:00:34 crc kubenswrapper[4918]: I0319 17:00:34.721738 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-frrgq" event={"ID":"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927","Type":"ContainerDied","Data":"04dca1c2f0729c514c69aa2cc8c24d4ed9a77f2d56bbeec392ed324b8420f084"} Mar 19 17:00:34 crc kubenswrapper[4918]: I0319 17:00:34.721835 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-frrgq" event={"ID":"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927","Type":"ContainerStarted","Data":"e195ebdc40227373c433e88d8a467c0b606ef63cf17581b323f84fad4aebd42b"} Mar 19 17:00:34 crc kubenswrapper[4918]: I0319 17:00:34.732853 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-99gbh" event={"ID":"e24181b4-a2be-4ea7-9602-e3e16b8862c1","Type":"ContainerStarted","Data":"a5609c1803193b430e36071ecf6281a611ec66b983c0fdc16c41cd396d0ef3b0"} Mar 19 17:00:34 crc kubenswrapper[4918]: I0319 17:00:34.742742 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bd699b55c-ncb4d" event={"ID":"9ab5151a-6a64-47a2-8e0b-47455e4f66b0","Type":"ContainerStarted","Data":"e0e61af886f8be875324ab3e9decf415ea43b9983c91df92048cb302351053e0"} Mar 19 17:00:34 crc kubenswrapper[4918]: I0319 17:00:34.742785 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bd699b55c-ncb4d" event={"ID":"9ab5151a-6a64-47a2-8e0b-47455e4f66b0","Type":"ContainerStarted","Data":"86444fbf288072cbc7605672ab34c69a514a468427d17d4440d9af0adb1909ed"} Mar 19 17:00:34 crc 
kubenswrapper[4918]: I0319 17:00:34.760973 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d659099f-7a05-4f1c-a097-67ecce42275d","Type":"ContainerStarted","Data":"148a6aa0e62a3e04666a50eb5f97678441eb7e98de6cd8cd34b41fcf9e2708d4"} Mar 19 17:00:34 crc kubenswrapper[4918]: I0319 17:00:34.761006 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d659099f-7a05-4f1c-a097-67ecce42275d","Type":"ContainerStarted","Data":"5965fde4c7a481302c772ba776ff2a361688359ded9c8328331730d57024f30d"} Mar 19 17:00:34 crc kubenswrapper[4918]: E0319 17:00:34.761764 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-d9qg5" podUID="a5df5afd-edbf-49fd-b9b8-35aa33fb5d25" Mar 19 17:00:34 crc kubenswrapper[4918]: I0319 17:00:34.771270 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-99gbh" podStartSLOduration=12.984581475 podStartE2EDuration="38.771252704s" podCreationTimestamp="2026-03-19 16:59:56 +0000 UTC" firstStartedPulling="2026-03-19 16:59:58.527006066 +0000 UTC m=+1210.649205314" lastFinishedPulling="2026-03-19 17:00:24.313677295 +0000 UTC m=+1236.435876543" observedRunningTime="2026-03-19 17:00:34.767748699 +0000 UTC m=+1246.889947947" watchObservedRunningTime="2026-03-19 17:00:34.771252704 +0000 UTC m=+1246.893451952" Mar 19 17:00:34 crc kubenswrapper[4918]: I0319 17:00:34.771875 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xbp92" podStartSLOduration=26.771868052 podStartE2EDuration="26.771868052s" podCreationTimestamp="2026-03-19 17:00:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:00:34.742929609 +0000 UTC m=+1246.865128857" watchObservedRunningTime="2026-03-19 17:00:34.771868052 +0000 UTC m=+1246.894067300" Mar 19 17:00:34 crc kubenswrapper[4918]: I0319 17:00:34.850233 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=30.850212468 podStartE2EDuration="30.850212468s" podCreationTimestamp="2026-03-19 17:00:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:00:34.819752643 +0000 UTC m=+1246.941951891" watchObservedRunningTime="2026-03-19 17:00:34.850212468 +0000 UTC m=+1246.972411716" Mar 19 17:00:35 crc kubenswrapper[4918]: I0319 17:00:35.096233 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 19 17:00:35 crc kubenswrapper[4918]: I0319 17:00:35.096645 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 19 17:00:35 crc kubenswrapper[4918]: I0319 17:00:35.096664 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 19 17:00:35 crc kubenswrapper[4918]: I0319 17:00:35.096697 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 19 17:00:35 crc kubenswrapper[4918]: I0319 17:00:35.163161 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 19 17:00:35 crc kubenswrapper[4918]: I0319 17:00:35.166664 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 19 17:00:35 crc kubenswrapper[4918]: I0319 17:00:35.773461 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"eee99f54-a76f-416d-a14f-cebf9d11548b","Type":"ContainerStarted","Data":"dc38a6c1fb8ac5da62997f1e892f5e7db87834e40181a874ebadb17fb75aa042"} Mar 19 17:00:35 crc kubenswrapper[4918]: I0319 17:00:35.777747 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bd699b55c-ncb4d" event={"ID":"9ab5151a-6a64-47a2-8e0b-47455e4f66b0","Type":"ContainerStarted","Data":"a3edc838142999c283b25cfa30ca8d6c385132c53b6daac34e06c8a93bb03836"} Mar 19 17:00:35 crc kubenswrapper[4918]: I0319 17:00:35.778134 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-bd699b55c-ncb4d" Mar 19 17:00:35 crc kubenswrapper[4918]: I0319 17:00:35.788115 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-frrgq" event={"ID":"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927","Type":"ContainerStarted","Data":"50805efe2ce135dd9eb04f030d96ca6fb2832afc302ba9ce91defb099798ca5f"} Mar 19 17:00:35 crc kubenswrapper[4918]: I0319 17:00:35.788159 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-frrgq" Mar 19 17:00:35 crc kubenswrapper[4918]: I0319 17:00:35.791952 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bb7fd774d-vnxdq" event={"ID":"0825706f-8acf-485c-82dd-d5c672b187e8","Type":"ContainerStarted","Data":"048285cd1e307a5250686ca0ca9b49aea5077e3a65801edc8ea037c862608c52"} Mar 19 17:00:35 crc kubenswrapper[4918]: I0319 17:00:35.791994 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5bb7fd774d-vnxdq" Mar 19 17:00:35 crc kubenswrapper[4918]: I0319 17:00:35.804606 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=23.80458445 podStartE2EDuration="23.80458445s" podCreationTimestamp="2026-03-19 17:00:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:00:35.797490806 +0000 UTC m=+1247.919690054" watchObservedRunningTime="2026-03-19 17:00:35.80458445 +0000 UTC m=+1247.926783698" Mar 19 17:00:35 crc kubenswrapper[4918]: I0319 17:00:35.835591 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5bb7fd774d-vnxdq" podStartSLOduration=9.835571899 podStartE2EDuration="9.835571899s" podCreationTimestamp="2026-03-19 17:00:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:00:35.816576239 +0000 UTC m=+1247.938775497" watchObservedRunningTime="2026-03-19 17:00:35.835571899 +0000 UTC m=+1247.957771147" Mar 19 17:00:35 crc kubenswrapper[4918]: I0319 17:00:35.849502 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-bd699b55c-ncb4d" podStartSLOduration=7.84947814 podStartE2EDuration="7.84947814s" podCreationTimestamp="2026-03-19 17:00:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:00:35.840865864 +0000 UTC m=+1247.963065112" watchObservedRunningTime="2026-03-19 17:00:35.84947814 +0000 UTC m=+1247.971677388" Mar 19 17:00:35 crc kubenswrapper[4918]: I0319 17:00:35.874939 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-frrgq" podStartSLOduration=9.874921326 podStartE2EDuration="9.874921326s" podCreationTimestamp="2026-03-19 17:00:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:00:35.866321461 +0000 UTC m=+1247.988520719" watchObservedRunningTime="2026-03-19 17:00:35.874921326 +0000 UTC m=+1247.997120574" Mar 19 17:00:36 crc kubenswrapper[4918]: I0319 17:00:36.230478 4918 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565660-72kgl" Mar 19 17:00:36 crc kubenswrapper[4918]: I0319 17:00:36.355831 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqdxn\" (UniqueName: \"kubernetes.io/projected/09b359fb-3f74-447c-a755-338a558fc429-kube-api-access-pqdxn\") pod \"09b359fb-3f74-447c-a755-338a558fc429\" (UID: \"09b359fb-3f74-447c-a755-338a558fc429\") " Mar 19 17:00:36 crc kubenswrapper[4918]: I0319 17:00:36.362770 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09b359fb-3f74-447c-a755-338a558fc429-kube-api-access-pqdxn" (OuterVolumeSpecName: "kube-api-access-pqdxn") pod "09b359fb-3f74-447c-a755-338a558fc429" (UID: "09b359fb-3f74-447c-a755-338a558fc429"). InnerVolumeSpecName "kube-api-access-pqdxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:00:36 crc kubenswrapper[4918]: I0319 17:00:36.459299 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqdxn\" (UniqueName: \"kubernetes.io/projected/09b359fb-3f74-447c-a755-338a558fc429-kube-api-access-pqdxn\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:36 crc kubenswrapper[4918]: I0319 17:00:36.801662 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565660-72kgl" Mar 19 17:00:36 crc kubenswrapper[4918]: I0319 17:00:36.801681 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565660-72kgl" event={"ID":"09b359fb-3f74-447c-a755-338a558fc429","Type":"ContainerDied","Data":"32acbf44079890c23f642a663e3eebc3f20592819da4e482d8efb08cd51261ab"} Mar 19 17:00:36 crc kubenswrapper[4918]: I0319 17:00:36.801913 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32acbf44079890c23f642a663e3eebc3f20592819da4e482d8efb08cd51261ab" Mar 19 17:00:37 crc kubenswrapper[4918]: I0319 17:00:37.325289 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565654-k4nkc"] Mar 19 17:00:37 crc kubenswrapper[4918]: I0319 17:00:37.333637 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565654-k4nkc"] Mar 19 17:00:37 crc kubenswrapper[4918]: I0319 17:00:37.820270 4918 generic.go:334] "Generic (PLEG): container finished" podID="e24181b4-a2be-4ea7-9602-e3e16b8862c1" containerID="a5609c1803193b430e36071ecf6281a611ec66b983c0fdc16c41cd396d0ef3b0" exitCode=0 Mar 19 17:00:37 crc kubenswrapper[4918]: I0319 17:00:37.820363 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-99gbh" event={"ID":"e24181b4-a2be-4ea7-9602-e3e16b8862c1","Type":"ContainerDied","Data":"a5609c1803193b430e36071ecf6281a611ec66b983c0fdc16c41cd396d0ef3b0"} Mar 19 17:00:37 crc kubenswrapper[4918]: I0319 17:00:37.824461 4918 generic.go:334] "Generic (PLEG): container finished" podID="044c141c-5c54-4e8c-a592-497a22f6f4db" containerID="a6d303f1b98c98fb5b59b7b20965447a9d940333ef1b4f7fa839403f531657c6" exitCode=0 Mar 19 17:00:37 crc kubenswrapper[4918]: I0319 17:00:37.824555 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xbp92" 
event={"ID":"044c141c-5c54-4e8c-a592-497a22f6f4db","Type":"ContainerDied","Data":"a6d303f1b98c98fb5b59b7b20965447a9d940333ef1b4f7fa839403f531657c6"} Mar 19 17:00:38 crc kubenswrapper[4918]: I0319 17:00:38.615971 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d407a1d-5f74-4628-bf4e-47e9fad34bb5" path="/var/lib/kubelet/pods/7d407a1d-5f74-4628-bf4e-47e9fad34bb5/volumes" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.413266 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xbp92" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.433551 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-99gbh" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.527560 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e24181b4-a2be-4ea7-9602-e3e16b8862c1-config-data\") pod \"e24181b4-a2be-4ea7-9602-e3e16b8862c1\" (UID: \"e24181b4-a2be-4ea7-9602-e3e16b8862c1\") " Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.527614 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsx7l\" (UniqueName: \"kubernetes.io/projected/e24181b4-a2be-4ea7-9602-e3e16b8862c1-kube-api-access-hsx7l\") pod \"e24181b4-a2be-4ea7-9602-e3e16b8862c1\" (UID: \"e24181b4-a2be-4ea7-9602-e3e16b8862c1\") " Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.527688 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e24181b4-a2be-4ea7-9602-e3e16b8862c1-scripts\") pod \"e24181b4-a2be-4ea7-9602-e3e16b8862c1\" (UID: \"e24181b4-a2be-4ea7-9602-e3e16b8862c1\") " Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.527716 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e24181b4-a2be-4ea7-9602-e3e16b8862c1-logs\") pod \"e24181b4-a2be-4ea7-9602-e3e16b8862c1\" (UID: \"e24181b4-a2be-4ea7-9602-e3e16b8862c1\") " Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.527744 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-scripts\") pod \"044c141c-5c54-4e8c-a592-497a22f6f4db\" (UID: \"044c141c-5c54-4e8c-a592-497a22f6f4db\") " Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.527780 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4ssp\" (UniqueName: \"kubernetes.io/projected/044c141c-5c54-4e8c-a592-497a22f6f4db-kube-api-access-z4ssp\") pod \"044c141c-5c54-4e8c-a592-497a22f6f4db\" (UID: \"044c141c-5c54-4e8c-a592-497a22f6f4db\") " Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.527852 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-fernet-keys\") pod \"044c141c-5c54-4e8c-a592-497a22f6f4db\" (UID: \"044c141c-5c54-4e8c-a592-497a22f6f4db\") " Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.527880 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-credential-keys\") pod \"044c141c-5c54-4e8c-a592-497a22f6f4db\" (UID: \"044c141c-5c54-4e8c-a592-497a22f6f4db\") " Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.527906 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e24181b4-a2be-4ea7-9602-e3e16b8862c1-combined-ca-bundle\") pod \"e24181b4-a2be-4ea7-9602-e3e16b8862c1\" (UID: \"e24181b4-a2be-4ea7-9602-e3e16b8862c1\") " Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.527971 4918 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-config-data\") pod \"044c141c-5c54-4e8c-a592-497a22f6f4db\" (UID: \"044c141c-5c54-4e8c-a592-497a22f6f4db\") " Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.527998 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-combined-ca-bundle\") pod \"044c141c-5c54-4e8c-a592-497a22f6f4db\" (UID: \"044c141c-5c54-4e8c-a592-497a22f6f4db\") " Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.544449 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "044c141c-5c54-4e8c-a592-497a22f6f4db" (UID: "044c141c-5c54-4e8c-a592-497a22f6f4db"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.544901 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e24181b4-a2be-4ea7-9602-e3e16b8862c1-logs" (OuterVolumeSpecName: "logs") pod "e24181b4-a2be-4ea7-9602-e3e16b8862c1" (UID: "e24181b4-a2be-4ea7-9602-e3e16b8862c1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.546612 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e24181b4-a2be-4ea7-9602-e3e16b8862c1-scripts" (OuterVolumeSpecName: "scripts") pod "e24181b4-a2be-4ea7-9602-e3e16b8862c1" (UID: "e24181b4-a2be-4ea7-9602-e3e16b8862c1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.562692 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "044c141c-5c54-4e8c-a592-497a22f6f4db" (UID: "044c141c-5c54-4e8c-a592-497a22f6f4db"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.572714 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e24181b4-a2be-4ea7-9602-e3e16b8862c1-kube-api-access-hsx7l" (OuterVolumeSpecName: "kube-api-access-hsx7l") pod "e24181b4-a2be-4ea7-9602-e3e16b8862c1" (UID: "e24181b4-a2be-4ea7-9602-e3e16b8862c1"). InnerVolumeSpecName "kube-api-access-hsx7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.577597 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/044c141c-5c54-4e8c-a592-497a22f6f4db-kube-api-access-z4ssp" (OuterVolumeSpecName: "kube-api-access-z4ssp") pod "044c141c-5c54-4e8c-a592-497a22f6f4db" (UID: "044c141c-5c54-4e8c-a592-497a22f6f4db"). InnerVolumeSpecName "kube-api-access-z4ssp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.597665 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-scripts" (OuterVolumeSpecName: "scripts") pod "044c141c-5c54-4e8c-a592-497a22f6f4db" (UID: "044c141c-5c54-4e8c-a592-497a22f6f4db"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.623787 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-config-data" (OuterVolumeSpecName: "config-data") pod "044c141c-5c54-4e8c-a592-497a22f6f4db" (UID: "044c141c-5c54-4e8c-a592-497a22f6f4db"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.646711 4918 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.646746 4918 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.646759 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.646768 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsx7l\" (UniqueName: \"kubernetes.io/projected/e24181b4-a2be-4ea7-9602-e3e16b8862c1-kube-api-access-hsx7l\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.646779 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e24181b4-a2be-4ea7-9602-e3e16b8862c1-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.646788 4918 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e24181b4-a2be-4ea7-9602-e3e16b8862c1-logs\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.646796 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.646803 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4ssp\" (UniqueName: \"kubernetes.io/projected/044c141c-5c54-4e8c-a592-497a22f6f4db-kube-api-access-z4ssp\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.666671 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e24181b4-a2be-4ea7-9602-e3e16b8862c1-config-data" (OuterVolumeSpecName: "config-data") pod "e24181b4-a2be-4ea7-9602-e3e16b8862c1" (UID: "e24181b4-a2be-4ea7-9602-e3e16b8862c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.683097 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "044c141c-5c54-4e8c-a592-497a22f6f4db" (UID: "044c141c-5c54-4e8c-a592-497a22f6f4db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.696555 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e24181b4-a2be-4ea7-9602-e3e16b8862c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e24181b4-a2be-4ea7-9602-e3e16b8862c1" (UID: "e24181b4-a2be-4ea7-9602-e3e16b8862c1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.750132 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e24181b4-a2be-4ea7-9602-e3e16b8862c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.750167 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/044c141c-5c54-4e8c-a592-497a22f6f4db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.750179 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e24181b4-a2be-4ea7-9602-e3e16b8862c1-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.846302 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xbp92" event={"ID":"044c141c-5c54-4e8c-a592-497a22f6f4db","Type":"ContainerDied","Data":"083be2d423d77403f814126a1da7f9970738f1959b0b98d94e347a989fd5843e"} Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.846423 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="083be2d423d77403f814126a1da7f9970738f1959b0b98d94e347a989fd5843e" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.846327 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xbp92" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.848290 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-99gbh" event={"ID":"e24181b4-a2be-4ea7-9602-e3e16b8862c1","Type":"ContainerDied","Data":"12fc32b0de6c67270b30670ebe7586b686dc46d45bd0b5a4822050fa566f911e"} Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.848348 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12fc32b0de6c67270b30670ebe7586b686dc46d45bd0b5a4822050fa566f911e" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.848485 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-99gbh" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.987788 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-76c9778f96-f8hwv"] Mar 19 17:00:39 crc kubenswrapper[4918]: E0319 17:00:39.988994 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e24181b4-a2be-4ea7-9602-e3e16b8862c1" containerName="placement-db-sync" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.989104 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="e24181b4-a2be-4ea7-9602-e3e16b8862c1" containerName="placement-db-sync" Mar 19 17:00:39 crc kubenswrapper[4918]: E0319 17:00:39.989203 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="044c141c-5c54-4e8c-a592-497a22f6f4db" containerName="keystone-bootstrap" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.989274 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="044c141c-5c54-4e8c-a592-497a22f6f4db" containerName="keystone-bootstrap" Mar 19 17:00:39 crc kubenswrapper[4918]: E0319 17:00:39.989374 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09b359fb-3f74-447c-a755-338a558fc429" containerName="oc" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.989483 4918 
state_mem.go:107] "Deleted CPUSet assignment" podUID="09b359fb-3f74-447c-a755-338a558fc429" containerName="oc" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.989844 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="044c141c-5c54-4e8c-a592-497a22f6f4db" containerName="keystone-bootstrap" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.989950 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="09b359fb-3f74-447c-a755-338a558fc429" containerName="oc" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.990036 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="e24181b4-a2be-4ea7-9602-e3e16b8862c1" containerName="placement-db-sync" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.991381 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.993404 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.993575 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-rw857" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.994114 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.994503 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 19 17:00:39 crc kubenswrapper[4918]: I0319 17:00:39.995295 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.002577 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-76c9778f96-f8hwv"] Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.056536 4918 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/keystone-568c4fd78c-t5k2q"] Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.058387 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-568c4fd78c-t5k2q" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.064011 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.064069 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.064361 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-d72v8" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.064512 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.064722 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.064761 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.090339 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-568c4fd78c-t5k2q"] Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.159016 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-internal-tls-certs\") pod \"placement-76c9778f96-f8hwv\" (UID: \"82332b68-a377-45a5-bf3c-d97caa5733ff\") " pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.159069 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-config-data\") pod \"placement-76c9778f96-f8hwv\" (UID: \"82332b68-a377-45a5-bf3c-d97caa5733ff\") " pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.159261 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156aeae6-d08f-48d3-a43a-63edfaad7860-combined-ca-bundle\") pod \"keystone-568c4fd78c-t5k2q\" (UID: \"156aeae6-d08f-48d3-a43a-63edfaad7860\") " pod="openstack/keystone-568c4fd78c-t5k2q" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.159379 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg7sr\" (UniqueName: \"kubernetes.io/projected/82332b68-a377-45a5-bf3c-d97caa5733ff-kube-api-access-bg7sr\") pod \"placement-76c9778f96-f8hwv\" (UID: \"82332b68-a377-45a5-bf3c-d97caa5733ff\") " pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.159499 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-public-tls-certs\") pod \"placement-76c9778f96-f8hwv\" (UID: \"82332b68-a377-45a5-bf3c-d97caa5733ff\") " pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.159578 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/156aeae6-d08f-48d3-a43a-63edfaad7860-internal-tls-certs\") pod \"keystone-568c4fd78c-t5k2q\" (UID: \"156aeae6-d08f-48d3-a43a-63edfaad7860\") " pod="openstack/keystone-568c4fd78c-t5k2q" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.159605 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-combined-ca-bundle\") pod \"placement-76c9778f96-f8hwv\" (UID: \"82332b68-a377-45a5-bf3c-d97caa5733ff\") " pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.159681 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/156aeae6-d08f-48d3-a43a-63edfaad7860-credential-keys\") pod \"keystone-568c4fd78c-t5k2q\" (UID: \"156aeae6-d08f-48d3-a43a-63edfaad7860\") " pod="openstack/keystone-568c4fd78c-t5k2q" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.159747 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xkdd\" (UniqueName: \"kubernetes.io/projected/156aeae6-d08f-48d3-a43a-63edfaad7860-kube-api-access-7xkdd\") pod \"keystone-568c4fd78c-t5k2q\" (UID: \"156aeae6-d08f-48d3-a43a-63edfaad7860\") " pod="openstack/keystone-568c4fd78c-t5k2q" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.159771 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/156aeae6-d08f-48d3-a43a-63edfaad7860-public-tls-certs\") pod \"keystone-568c4fd78c-t5k2q\" (UID: \"156aeae6-d08f-48d3-a43a-63edfaad7860\") " pod="openstack/keystone-568c4fd78c-t5k2q" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.159889 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82332b68-a377-45a5-bf3c-d97caa5733ff-logs\") pod \"placement-76c9778f96-f8hwv\" (UID: \"82332b68-a377-45a5-bf3c-d97caa5733ff\") " pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.159922 4918 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/156aeae6-d08f-48d3-a43a-63edfaad7860-fernet-keys\") pod \"keystone-568c4fd78c-t5k2q\" (UID: \"156aeae6-d08f-48d3-a43a-63edfaad7860\") " pod="openstack/keystone-568c4fd78c-t5k2q" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.159951 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/156aeae6-d08f-48d3-a43a-63edfaad7860-config-data\") pod \"keystone-568c4fd78c-t5k2q\" (UID: \"156aeae6-d08f-48d3-a43a-63edfaad7860\") " pod="openstack/keystone-568c4fd78c-t5k2q" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.159999 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-scripts\") pod \"placement-76c9778f96-f8hwv\" (UID: \"82332b68-a377-45a5-bf3c-d97caa5733ff\") " pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.160023 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/156aeae6-d08f-48d3-a43a-63edfaad7860-scripts\") pod \"keystone-568c4fd78c-t5k2q\" (UID: \"156aeae6-d08f-48d3-a43a-63edfaad7860\") " pod="openstack/keystone-568c4fd78c-t5k2q" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.261704 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-internal-tls-certs\") pod \"placement-76c9778f96-f8hwv\" (UID: \"82332b68-a377-45a5-bf3c-d97caa5733ff\") " pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.261758 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-config-data\") pod \"placement-76c9778f96-f8hwv\" (UID: \"82332b68-a377-45a5-bf3c-d97caa5733ff\") " pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.261802 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156aeae6-d08f-48d3-a43a-63edfaad7860-combined-ca-bundle\") pod \"keystone-568c4fd78c-t5k2q\" (UID: \"156aeae6-d08f-48d3-a43a-63edfaad7860\") " pod="openstack/keystone-568c4fd78c-t5k2q" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.261833 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg7sr\" (UniqueName: \"kubernetes.io/projected/82332b68-a377-45a5-bf3c-d97caa5733ff-kube-api-access-bg7sr\") pod \"placement-76c9778f96-f8hwv\" (UID: \"82332b68-a377-45a5-bf3c-d97caa5733ff\") " pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.261870 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-public-tls-certs\") pod \"placement-76c9778f96-f8hwv\" (UID: \"82332b68-a377-45a5-bf3c-d97caa5733ff\") " pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.261895 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/156aeae6-d08f-48d3-a43a-63edfaad7860-internal-tls-certs\") pod \"keystone-568c4fd78c-t5k2q\" (UID: \"156aeae6-d08f-48d3-a43a-63edfaad7860\") " pod="openstack/keystone-568c4fd78c-t5k2q" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.261913 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-combined-ca-bundle\") pod \"placement-76c9778f96-f8hwv\" (UID: \"82332b68-a377-45a5-bf3c-d97caa5733ff\") " pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.261949 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/156aeae6-d08f-48d3-a43a-63edfaad7860-credential-keys\") pod \"keystone-568c4fd78c-t5k2q\" (UID: \"156aeae6-d08f-48d3-a43a-63edfaad7860\") " pod="openstack/keystone-568c4fd78c-t5k2q" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.261971 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xkdd\" (UniqueName: \"kubernetes.io/projected/156aeae6-d08f-48d3-a43a-63edfaad7860-kube-api-access-7xkdd\") pod \"keystone-568c4fd78c-t5k2q\" (UID: \"156aeae6-d08f-48d3-a43a-63edfaad7860\") " pod="openstack/keystone-568c4fd78c-t5k2q" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.261989 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/156aeae6-d08f-48d3-a43a-63edfaad7860-public-tls-certs\") pod \"keystone-568c4fd78c-t5k2q\" (UID: \"156aeae6-d08f-48d3-a43a-63edfaad7860\") " pod="openstack/keystone-568c4fd78c-t5k2q" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.262023 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82332b68-a377-45a5-bf3c-d97caa5733ff-logs\") pod \"placement-76c9778f96-f8hwv\" (UID: \"82332b68-a377-45a5-bf3c-d97caa5733ff\") " pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.262040 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/156aeae6-d08f-48d3-a43a-63edfaad7860-fernet-keys\") pod 
\"keystone-568c4fd78c-t5k2q\" (UID: \"156aeae6-d08f-48d3-a43a-63edfaad7860\") " pod="openstack/keystone-568c4fd78c-t5k2q" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.262061 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/156aeae6-d08f-48d3-a43a-63edfaad7860-config-data\") pod \"keystone-568c4fd78c-t5k2q\" (UID: \"156aeae6-d08f-48d3-a43a-63edfaad7860\") " pod="openstack/keystone-568c4fd78c-t5k2q" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.262086 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-scripts\") pod \"placement-76c9778f96-f8hwv\" (UID: \"82332b68-a377-45a5-bf3c-d97caa5733ff\") " pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.262103 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/156aeae6-d08f-48d3-a43a-63edfaad7860-scripts\") pod \"keystone-568c4fd78c-t5k2q\" (UID: \"156aeae6-d08f-48d3-a43a-63edfaad7860\") " pod="openstack/keystone-568c4fd78c-t5k2q" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.263138 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82332b68-a377-45a5-bf3c-d97caa5733ff-logs\") pod \"placement-76c9778f96-f8hwv\" (UID: \"82332b68-a377-45a5-bf3c-d97caa5733ff\") " pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.270366 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/156aeae6-d08f-48d3-a43a-63edfaad7860-fernet-keys\") pod \"keystone-568c4fd78c-t5k2q\" (UID: \"156aeae6-d08f-48d3-a43a-63edfaad7860\") " pod="openstack/keystone-568c4fd78c-t5k2q" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 
17:00:40.270813 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-internal-tls-certs\") pod \"placement-76c9778f96-f8hwv\" (UID: \"82332b68-a377-45a5-bf3c-d97caa5733ff\") " pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.270865 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/156aeae6-d08f-48d3-a43a-63edfaad7860-internal-tls-certs\") pod \"keystone-568c4fd78c-t5k2q\" (UID: \"156aeae6-d08f-48d3-a43a-63edfaad7860\") " pod="openstack/keystone-568c4fd78c-t5k2q" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.271400 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-combined-ca-bundle\") pod \"placement-76c9778f96-f8hwv\" (UID: \"82332b68-a377-45a5-bf3c-d97caa5733ff\") " pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.274961 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/156aeae6-d08f-48d3-a43a-63edfaad7860-public-tls-certs\") pod \"keystone-568c4fd78c-t5k2q\" (UID: \"156aeae6-d08f-48d3-a43a-63edfaad7860\") " pod="openstack/keystone-568c4fd78c-t5k2q" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.275060 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-scripts\") pod \"placement-76c9778f96-f8hwv\" (UID: \"82332b68-a377-45a5-bf3c-d97caa5733ff\") " pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.275129 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/156aeae6-d08f-48d3-a43a-63edfaad7860-credential-keys\") pod \"keystone-568c4fd78c-t5k2q\" (UID: \"156aeae6-d08f-48d3-a43a-63edfaad7860\") " pod="openstack/keystone-568c4fd78c-t5k2q" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.275337 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156aeae6-d08f-48d3-a43a-63edfaad7860-combined-ca-bundle\") pod \"keystone-568c4fd78c-t5k2q\" (UID: \"156aeae6-d08f-48d3-a43a-63edfaad7860\") " pod="openstack/keystone-568c4fd78c-t5k2q" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.275470 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/156aeae6-d08f-48d3-a43a-63edfaad7860-scripts\") pod \"keystone-568c4fd78c-t5k2q\" (UID: \"156aeae6-d08f-48d3-a43a-63edfaad7860\") " pod="openstack/keystone-568c4fd78c-t5k2q" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.277368 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-config-data\") pod \"placement-76c9778f96-f8hwv\" (UID: \"82332b68-a377-45a5-bf3c-d97caa5733ff\") " pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.281801 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-public-tls-certs\") pod \"placement-76c9778f96-f8hwv\" (UID: \"82332b68-a377-45a5-bf3c-d97caa5733ff\") " pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.284519 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xkdd\" (UniqueName: \"kubernetes.io/projected/156aeae6-d08f-48d3-a43a-63edfaad7860-kube-api-access-7xkdd\") pod \"keystone-568c4fd78c-t5k2q\" 
(UID: \"156aeae6-d08f-48d3-a43a-63edfaad7860\") " pod="openstack/keystone-568c4fd78c-t5k2q" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.285643 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg7sr\" (UniqueName: \"kubernetes.io/projected/82332b68-a377-45a5-bf3c-d97caa5733ff-kube-api-access-bg7sr\") pod \"placement-76c9778f96-f8hwv\" (UID: \"82332b68-a377-45a5-bf3c-d97caa5733ff\") " pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.287468 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/156aeae6-d08f-48d3-a43a-63edfaad7860-config-data\") pod \"keystone-568c4fd78c-t5k2q\" (UID: \"156aeae6-d08f-48d3-a43a-63edfaad7860\") " pod="openstack/keystone-568c4fd78c-t5k2q" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.308598 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:00:40 crc kubenswrapper[4918]: I0319 17:00:40.386185 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-568c4fd78c-t5k2q" Mar 19 17:00:41 crc kubenswrapper[4918]: I0319 17:00:41.368684 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-frrgq" Mar 19 17:00:41 crc kubenswrapper[4918]: I0319 17:00:41.433954 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-lnhw2"] Mar 19 17:00:41 crc kubenswrapper[4918]: I0319 17:00:41.434227 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" podUID="1fdb44cf-8ddb-4561-8749-702ccf333279" containerName="dnsmasq-dns" containerID="cri-o://04294a5366c846d1d2758c9922edf070292f663c14dd49015287db01f707b159" gracePeriod=10 Mar 19 17:00:41 crc kubenswrapper[4918]: I0319 17:00:41.891097 4918 generic.go:334] "Generic (PLEG): container finished" podID="1fdb44cf-8ddb-4561-8749-702ccf333279" containerID="04294a5366c846d1d2758c9922edf070292f663c14dd49015287db01f707b159" exitCode=0 Mar 19 17:00:41 crc kubenswrapper[4918]: I0319 17:00:41.891157 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" event={"ID":"1fdb44cf-8ddb-4561-8749-702ccf333279","Type":"ContainerDied","Data":"04294a5366c846d1d2758c9922edf070292f663c14dd49015287db01f707b159"} Mar 19 17:00:43 crc kubenswrapper[4918]: I0319 17:00:43.052247 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 19 17:00:43 crc kubenswrapper[4918]: I0319 17:00:43.052783 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 19 17:00:43 crc kubenswrapper[4918]: I0319 17:00:43.052800 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 19 17:00:43 crc kubenswrapper[4918]: I0319 17:00:43.052813 4918 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 19 17:00:43 crc kubenswrapper[4918]: I0319 17:00:43.102806 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 19 17:00:43 crc kubenswrapper[4918]: I0319 17:00:43.107948 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 19 17:00:43 crc kubenswrapper[4918]: I0319 17:00:43.943407 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" event={"ID":"1fdb44cf-8ddb-4561-8749-702ccf333279","Type":"ContainerDied","Data":"f79fe307868901c8a6d5e0e93ebc8b1d12764f5b89376df96929a983ae1e89ec"} Mar 19 17:00:43 crc kubenswrapper[4918]: I0319 17:00:43.943752 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f79fe307868901c8a6d5e0e93ebc8b1d12764f5b89376df96929a983ae1e89ec" Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.041069 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.160770 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-ovsdbserver-sb\") pod \"1fdb44cf-8ddb-4561-8749-702ccf333279\" (UID: \"1fdb44cf-8ddb-4561-8749-702ccf333279\") " Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.160818 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-dns-swift-storage-0\") pod \"1fdb44cf-8ddb-4561-8749-702ccf333279\" (UID: \"1fdb44cf-8ddb-4561-8749-702ccf333279\") " Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.160869 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz6ln\" (UniqueName: \"kubernetes.io/projected/1fdb44cf-8ddb-4561-8749-702ccf333279-kube-api-access-mz6ln\") pod \"1fdb44cf-8ddb-4561-8749-702ccf333279\" (UID: \"1fdb44cf-8ddb-4561-8749-702ccf333279\") " Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.160923 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-config\") pod \"1fdb44cf-8ddb-4561-8749-702ccf333279\" (UID: \"1fdb44cf-8ddb-4561-8749-702ccf333279\") " Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.161002 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-dns-svc\") pod \"1fdb44cf-8ddb-4561-8749-702ccf333279\" (UID: \"1fdb44cf-8ddb-4561-8749-702ccf333279\") " Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.161045 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-ovsdbserver-nb\") pod \"1fdb44cf-8ddb-4561-8749-702ccf333279\" (UID: \"1fdb44cf-8ddb-4561-8749-702ccf333279\") " Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.167296 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fdb44cf-8ddb-4561-8749-702ccf333279-kube-api-access-mz6ln" (OuterVolumeSpecName: "kube-api-access-mz6ln") pod "1fdb44cf-8ddb-4561-8749-702ccf333279" (UID: "1fdb44cf-8ddb-4561-8749-702ccf333279"). InnerVolumeSpecName "kube-api-access-mz6ln". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.234281 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-config" (OuterVolumeSpecName: "config") pod "1fdb44cf-8ddb-4561-8749-702ccf333279" (UID: "1fdb44cf-8ddb-4561-8749-702ccf333279"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.239988 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1fdb44cf-8ddb-4561-8749-702ccf333279" (UID: "1fdb44cf-8ddb-4561-8749-702ccf333279"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.251158 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1fdb44cf-8ddb-4561-8749-702ccf333279" (UID: "1fdb44cf-8ddb-4561-8749-702ccf333279"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.251824 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1fdb44cf-8ddb-4561-8749-702ccf333279" (UID: "1fdb44cf-8ddb-4561-8749-702ccf333279"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.260381 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1fdb44cf-8ddb-4561-8749-702ccf333279" (UID: "1fdb44cf-8ddb-4561-8749-702ccf333279"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.263719 4918 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.263752 4918 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.263768 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mz6ln\" (UniqueName: \"kubernetes.io/projected/1fdb44cf-8ddb-4561-8749-702ccf333279-kube-api-access-mz6ln\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.263857 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-config\") on node \"crc\" DevicePath \"\"" 
Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.263868 4918 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.263880 4918 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fdb44cf-8ddb-4561-8749-702ccf333279-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.328499 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-76c9778f96-f8hwv"] Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.347263 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-568c4fd78c-t5k2q"] Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.958665 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-568c4fd78c-t5k2q" event={"ID":"156aeae6-d08f-48d3-a43a-63edfaad7860","Type":"ContainerStarted","Data":"b9cc8fa28695187111b03439b3d6ba37d014329d05d59c5aabc7ad0881012bd7"} Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.959094 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-568c4fd78c-t5k2q" Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.959104 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-568c4fd78c-t5k2q" event={"ID":"156aeae6-d08f-48d3-a43a-63edfaad7860","Type":"ContainerStarted","Data":"7f295faf9af504eb418361128cc2f415fd8f5d9982cca7f4fc9a3267076ef437"} Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.963869 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wtk47" event={"ID":"ec2f9e01-6e64-4c5d-93d4-8428ae776a4e","Type":"ContainerStarted","Data":"3adb850ada405a057d34bf29d1f2e7cadee6568d3b1228664b17581b178fdf5a"} Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 
17:00:44.966146 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ce9dad6-2fa1-48f8-bd79-b114097ef3be","Type":"ContainerStarted","Data":"997a5f2cff248f4eab80aca914467bbd3c1604c4bce69e8ec57cffbc9a266a79"} Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.968296 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76c9778f96-f8hwv" event={"ID":"82332b68-a377-45a5-bf3c-d97caa5733ff","Type":"ContainerStarted","Data":"6ee4fd45ec8ddd5c01c993937a6eafe96329d67e1e4ee246a13c2def520983be"} Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.968321 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76c9778f96-f8hwv" event={"ID":"82332b68-a377-45a5-bf3c-d97caa5733ff","Type":"ContainerStarted","Data":"56cba120420d1dd05798e0a6c49de015a2196a387a7ea4e9e8f01e55c620f1be"} Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.968330 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76c9778f96-f8hwv" event={"ID":"82332b68-a377-45a5-bf3c-d97caa5733ff","Type":"ContainerStarted","Data":"87d3b97f565b8e1cacb1df2caf07570d12d00ecd367d658e7891fbb510b1eae3"} Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.968380 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" Mar 19 17:00:44 crc kubenswrapper[4918]: I0319 17:00:44.993611 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-568c4fd78c-t5k2q" podStartSLOduration=4.99358883 podStartE2EDuration="4.99358883s" podCreationTimestamp="2026-03-19 17:00:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:00:44.985630661 +0000 UTC m=+1257.107829909" watchObservedRunningTime="2026-03-19 17:00:44.99358883 +0000 UTC m=+1257.115788078" Mar 19 17:00:45 crc kubenswrapper[4918]: I0319 17:00:45.006200 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-lnhw2"] Mar 19 17:00:45 crc kubenswrapper[4918]: I0319 17:00:45.015105 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-lnhw2"] Mar 19 17:00:45 crc kubenswrapper[4918]: I0319 17:00:45.985793 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:00:45 crc kubenswrapper[4918]: I0319 17:00:45.986891 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:00:46 crc kubenswrapper[4918]: I0319 17:00:46.007723 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-wtk47" podStartSLOduration=5.035001084 podStartE2EDuration="50.007692928s" podCreationTimestamp="2026-03-19 16:59:56 +0000 UTC" firstStartedPulling="2026-03-19 16:59:58.81561659 +0000 UTC m=+1210.937815838" lastFinishedPulling="2026-03-19 17:00:43.788308434 +0000 UTC m=+1255.910507682" observedRunningTime="2026-03-19 17:00:45.016297241 +0000 UTC m=+1257.138496489" watchObservedRunningTime="2026-03-19 17:00:46.007692928 +0000 UTC m=+1258.129892186" Mar 19 17:00:46 crc kubenswrapper[4918]: I0319 17:00:46.012961 4918 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-76c9778f96-f8hwv" podStartSLOduration=7.012928342 podStartE2EDuration="7.012928342s" podCreationTimestamp="2026-03-19 17:00:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:00:46.004424039 +0000 UTC m=+1258.126623307" watchObservedRunningTime="2026-03-19 17:00:46.012928342 +0000 UTC m=+1258.135127610" Mar 19 17:00:46 crc kubenswrapper[4918]: I0319 17:00:46.601872 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fdb44cf-8ddb-4561-8749-702ccf333279" path="/var/lib/kubelet/pods/1fdb44cf-8ddb-4561-8749-702ccf333279/volumes" Mar 19 17:00:47 crc kubenswrapper[4918]: I0319 17:00:47.028073 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-b5btd" event={"ID":"33152bb1-e526-420f-8dec-7ef80c68b47c","Type":"ContainerStarted","Data":"045157a57de8912063aed8fd4ca2742d4129523126e13f84e24127942f93f280"} Mar 19 17:00:47 crc kubenswrapper[4918]: I0319 17:00:47.050747 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-b5btd" podStartSLOduration=4.967684542 podStartE2EDuration="51.05073032s" podCreationTimestamp="2026-03-19 16:59:56 +0000 UTC" firstStartedPulling="2026-03-19 16:59:58.299762071 +0000 UTC m=+1210.421961319" lastFinishedPulling="2026-03-19 17:00:44.382807849 +0000 UTC m=+1256.505007097" observedRunningTime="2026-03-19 17:00:47.046193376 +0000 UTC m=+1259.168392624" watchObservedRunningTime="2026-03-19 17:00:47.05073032 +0000 UTC m=+1259.172929568" Mar 19 17:00:47 crc kubenswrapper[4918]: I0319 17:00:47.166962 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-lnhw2" podUID="1fdb44cf-8ddb-4561-8749-702ccf333279" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.165:5353: i/o timeout" Mar 19 17:00:48 
crc kubenswrapper[4918]: I0319 17:00:48.039638 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-d9qg5" event={"ID":"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25","Type":"ContainerStarted","Data":"9c8e8272a1d763e99e059b1a8b10a04c8e73a5d42a2d6c1c37fd09bbc454dfb3"} Mar 19 17:00:48 crc kubenswrapper[4918]: I0319 17:00:48.066753 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-d9qg5" podStartSLOduration=3.531005157 podStartE2EDuration="52.066728011s" podCreationTimestamp="2026-03-19 16:59:56 +0000 UTC" firstStartedPulling="2026-03-19 16:59:58.295463333 +0000 UTC m=+1210.417662581" lastFinishedPulling="2026-03-19 17:00:46.831186187 +0000 UTC m=+1258.953385435" observedRunningTime="2026-03-19 17:00:48.059589565 +0000 UTC m=+1260.181788823" watchObservedRunningTime="2026-03-19 17:00:48.066728011 +0000 UTC m=+1260.188927259" Mar 19 17:00:48 crc kubenswrapper[4918]: I0319 17:00:48.171195 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 19 17:00:48 crc kubenswrapper[4918]: I0319 17:00:48.171316 4918 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 17:00:48 crc kubenswrapper[4918]: I0319 17:00:48.191149 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 19 17:00:49 crc kubenswrapper[4918]: I0319 17:00:49.055501 4918 generic.go:334] "Generic (PLEG): container finished" podID="ec2f9e01-6e64-4c5d-93d4-8428ae776a4e" containerID="3adb850ada405a057d34bf29d1f2e7cadee6568d3b1228664b17581b178fdf5a" exitCode=0 Mar 19 17:00:49 crc kubenswrapper[4918]: I0319 17:00:49.055555 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wtk47" event={"ID":"ec2f9e01-6e64-4c5d-93d4-8428ae776a4e","Type":"ContainerDied","Data":"3adb850ada405a057d34bf29d1f2e7cadee6568d3b1228664b17581b178fdf5a"} Mar 19 
17:00:51 crc kubenswrapper[4918]: I0319 17:00:51.087869 4918 scope.go:117] "RemoveContainer" containerID="0f00a8751911b4a10f3eb4db9177ebe9d088cb9b35e95d9a46190cac53dfe477" Mar 19 17:00:52 crc kubenswrapper[4918]: I0319 17:00:52.087618 4918 generic.go:334] "Generic (PLEG): container finished" podID="33152bb1-e526-420f-8dec-7ef80c68b47c" containerID="045157a57de8912063aed8fd4ca2742d4129523126e13f84e24127942f93f280" exitCode=0 Mar 19 17:00:52 crc kubenswrapper[4918]: I0319 17:00:52.087676 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-b5btd" event={"ID":"33152bb1-e526-420f-8dec-7ef80c68b47c","Type":"ContainerDied","Data":"045157a57de8912063aed8fd4ca2742d4129523126e13f84e24127942f93f280"} Mar 19 17:00:52 crc kubenswrapper[4918]: I0319 17:00:52.598708 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wtk47" Mar 19 17:00:52 crc kubenswrapper[4918]: I0319 17:00:52.683223 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec2f9e01-6e64-4c5d-93d4-8428ae776a4e-combined-ca-bundle\") pod \"ec2f9e01-6e64-4c5d-93d4-8428ae776a4e\" (UID: \"ec2f9e01-6e64-4c5d-93d4-8428ae776a4e\") " Mar 19 17:00:52 crc kubenswrapper[4918]: I0319 17:00:52.683348 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k96q5\" (UniqueName: \"kubernetes.io/projected/ec2f9e01-6e64-4c5d-93d4-8428ae776a4e-kube-api-access-k96q5\") pod \"ec2f9e01-6e64-4c5d-93d4-8428ae776a4e\" (UID: \"ec2f9e01-6e64-4c5d-93d4-8428ae776a4e\") " Mar 19 17:00:52 crc kubenswrapper[4918]: I0319 17:00:52.683414 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ec2f9e01-6e64-4c5d-93d4-8428ae776a4e-db-sync-config-data\") pod \"ec2f9e01-6e64-4c5d-93d4-8428ae776a4e\" (UID: 
\"ec2f9e01-6e64-4c5d-93d4-8428ae776a4e\") " Mar 19 17:00:52 crc kubenswrapper[4918]: I0319 17:00:52.689537 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec2f9e01-6e64-4c5d-93d4-8428ae776a4e-kube-api-access-k96q5" (OuterVolumeSpecName: "kube-api-access-k96q5") pod "ec2f9e01-6e64-4c5d-93d4-8428ae776a4e" (UID: "ec2f9e01-6e64-4c5d-93d4-8428ae776a4e"). InnerVolumeSpecName "kube-api-access-k96q5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:00:52 crc kubenswrapper[4918]: I0319 17:00:52.690294 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec2f9e01-6e64-4c5d-93d4-8428ae776a4e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ec2f9e01-6e64-4c5d-93d4-8428ae776a4e" (UID: "ec2f9e01-6e64-4c5d-93d4-8428ae776a4e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:52 crc kubenswrapper[4918]: I0319 17:00:52.713312 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec2f9e01-6e64-4c5d-93d4-8428ae776a4e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec2f9e01-6e64-4c5d-93d4-8428ae776a4e" (UID: "ec2f9e01-6e64-4c5d-93d4-8428ae776a4e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:52 crc kubenswrapper[4918]: I0319 17:00:52.787605 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec2f9e01-6e64-4c5d-93d4-8428ae776a4e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:52 crc kubenswrapper[4918]: I0319 17:00:52.787646 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k96q5\" (UniqueName: \"kubernetes.io/projected/ec2f9e01-6e64-4c5d-93d4-8428ae776a4e-kube-api-access-k96q5\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:52 crc kubenswrapper[4918]: I0319 17:00:52.787659 4918 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ec2f9e01-6e64-4c5d-93d4-8428ae776a4e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.097402 4918 generic.go:334] "Generic (PLEG): container finished" podID="a5df5afd-edbf-49fd-b9b8-35aa33fb5d25" containerID="9c8e8272a1d763e99e059b1a8b10a04c8e73a5d42a2d6c1c37fd09bbc454dfb3" exitCode=0 Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.097463 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-d9qg5" event={"ID":"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25","Type":"ContainerDied","Data":"9c8e8272a1d763e99e059b1a8b10a04c8e73a5d42a2d6c1c37fd09bbc454dfb3"} Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.099113 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-wtk47" Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.102733 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wtk47" event={"ID":"ec2f9e01-6e64-4c5d-93d4-8428ae776a4e","Type":"ContainerDied","Data":"76c6bf5d9173aaf551c687f23f756433602c5db2930083eb2a208d4f32fa5c59"} Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.102795 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76c6bf5d9173aaf551c687f23f756433602c5db2930083eb2a208d4f32fa5c59" Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.565470 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-b5btd" Mar 19 17:00:53 crc kubenswrapper[4918]: E0319 17:00:53.671036 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="5ce9dad6-2fa1-48f8-bd79-b114097ef3be" Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.710888 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/33152bb1-e526-420f-8dec-7ef80c68b47c-db-sync-config-data\") pod \"33152bb1-e526-420f-8dec-7ef80c68b47c\" (UID: \"33152bb1-e526-420f-8dec-7ef80c68b47c\") " Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.710936 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33152bb1-e526-420f-8dec-7ef80c68b47c-config-data\") pod \"33152bb1-e526-420f-8dec-7ef80c68b47c\" (UID: \"33152bb1-e526-420f-8dec-7ef80c68b47c\") " Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.711013 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btqj4\" 
(UniqueName: \"kubernetes.io/projected/33152bb1-e526-420f-8dec-7ef80c68b47c-kube-api-access-btqj4\") pod \"33152bb1-e526-420f-8dec-7ef80c68b47c\" (UID: \"33152bb1-e526-420f-8dec-7ef80c68b47c\") " Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.711054 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33152bb1-e526-420f-8dec-7ef80c68b47c-combined-ca-bundle\") pod \"33152bb1-e526-420f-8dec-7ef80c68b47c\" (UID: \"33152bb1-e526-420f-8dec-7ef80c68b47c\") " Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.711096 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/33152bb1-e526-420f-8dec-7ef80c68b47c-etc-machine-id\") pod \"33152bb1-e526-420f-8dec-7ef80c68b47c\" (UID: \"33152bb1-e526-420f-8dec-7ef80c68b47c\") " Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.711199 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33152bb1-e526-420f-8dec-7ef80c68b47c-scripts\") pod \"33152bb1-e526-420f-8dec-7ef80c68b47c\" (UID: \"33152bb1-e526-420f-8dec-7ef80c68b47c\") " Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.711489 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33152bb1-e526-420f-8dec-7ef80c68b47c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "33152bb1-e526-420f-8dec-7ef80c68b47c" (UID: "33152bb1-e526-420f-8dec-7ef80c68b47c"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.711748 4918 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/33152bb1-e526-420f-8dec-7ef80c68b47c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.714731 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33152bb1-e526-420f-8dec-7ef80c68b47c-kube-api-access-btqj4" (OuterVolumeSpecName: "kube-api-access-btqj4") pod "33152bb1-e526-420f-8dec-7ef80c68b47c" (UID: "33152bb1-e526-420f-8dec-7ef80c68b47c"). InnerVolumeSpecName "kube-api-access-btqj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.718791 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33152bb1-e526-420f-8dec-7ef80c68b47c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "33152bb1-e526-420f-8dec-7ef80c68b47c" (UID: "33152bb1-e526-420f-8dec-7ef80c68b47c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.720624 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33152bb1-e526-420f-8dec-7ef80c68b47c-scripts" (OuterVolumeSpecName: "scripts") pod "33152bb1-e526-420f-8dec-7ef80c68b47c" (UID: "33152bb1-e526-420f-8dec-7ef80c68b47c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.737960 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33152bb1-e526-420f-8dec-7ef80c68b47c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33152bb1-e526-420f-8dec-7ef80c68b47c" (UID: "33152bb1-e526-420f-8dec-7ef80c68b47c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.776881 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33152bb1-e526-420f-8dec-7ef80c68b47c-config-data" (OuterVolumeSpecName: "config-data") pod "33152bb1-e526-420f-8dec-7ef80c68b47c" (UID: "33152bb1-e526-420f-8dec-7ef80c68b47c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.813332 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33152bb1-e526-420f-8dec-7ef80c68b47c-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.813370 4918 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/33152bb1-e526-420f-8dec-7ef80c68b47c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.813382 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33152bb1-e526-420f-8dec-7ef80c68b47c-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.813392 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btqj4\" (UniqueName: \"kubernetes.io/projected/33152bb1-e526-420f-8dec-7ef80c68b47c-kube-api-access-btqj4\") on node \"crc\" DevicePath \"\"" Mar 
19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.813402 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33152bb1-e526-420f-8dec-7ef80c68b47c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.877597 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6649d8ff59-m8z7j"] Mar 19 17:00:53 crc kubenswrapper[4918]: E0319 17:00:53.878076 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33152bb1-e526-420f-8dec-7ef80c68b47c" containerName="cinder-db-sync" Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.878099 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="33152bb1-e526-420f-8dec-7ef80c68b47c" containerName="cinder-db-sync" Mar 19 17:00:53 crc kubenswrapper[4918]: E0319 17:00:53.878135 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fdb44cf-8ddb-4561-8749-702ccf333279" containerName="init" Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.878142 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fdb44cf-8ddb-4561-8749-702ccf333279" containerName="init" Mar 19 17:00:53 crc kubenswrapper[4918]: E0319 17:00:53.878154 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fdb44cf-8ddb-4561-8749-702ccf333279" containerName="dnsmasq-dns" Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.878160 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fdb44cf-8ddb-4561-8749-702ccf333279" containerName="dnsmasq-dns" Mar 19 17:00:53 crc kubenswrapper[4918]: E0319 17:00:53.878172 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec2f9e01-6e64-4c5d-93d4-8428ae776a4e" containerName="barbican-db-sync" Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.878179 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec2f9e01-6e64-4c5d-93d4-8428ae776a4e" containerName="barbican-db-sync" Mar 19 
17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.878351 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec2f9e01-6e64-4c5d-93d4-8428ae776a4e" containerName="barbican-db-sync" Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.878373 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="33152bb1-e526-420f-8dec-7ef80c68b47c" containerName="cinder-db-sync" Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.878389 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fdb44cf-8ddb-4561-8749-702ccf333279" containerName="dnsmasq-dns" Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.879494 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6649d8ff59-m8z7j" Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.882465 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qbhlj" Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.882646 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.882755 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.895039 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6649d8ff59-m8z7j"] Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.961811 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5c845594d6-ndb6n"] Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.963459 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5c845594d6-ndb6n" Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.966755 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 19 17:00:53 crc kubenswrapper[4918]: I0319 17:00:53.987750 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5c845594d6-ndb6n"] Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.003491 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-s4tth"] Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.005163 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-s4tth" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.016693 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wr5x\" (UniqueName: \"kubernetes.io/projected/1552f8f6-8143-42f2-882b-acead175ae14-kube-api-access-5wr5x\") pod \"barbican-worker-6649d8ff59-m8z7j\" (UID: \"1552f8f6-8143-42f2-882b-acead175ae14\") " pod="openstack/barbican-worker-6649d8ff59-m8z7j" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.016788 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1552f8f6-8143-42f2-882b-acead175ae14-config-data-custom\") pod \"barbican-worker-6649d8ff59-m8z7j\" (UID: \"1552f8f6-8143-42f2-882b-acead175ae14\") " pod="openstack/barbican-worker-6649d8ff59-m8z7j" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.016817 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1552f8f6-8143-42f2-882b-acead175ae14-config-data\") pod \"barbican-worker-6649d8ff59-m8z7j\" (UID: \"1552f8f6-8143-42f2-882b-acead175ae14\") 
" pod="openstack/barbican-worker-6649d8ff59-m8z7j" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.016854 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1552f8f6-8143-42f2-882b-acead175ae14-combined-ca-bundle\") pod \"barbican-worker-6649d8ff59-m8z7j\" (UID: \"1552f8f6-8143-42f2-882b-acead175ae14\") " pod="openstack/barbican-worker-6649d8ff59-m8z7j" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.016972 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1552f8f6-8143-42f2-882b-acead175ae14-logs\") pod \"barbican-worker-6649d8ff59-m8z7j\" (UID: \"1552f8f6-8143-42f2-882b-acead175ae14\") " pod="openstack/barbican-worker-6649d8ff59-m8z7j" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.020277 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-s4tth"] Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.093725 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-846b889554-z7r6b"] Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.095381 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-846b889554-z7r6b" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.099696 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.114651 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-846b889554-z7r6b"] Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.119163 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-config\") pod \"dnsmasq-dns-85ff748b95-s4tth\" (UID: \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\") " pod="openstack/dnsmasq-dns-85ff748b95-s4tth" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.119204 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e6ac727-0f36-47d5-a77d-e21f590089e1-config-data\") pod \"barbican-keystone-listener-5c845594d6-ndb6n\" (UID: \"0e6ac727-0f36-47d5-a77d-e21f590089e1\") " pod="openstack/barbican-keystone-listener-5c845594d6-ndb6n" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.119233 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76f6v\" (UniqueName: \"kubernetes.io/projected/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-kube-api-access-76f6v\") pod \"dnsmasq-dns-85ff748b95-s4tth\" (UID: \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\") " pod="openstack/dnsmasq-dns-85ff748b95-s4tth" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.119253 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e6ac727-0f36-47d5-a77d-e21f590089e1-config-data-custom\") pod \"barbican-keystone-listener-5c845594d6-ndb6n\" (UID: 
\"0e6ac727-0f36-47d5-a77d-e21f590089e1\") " pod="openstack/barbican-keystone-listener-5c845594d6-ndb6n" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.119284 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-s4tth\" (UID: \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\") " pod="openstack/dnsmasq-dns-85ff748b95-s4tth" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.119340 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1552f8f6-8143-42f2-882b-acead175ae14-logs\") pod \"barbican-worker-6649d8ff59-m8z7j\" (UID: \"1552f8f6-8143-42f2-882b-acead175ae14\") " pod="openstack/barbican-worker-6649d8ff59-m8z7j" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.119360 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-s4tth\" (UID: \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\") " pod="openstack/dnsmasq-dns-85ff748b95-s4tth" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.119376 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wr5x\" (UniqueName: \"kubernetes.io/projected/1552f8f6-8143-42f2-882b-acead175ae14-kube-api-access-5wr5x\") pod \"barbican-worker-6649d8ff59-m8z7j\" (UID: \"1552f8f6-8143-42f2-882b-acead175ae14\") " pod="openstack/barbican-worker-6649d8ff59-m8z7j" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.119399 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e6ac727-0f36-47d5-a77d-e21f590089e1-logs\") pod 
\"barbican-keystone-listener-5c845594d6-ndb6n\" (UID: \"0e6ac727-0f36-47d5-a77d-e21f590089e1\") " pod="openstack/barbican-keystone-listener-5c845594d6-ndb6n" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.119420 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpxgv\" (UniqueName: \"kubernetes.io/projected/0e6ac727-0f36-47d5-a77d-e21f590089e1-kube-api-access-kpxgv\") pod \"barbican-keystone-listener-5c845594d6-ndb6n\" (UID: \"0e6ac727-0f36-47d5-a77d-e21f590089e1\") " pod="openstack/barbican-keystone-listener-5c845594d6-ndb6n" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.119452 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-dns-svc\") pod \"dnsmasq-dns-85ff748b95-s4tth\" (UID: \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\") " pod="openstack/dnsmasq-dns-85ff748b95-s4tth" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.119482 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1552f8f6-8143-42f2-882b-acead175ae14-config-data-custom\") pod \"barbican-worker-6649d8ff59-m8z7j\" (UID: \"1552f8f6-8143-42f2-882b-acead175ae14\") " pod="openstack/barbican-worker-6649d8ff59-m8z7j" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.119505 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1552f8f6-8143-42f2-882b-acead175ae14-config-data\") pod \"barbican-worker-6649d8ff59-m8z7j\" (UID: \"1552f8f6-8143-42f2-882b-acead175ae14\") " pod="openstack/barbican-worker-6649d8ff59-m8z7j" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.119527 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-s4tth\" (UID: \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\") " pod="openstack/dnsmasq-dns-85ff748b95-s4tth" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.119581 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1552f8f6-8143-42f2-882b-acead175ae14-combined-ca-bundle\") pod \"barbican-worker-6649d8ff59-m8z7j\" (UID: \"1552f8f6-8143-42f2-882b-acead175ae14\") " pod="openstack/barbican-worker-6649d8ff59-m8z7j" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.119610 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e6ac727-0f36-47d5-a77d-e21f590089e1-combined-ca-bundle\") pod \"barbican-keystone-listener-5c845594d6-ndb6n\" (UID: \"0e6ac727-0f36-47d5-a77d-e21f590089e1\") " pod="openstack/barbican-keystone-listener-5c845594d6-ndb6n" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.120243 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ce9dad6-2fa1-48f8-bd79-b114097ef3be","Type":"ContainerStarted","Data":"a01d3d35daa0cb9badd3c202d3b6dc2d71a3eb216dbf83b77e4282abdb743fb1"} Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.120302 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1552f8f6-8143-42f2-882b-acead175ae14-logs\") pod \"barbican-worker-6649d8ff59-m8z7j\" (UID: \"1552f8f6-8143-42f2-882b-acead175ae14\") " pod="openstack/barbican-worker-6649d8ff59-m8z7j" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.120388 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ce9dad6-2fa1-48f8-bd79-b114097ef3be" containerName="ceilometer-notification-agent" 
containerID="cri-o://b9fc8758fc342d9ddd2492536ce75a17bb4e2d46aed7531a7ca328ee9aa7ab44" gracePeriod=30 Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.120590 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.120841 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ce9dad6-2fa1-48f8-bd79-b114097ef3be" containerName="proxy-httpd" containerID="cri-o://a01d3d35daa0cb9badd3c202d3b6dc2d71a3eb216dbf83b77e4282abdb743fb1" gracePeriod=30 Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.120889 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ce9dad6-2fa1-48f8-bd79-b114097ef3be" containerName="sg-core" containerID="cri-o://997a5f2cff248f4eab80aca914467bbd3c1604c4bce69e8ec57cffbc9a266a79" gracePeriod=30 Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.129244 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1552f8f6-8143-42f2-882b-acead175ae14-config-data\") pod \"barbican-worker-6649d8ff59-m8z7j\" (UID: \"1552f8f6-8143-42f2-882b-acead175ae14\") " pod="openstack/barbican-worker-6649d8ff59-m8z7j" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.136794 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1552f8f6-8143-42f2-882b-acead175ae14-combined-ca-bundle\") pod \"barbican-worker-6649d8ff59-m8z7j\" (UID: \"1552f8f6-8143-42f2-882b-acead175ae14\") " pod="openstack/barbican-worker-6649d8ff59-m8z7j" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.139091 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-b5btd" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.141528 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-b5btd" event={"ID":"33152bb1-e526-420f-8dec-7ef80c68b47c","Type":"ContainerDied","Data":"7a5fb1a58f9b5bdf45775c9843b2e4f45ea7157e76ac738a453272c7f5bb514b"} Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.141592 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a5fb1a58f9b5bdf45775c9843b2e4f45ea7157e76ac738a453272c7f5bb514b" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.154081 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1552f8f6-8143-42f2-882b-acead175ae14-config-data-custom\") pod \"barbican-worker-6649d8ff59-m8z7j\" (UID: \"1552f8f6-8143-42f2-882b-acead175ae14\") " pod="openstack/barbican-worker-6649d8ff59-m8z7j" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.161347 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wr5x\" (UniqueName: \"kubernetes.io/projected/1552f8f6-8143-42f2-882b-acead175ae14-kube-api-access-5wr5x\") pod \"barbican-worker-6649d8ff59-m8z7j\" (UID: \"1552f8f6-8143-42f2-882b-acead175ae14\") " pod="openstack/barbican-worker-6649d8ff59-m8z7j" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.212681 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6649d8ff59-m8z7j" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.225264 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-s4tth\" (UID: \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\") " pod="openstack/dnsmasq-dns-85ff748b95-s4tth" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.225321 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e6ac727-0f36-47d5-a77d-e21f590089e1-logs\") pod \"barbican-keystone-listener-5c845594d6-ndb6n\" (UID: \"0e6ac727-0f36-47d5-a77d-e21f590089e1\") " pod="openstack/barbican-keystone-listener-5c845594d6-ndb6n" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.225349 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpxgv\" (UniqueName: \"kubernetes.io/projected/0e6ac727-0f36-47d5-a77d-e21f590089e1-kube-api-access-kpxgv\") pod \"barbican-keystone-listener-5c845594d6-ndb6n\" (UID: \"0e6ac727-0f36-47d5-a77d-e21f590089e1\") " pod="openstack/barbican-keystone-listener-5c845594d6-ndb6n" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.225380 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf259f2d-395e-4d36-bdc0-2c01310e24e8-logs\") pod \"barbican-api-846b889554-z7r6b\" (UID: \"bf259f2d-395e-4d36-bdc0-2c01310e24e8\") " pod="openstack/barbican-api-846b889554-z7r6b" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.225408 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-dns-svc\") pod \"dnsmasq-dns-85ff748b95-s4tth\" (UID: 
\"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\") " pod="openstack/dnsmasq-dns-85ff748b95-s4tth" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.225460 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-s4tth\" (UID: \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\") " pod="openstack/dnsmasq-dns-85ff748b95-s4tth" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.225521 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e6ac727-0f36-47d5-a77d-e21f590089e1-combined-ca-bundle\") pod \"barbican-keystone-listener-5c845594d6-ndb6n\" (UID: \"0e6ac727-0f36-47d5-a77d-e21f590089e1\") " pod="openstack/barbican-keystone-listener-5c845594d6-ndb6n" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.225559 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf259f2d-395e-4d36-bdc0-2c01310e24e8-config-data-custom\") pod \"barbican-api-846b889554-z7r6b\" (UID: \"bf259f2d-395e-4d36-bdc0-2c01310e24e8\") " pod="openstack/barbican-api-846b889554-z7r6b" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.225587 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e6ac727-0f36-47d5-a77d-e21f590089e1-config-data\") pod \"barbican-keystone-listener-5c845594d6-ndb6n\" (UID: \"0e6ac727-0f36-47d5-a77d-e21f590089e1\") " pod="openstack/barbican-keystone-listener-5c845594d6-ndb6n" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.225609 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-config\") pod 
\"dnsmasq-dns-85ff748b95-s4tth\" (UID: \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\") " pod="openstack/dnsmasq-dns-85ff748b95-s4tth" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.225634 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76f6v\" (UniqueName: \"kubernetes.io/projected/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-kube-api-access-76f6v\") pod \"dnsmasq-dns-85ff748b95-s4tth\" (UID: \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\") " pod="openstack/dnsmasq-dns-85ff748b95-s4tth" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.225657 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e6ac727-0f36-47d5-a77d-e21f590089e1-config-data-custom\") pod \"barbican-keystone-listener-5c845594d6-ndb6n\" (UID: \"0e6ac727-0f36-47d5-a77d-e21f590089e1\") " pod="openstack/barbican-keystone-listener-5c845594d6-ndb6n" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.225681 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf259f2d-395e-4d36-bdc0-2c01310e24e8-combined-ca-bundle\") pod \"barbican-api-846b889554-z7r6b\" (UID: \"bf259f2d-395e-4d36-bdc0-2c01310e24e8\") " pod="openstack/barbican-api-846b889554-z7r6b" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.225715 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-s4tth\" (UID: \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\") " pod="openstack/dnsmasq-dns-85ff748b95-s4tth" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.225743 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkqq7\" (UniqueName: 
\"kubernetes.io/projected/bf259f2d-395e-4d36-bdc0-2c01310e24e8-kube-api-access-lkqq7\") pod \"barbican-api-846b889554-z7r6b\" (UID: \"bf259f2d-395e-4d36-bdc0-2c01310e24e8\") " pod="openstack/barbican-api-846b889554-z7r6b" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.225788 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf259f2d-395e-4d36-bdc0-2c01310e24e8-config-data\") pod \"barbican-api-846b889554-z7r6b\" (UID: \"bf259f2d-395e-4d36-bdc0-2c01310e24e8\") " pod="openstack/barbican-api-846b889554-z7r6b" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.226328 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e6ac727-0f36-47d5-a77d-e21f590089e1-logs\") pod \"barbican-keystone-listener-5c845594d6-ndb6n\" (UID: \"0e6ac727-0f36-47d5-a77d-e21f590089e1\") " pod="openstack/barbican-keystone-listener-5c845594d6-ndb6n" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.226817 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-s4tth\" (UID: \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\") " pod="openstack/dnsmasq-dns-85ff748b95-s4tth" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.227218 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-dns-svc\") pod \"dnsmasq-dns-85ff748b95-s4tth\" (UID: \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\") " pod="openstack/dnsmasq-dns-85ff748b95-s4tth" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.227775 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-s4tth\" (UID: \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\") " pod="openstack/dnsmasq-dns-85ff748b95-s4tth" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.228059 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-s4tth\" (UID: \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\") " pod="openstack/dnsmasq-dns-85ff748b95-s4tth" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.232817 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e6ac727-0f36-47d5-a77d-e21f590089e1-combined-ca-bundle\") pod \"barbican-keystone-listener-5c845594d6-ndb6n\" (UID: \"0e6ac727-0f36-47d5-a77d-e21f590089e1\") " pod="openstack/barbican-keystone-listener-5c845594d6-ndb6n" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.233587 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e6ac727-0f36-47d5-a77d-e21f590089e1-config-data-custom\") pod \"barbican-keystone-listener-5c845594d6-ndb6n\" (UID: \"0e6ac727-0f36-47d5-a77d-e21f590089e1\") " pod="openstack/barbican-keystone-listener-5c845594d6-ndb6n" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.238960 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e6ac727-0f36-47d5-a77d-e21f590089e1-config-data\") pod \"barbican-keystone-listener-5c845594d6-ndb6n\" (UID: \"0e6ac727-0f36-47d5-a77d-e21f590089e1\") " pod="openstack/barbican-keystone-listener-5c845594d6-ndb6n" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.242776 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-config\") pod \"dnsmasq-dns-85ff748b95-s4tth\" (UID: \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\") " pod="openstack/dnsmasq-dns-85ff748b95-s4tth" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.249780 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpxgv\" (UniqueName: \"kubernetes.io/projected/0e6ac727-0f36-47d5-a77d-e21f590089e1-kube-api-access-kpxgv\") pod \"barbican-keystone-listener-5c845594d6-ndb6n\" (UID: \"0e6ac727-0f36-47d5-a77d-e21f590089e1\") " pod="openstack/barbican-keystone-listener-5c845594d6-ndb6n" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.251914 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76f6v\" (UniqueName: \"kubernetes.io/projected/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-kube-api-access-76f6v\") pod \"dnsmasq-dns-85ff748b95-s4tth\" (UID: \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\") " pod="openstack/dnsmasq-dns-85ff748b95-s4tth" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.293758 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5c845594d6-ndb6n" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.327874 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkqq7\" (UniqueName: \"kubernetes.io/projected/bf259f2d-395e-4d36-bdc0-2c01310e24e8-kube-api-access-lkqq7\") pod \"barbican-api-846b889554-z7r6b\" (UID: \"bf259f2d-395e-4d36-bdc0-2c01310e24e8\") " pod="openstack/barbican-api-846b889554-z7r6b" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.327949 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf259f2d-395e-4d36-bdc0-2c01310e24e8-config-data\") pod \"barbican-api-846b889554-z7r6b\" (UID: \"bf259f2d-395e-4d36-bdc0-2c01310e24e8\") " pod="openstack/barbican-api-846b889554-z7r6b" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.328009 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf259f2d-395e-4d36-bdc0-2c01310e24e8-logs\") pod \"barbican-api-846b889554-z7r6b\" (UID: \"bf259f2d-395e-4d36-bdc0-2c01310e24e8\") " pod="openstack/barbican-api-846b889554-z7r6b" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.328087 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf259f2d-395e-4d36-bdc0-2c01310e24e8-config-data-custom\") pod \"barbican-api-846b889554-z7r6b\" (UID: \"bf259f2d-395e-4d36-bdc0-2c01310e24e8\") " pod="openstack/barbican-api-846b889554-z7r6b" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.328121 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf259f2d-395e-4d36-bdc0-2c01310e24e8-combined-ca-bundle\") pod \"barbican-api-846b889554-z7r6b\" (UID: \"bf259f2d-395e-4d36-bdc0-2c01310e24e8\") " 
pod="openstack/barbican-api-846b889554-z7r6b" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.329777 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf259f2d-395e-4d36-bdc0-2c01310e24e8-logs\") pod \"barbican-api-846b889554-z7r6b\" (UID: \"bf259f2d-395e-4d36-bdc0-2c01310e24e8\") " pod="openstack/barbican-api-846b889554-z7r6b" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.338106 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-s4tth" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.346873 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf259f2d-395e-4d36-bdc0-2c01310e24e8-config-data-custom\") pod \"barbican-api-846b889554-z7r6b\" (UID: \"bf259f2d-395e-4d36-bdc0-2c01310e24e8\") " pod="openstack/barbican-api-846b889554-z7r6b" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.351970 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf259f2d-395e-4d36-bdc0-2c01310e24e8-combined-ca-bundle\") pod \"barbican-api-846b889554-z7r6b\" (UID: \"bf259f2d-395e-4d36-bdc0-2c01310e24e8\") " pod="openstack/barbican-api-846b889554-z7r6b" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.353030 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf259f2d-395e-4d36-bdc0-2c01310e24e8-config-data\") pod \"barbican-api-846b889554-z7r6b\" (UID: \"bf259f2d-395e-4d36-bdc0-2c01310e24e8\") " pod="openstack/barbican-api-846b889554-z7r6b" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.356359 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.357886 4918 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.362268 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.362440 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.362459 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-h4t22" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.362564 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.362604 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkqq7\" (UniqueName: \"kubernetes.io/projected/bf259f2d-395e-4d36-bdc0-2c01310e24e8-kube-api-access-lkqq7\") pod \"barbican-api-846b889554-z7r6b\" (UID: \"bf259f2d-395e-4d36-bdc0-2c01310e24e8\") " pod="openstack/barbican-api-846b889554-z7r6b" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.376925 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.426339 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-846b889554-z7r6b" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.452793 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23bad2c2-b869-49db-9a5b-9dc2ad887973-config-data\") pod \"cinder-scheduler-0\" (UID: \"23bad2c2-b869-49db-9a5b-9dc2ad887973\") " pod="openstack/cinder-scheduler-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.453139 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8prb\" (UniqueName: \"kubernetes.io/projected/23bad2c2-b869-49db-9a5b-9dc2ad887973-kube-api-access-m8prb\") pod \"cinder-scheduler-0\" (UID: \"23bad2c2-b869-49db-9a5b-9dc2ad887973\") " pod="openstack/cinder-scheduler-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.453639 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23bad2c2-b869-49db-9a5b-9dc2ad887973-scripts\") pod \"cinder-scheduler-0\" (UID: \"23bad2c2-b869-49db-9a5b-9dc2ad887973\") " pod="openstack/cinder-scheduler-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.455345 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23bad2c2-b869-49db-9a5b-9dc2ad887973-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"23bad2c2-b869-49db-9a5b-9dc2ad887973\") " pod="openstack/cinder-scheduler-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.455559 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23bad2c2-b869-49db-9a5b-9dc2ad887973-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"23bad2c2-b869-49db-9a5b-9dc2ad887973\") " 
pod="openstack/cinder-scheduler-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.456123 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23bad2c2-b869-49db-9a5b-9dc2ad887973-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"23bad2c2-b869-49db-9a5b-9dc2ad887973\") " pod="openstack/cinder-scheduler-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.567091 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23bad2c2-b869-49db-9a5b-9dc2ad887973-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"23bad2c2-b869-49db-9a5b-9dc2ad887973\") " pod="openstack/cinder-scheduler-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.567392 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23bad2c2-b869-49db-9a5b-9dc2ad887973-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"23bad2c2-b869-49db-9a5b-9dc2ad887973\") " pod="openstack/cinder-scheduler-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.569843 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23bad2c2-b869-49db-9a5b-9dc2ad887973-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"23bad2c2-b869-49db-9a5b-9dc2ad887973\") " pod="openstack/cinder-scheduler-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.569976 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23bad2c2-b869-49db-9a5b-9dc2ad887973-config-data\") pod \"cinder-scheduler-0\" (UID: \"23bad2c2-b869-49db-9a5b-9dc2ad887973\") " pod="openstack/cinder-scheduler-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.569997 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m8prb\" (UniqueName: \"kubernetes.io/projected/23bad2c2-b869-49db-9a5b-9dc2ad887973-kube-api-access-m8prb\") pod \"cinder-scheduler-0\" (UID: \"23bad2c2-b869-49db-9a5b-9dc2ad887973\") " pod="openstack/cinder-scheduler-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.570077 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23bad2c2-b869-49db-9a5b-9dc2ad887973-scripts\") pod \"cinder-scheduler-0\" (UID: \"23bad2c2-b869-49db-9a5b-9dc2ad887973\") " pod="openstack/cinder-scheduler-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.580125 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23bad2c2-b869-49db-9a5b-9dc2ad887973-scripts\") pod \"cinder-scheduler-0\" (UID: \"23bad2c2-b869-49db-9a5b-9dc2ad887973\") " pod="openstack/cinder-scheduler-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.580436 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23bad2c2-b869-49db-9a5b-9dc2ad887973-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"23bad2c2-b869-49db-9a5b-9dc2ad887973\") " pod="openstack/cinder-scheduler-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.591800 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23bad2c2-b869-49db-9a5b-9dc2ad887973-config-data\") pod \"cinder-scheduler-0\" (UID: \"23bad2c2-b869-49db-9a5b-9dc2ad887973\") " pod="openstack/cinder-scheduler-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.600259 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23bad2c2-b869-49db-9a5b-9dc2ad887973-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"23bad2c2-b869-49db-9a5b-9dc2ad887973\") " pod="openstack/cinder-scheduler-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.620732 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8prb\" (UniqueName: \"kubernetes.io/projected/23bad2c2-b869-49db-9a5b-9dc2ad887973-kube-api-access-m8prb\") pod \"cinder-scheduler-0\" (UID: \"23bad2c2-b869-49db-9a5b-9dc2ad887973\") " pod="openstack/cinder-scheduler-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.629804 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23bad2c2-b869-49db-9a5b-9dc2ad887973-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"23bad2c2-b869-49db-9a5b-9dc2ad887973\") " pod="openstack/cinder-scheduler-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.634977 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-s4tth"] Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.642096 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-cntn9"] Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.646505 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.661506 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-cntn9"] Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.690438 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.704922 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.712066 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.715192 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.722481 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.783823 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-config\") pod \"dnsmasq-dns-5c9776ccc5-cntn9\" (UID: \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.784058 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/178a9ae2-1774-4025-8951-93167e95f5d7-scripts\") pod \"cinder-api-0\" (UID: \"178a9ae2-1774-4025-8951-93167e95f5d7\") " pod="openstack/cinder-api-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.784101 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/178a9ae2-1774-4025-8951-93167e95f5d7-config-data-custom\") pod \"cinder-api-0\" (UID: \"178a9ae2-1774-4025-8951-93167e95f5d7\") " pod="openstack/cinder-api-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.784123 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-cntn9\" (UID: \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.784153 4918 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dddwb\" (UniqueName: \"kubernetes.io/projected/71b991d8-dce5-4482-90c2-b904a5f6eb0e-kube-api-access-dddwb\") pod \"dnsmasq-dns-5c9776ccc5-cntn9\" (UID: \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.784207 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/178a9ae2-1774-4025-8951-93167e95f5d7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"178a9ae2-1774-4025-8951-93167e95f5d7\") " pod="openstack/cinder-api-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.784245 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/178a9ae2-1774-4025-8951-93167e95f5d7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"178a9ae2-1774-4025-8951-93167e95f5d7\") " pod="openstack/cinder-api-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.784283 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/178a9ae2-1774-4025-8951-93167e95f5d7-config-data\") pod \"cinder-api-0\" (UID: \"178a9ae2-1774-4025-8951-93167e95f5d7\") " pod="openstack/cinder-api-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.784309 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-cntn9\" (UID: \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.784357 4918 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/178a9ae2-1774-4025-8951-93167e95f5d7-logs\") pod \"cinder-api-0\" (UID: \"178a9ae2-1774-4025-8951-93167e95f5d7\") " pod="openstack/cinder-api-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.784380 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-cntn9\" (UID: \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.784399 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw4nt\" (UniqueName: \"kubernetes.io/projected/178a9ae2-1774-4025-8951-93167e95f5d7-kube-api-access-nw4nt\") pod \"cinder-api-0\" (UID: \"178a9ae2-1774-4025-8951-93167e95f5d7\") " pod="openstack/cinder-api-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.784447 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-cntn9\" (UID: \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.784793 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-d9qg5" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.885799 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-scripts\") pod \"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25\" (UID: \"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25\") " Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.885846 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt5j7\" (UniqueName: \"kubernetes.io/projected/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-kube-api-access-lt5j7\") pod \"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25\" (UID: \"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25\") " Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.886045 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-combined-ca-bundle\") pod \"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25\" (UID: \"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25\") " Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.886093 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-certs\") pod \"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25\" (UID: \"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25\") " Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.886114 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-config-data\") pod \"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25\" (UID: \"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25\") " Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.886413 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/178a9ae2-1774-4025-8951-93167e95f5d7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"178a9ae2-1774-4025-8951-93167e95f5d7\") " pod="openstack/cinder-api-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.886459 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/178a9ae2-1774-4025-8951-93167e95f5d7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"178a9ae2-1774-4025-8951-93167e95f5d7\") " pod="openstack/cinder-api-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.886478 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/178a9ae2-1774-4025-8951-93167e95f5d7-config-data\") pod \"cinder-api-0\" (UID: \"178a9ae2-1774-4025-8951-93167e95f5d7\") " pod="openstack/cinder-api-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.886506 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-cntn9\" (UID: \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.886540 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/178a9ae2-1774-4025-8951-93167e95f5d7-logs\") pod \"cinder-api-0\" (UID: \"178a9ae2-1774-4025-8951-93167e95f5d7\") " pod="openstack/cinder-api-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.886573 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-cntn9\" (UID: \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\") " 
pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.886596 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw4nt\" (UniqueName: \"kubernetes.io/projected/178a9ae2-1774-4025-8951-93167e95f5d7-kube-api-access-nw4nt\") pod \"cinder-api-0\" (UID: \"178a9ae2-1774-4025-8951-93167e95f5d7\") " pod="openstack/cinder-api-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.886633 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-cntn9\" (UID: \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.886684 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-config\") pod \"dnsmasq-dns-5c9776ccc5-cntn9\" (UID: \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.886709 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/178a9ae2-1774-4025-8951-93167e95f5d7-scripts\") pod \"cinder-api-0\" (UID: \"178a9ae2-1774-4025-8951-93167e95f5d7\") " pod="openstack/cinder-api-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.886722 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/178a9ae2-1774-4025-8951-93167e95f5d7-config-data-custom\") pod \"cinder-api-0\" (UID: \"178a9ae2-1774-4025-8951-93167e95f5d7\") " pod="openstack/cinder-api-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.886744 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-cntn9\" (UID: \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.886771 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dddwb\" (UniqueName: \"kubernetes.io/projected/71b991d8-dce5-4482-90c2-b904a5f6eb0e-kube-api-access-dddwb\") pod \"dnsmasq-dns-5c9776ccc5-cntn9\" (UID: \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.887378 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/178a9ae2-1774-4025-8951-93167e95f5d7-logs\") pod \"cinder-api-0\" (UID: \"178a9ae2-1774-4025-8951-93167e95f5d7\") " pod="openstack/cinder-api-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.888002 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-cntn9\" (UID: \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.888073 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/178a9ae2-1774-4025-8951-93167e95f5d7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"178a9ae2-1774-4025-8951-93167e95f5d7\") " pod="openstack/cinder-api-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.888867 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-cntn9\" (UID: \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.889385 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-cntn9\" (UID: \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.890284 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-config\") pod \"dnsmasq-dns-5c9776ccc5-cntn9\" (UID: \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.894468 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-certs" (OuterVolumeSpecName: "certs") pod "a5df5afd-edbf-49fd-b9b8-35aa33fb5d25" (UID: "a5df5afd-edbf-49fd-b9b8-35aa33fb5d25"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.895708 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-cntn9\" (UID: \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.910315 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/178a9ae2-1774-4025-8951-93167e95f5d7-config-data-custom\") pod \"cinder-api-0\" (UID: \"178a9ae2-1774-4025-8951-93167e95f5d7\") " pod="openstack/cinder-api-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.910894 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-scripts" (OuterVolumeSpecName: "scripts") pod "a5df5afd-edbf-49fd-b9b8-35aa33fb5d25" (UID: "a5df5afd-edbf-49fd-b9b8-35aa33fb5d25"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.911561 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/178a9ae2-1774-4025-8951-93167e95f5d7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"178a9ae2-1774-4025-8951-93167e95f5d7\") " pod="openstack/cinder-api-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.912326 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/178a9ae2-1774-4025-8951-93167e95f5d7-scripts\") pod \"cinder-api-0\" (UID: \"178a9ae2-1774-4025-8951-93167e95f5d7\") " pod="openstack/cinder-api-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.915117 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw4nt\" (UniqueName: \"kubernetes.io/projected/178a9ae2-1774-4025-8951-93167e95f5d7-kube-api-access-nw4nt\") pod \"cinder-api-0\" (UID: \"178a9ae2-1774-4025-8951-93167e95f5d7\") " pod="openstack/cinder-api-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.916351 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/178a9ae2-1774-4025-8951-93167e95f5d7-config-data\") pod \"cinder-api-0\" (UID: \"178a9ae2-1774-4025-8951-93167e95f5d7\") " pod="openstack/cinder-api-0" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.949348 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6649d8ff59-m8z7j"] Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.955346 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dddwb\" (UniqueName: \"kubernetes.io/projected/71b991d8-dce5-4482-90c2-b904a5f6eb0e-kube-api-access-dddwb\") pod \"dnsmasq-dns-5c9776ccc5-cntn9\" (UID: \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\") " 
pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.955590 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-kube-api-access-lt5j7" (OuterVolumeSpecName: "kube-api-access-lt5j7") pod "a5df5afd-edbf-49fd-b9b8-35aa33fb5d25" (UID: "a5df5afd-edbf-49fd-b9b8-35aa33fb5d25"). InnerVolumeSpecName "kube-api-access-lt5j7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.972429 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-config-data" (OuterVolumeSpecName: "config-data") pod "a5df5afd-edbf-49fd-b9b8-35aa33fb5d25" (UID: "a5df5afd-edbf-49fd-b9b8-35aa33fb5d25"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.988342 4918 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-certs\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.988370 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.988381 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:54 crc kubenswrapper[4918]: I0319 17:00:54.988391 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt5j7\" (UniqueName: \"kubernetes.io/projected/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-kube-api-access-lt5j7\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:54 crc 
kubenswrapper[4918]: I0319 17:00:54.997565 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5df5afd-edbf-49fd-b9b8-35aa33fb5d25" (UID: "a5df5afd-edbf-49fd-b9b8-35aa33fb5d25"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.067330 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.090622 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.174447 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.182962 4918 generic.go:334] "Generic (PLEG): container finished" podID="5ce9dad6-2fa1-48f8-bd79-b114097ef3be" containerID="a01d3d35daa0cb9badd3c202d3b6dc2d71a3eb216dbf83b77e4282abdb743fb1" exitCode=0 Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.182990 4918 generic.go:334] "Generic (PLEG): container finished" podID="5ce9dad6-2fa1-48f8-bd79-b114097ef3be" containerID="997a5f2cff248f4eab80aca914467bbd3c1604c4bce69e8ec57cffbc9a266a79" exitCode=2 Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.183058 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ce9dad6-2fa1-48f8-bd79-b114097ef3be","Type":"ContainerDied","Data":"a01d3d35daa0cb9badd3c202d3b6dc2d71a3eb216dbf83b77e4282abdb743fb1"} Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.183082 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"5ce9dad6-2fa1-48f8-bd79-b114097ef3be","Type":"ContainerDied","Data":"997a5f2cff248f4eab80aca914467bbd3c1604c4bce69e8ec57cffbc9a266a79"} Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.184702 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6649d8ff59-m8z7j" event={"ID":"1552f8f6-8143-42f2-882b-acead175ae14","Type":"ContainerStarted","Data":"adb675d55163dda046d01a126524176ec22e800f7b3ad48532f0f78429f3d098"} Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.186138 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-d9qg5" event={"ID":"a5df5afd-edbf-49fd-b9b8-35aa33fb5d25","Type":"ContainerDied","Data":"263dfb542e1c7c9a0d33fea597df87e4895d3657e667255f196609788d2c1fc8"} Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.186157 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="263dfb542e1c7c9a0d33fea597df87e4895d3657e667255f196609788d2c1fc8" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.186202 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-d9qg5" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.246609 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-x77xk"] Mar 19 17:00:55 crc kubenswrapper[4918]: E0319 17:00:55.247090 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5df5afd-edbf-49fd-b9b8-35aa33fb5d25" containerName="cloudkitty-db-sync" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.247107 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5df5afd-edbf-49fd-b9b8-35aa33fb5d25" containerName="cloudkitty-db-sync" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.247285 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5df5afd-edbf-49fd-b9b8-35aa33fb5d25" containerName="cloudkitty-db-sync" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.255202 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-x77xk" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.256873 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-x77xk"] Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.260106 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.260236 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.260370 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.260557 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-kxbs6" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.260732 4918 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cert-cloudkitty-client-internal" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.285925 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-846b889554-z7r6b"] Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.299896 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5c845594d6-ndb6n"] Mar 19 17:00:55 crc kubenswrapper[4918]: W0319 17:00:55.318595 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e6ac727_0f36_47d5_a77d_e21f590089e1.slice/crio-718d6c8bfc598e4fc39210c8f6bd110d0c5ca2ca218db0d02243b89e1f3a7913 WatchSource:0}: Error finding container 718d6c8bfc598e4fc39210c8f6bd110d0c5ca2ca218db0d02243b89e1f3a7913: Status 404 returned error can't find the container with id 718d6c8bfc598e4fc39210c8f6bd110d0c5ca2ca218db0d02243b89e1f3a7913 Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.400312 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84e41394-829e-4305-a4cf-0e35a37839a7-scripts\") pod \"cloudkitty-storageinit-x77xk\" (UID: \"84e41394-829e-4305-a4cf-0e35a37839a7\") " pod="openstack/cloudkitty-storageinit-x77xk" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.400409 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/84e41394-829e-4305-a4cf-0e35a37839a7-certs\") pod \"cloudkitty-storageinit-x77xk\" (UID: \"84e41394-829e-4305-a4cf-0e35a37839a7\") " pod="openstack/cloudkitty-storageinit-x77xk" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.400499 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8xgh\" (UniqueName: 
\"kubernetes.io/projected/84e41394-829e-4305-a4cf-0e35a37839a7-kube-api-access-f8xgh\") pod \"cloudkitty-storageinit-x77xk\" (UID: \"84e41394-829e-4305-a4cf-0e35a37839a7\") " pod="openstack/cloudkitty-storageinit-x77xk" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.400529 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84e41394-829e-4305-a4cf-0e35a37839a7-config-data\") pod \"cloudkitty-storageinit-x77xk\" (UID: \"84e41394-829e-4305-a4cf-0e35a37839a7\") " pod="openstack/cloudkitty-storageinit-x77xk" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.400567 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84e41394-829e-4305-a4cf-0e35a37839a7-combined-ca-bundle\") pod \"cloudkitty-storageinit-x77xk\" (UID: \"84e41394-829e-4305-a4cf-0e35a37839a7\") " pod="openstack/cloudkitty-storageinit-x77xk" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.503724 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8xgh\" (UniqueName: \"kubernetes.io/projected/84e41394-829e-4305-a4cf-0e35a37839a7-kube-api-access-f8xgh\") pod \"cloudkitty-storageinit-x77xk\" (UID: \"84e41394-829e-4305-a4cf-0e35a37839a7\") " pod="openstack/cloudkitty-storageinit-x77xk" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.504038 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84e41394-829e-4305-a4cf-0e35a37839a7-config-data\") pod \"cloudkitty-storageinit-x77xk\" (UID: \"84e41394-829e-4305-a4cf-0e35a37839a7\") " pod="openstack/cloudkitty-storageinit-x77xk" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.504072 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/84e41394-829e-4305-a4cf-0e35a37839a7-combined-ca-bundle\") pod \"cloudkitty-storageinit-x77xk\" (UID: \"84e41394-829e-4305-a4cf-0e35a37839a7\") " pod="openstack/cloudkitty-storageinit-x77xk" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.504091 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84e41394-829e-4305-a4cf-0e35a37839a7-scripts\") pod \"cloudkitty-storageinit-x77xk\" (UID: \"84e41394-829e-4305-a4cf-0e35a37839a7\") " pod="openstack/cloudkitty-storageinit-x77xk" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.504155 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/84e41394-829e-4305-a4cf-0e35a37839a7-certs\") pod \"cloudkitty-storageinit-x77xk\" (UID: \"84e41394-829e-4305-a4cf-0e35a37839a7\") " pod="openstack/cloudkitty-storageinit-x77xk" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.510533 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84e41394-829e-4305-a4cf-0e35a37839a7-config-data\") pod \"cloudkitty-storageinit-x77xk\" (UID: \"84e41394-829e-4305-a4cf-0e35a37839a7\") " pod="openstack/cloudkitty-storageinit-x77xk" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.511245 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/84e41394-829e-4305-a4cf-0e35a37839a7-certs\") pod \"cloudkitty-storageinit-x77xk\" (UID: \"84e41394-829e-4305-a4cf-0e35a37839a7\") " pod="openstack/cloudkitty-storageinit-x77xk" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.526702 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84e41394-829e-4305-a4cf-0e35a37839a7-combined-ca-bundle\") pod \"cloudkitty-storageinit-x77xk\" (UID: 
\"84e41394-829e-4305-a4cf-0e35a37839a7\") " pod="openstack/cloudkitty-storageinit-x77xk" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.532990 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8xgh\" (UniqueName: \"kubernetes.io/projected/84e41394-829e-4305-a4cf-0e35a37839a7-kube-api-access-f8xgh\") pod \"cloudkitty-storageinit-x77xk\" (UID: \"84e41394-829e-4305-a4cf-0e35a37839a7\") " pod="openstack/cloudkitty-storageinit-x77xk" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.535258 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84e41394-829e-4305-a4cf-0e35a37839a7-scripts\") pod \"cloudkitty-storageinit-x77xk\" (UID: \"84e41394-829e-4305-a4cf-0e35a37839a7\") " pod="openstack/cloudkitty-storageinit-x77xk" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.558891 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-s4tth"] Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.587936 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-x77xk" Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.646829 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 17:00:55 crc kubenswrapper[4918]: W0319 17:00:55.660527 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23bad2c2_b869_49db_9a5b_9dc2ad887973.slice/crio-0ae331a3b7e514c3aee96909b2803ecc7e3bcd75ad542b444b37c5559f0a4497 WatchSource:0}: Error finding container 0ae331a3b7e514c3aee96909b2803ecc7e3bcd75ad542b444b37c5559f0a4497: Status 404 returned error can't find the container with id 0ae331a3b7e514c3aee96909b2803ecc7e3bcd75ad542b444b37c5559f0a4497 Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.699128 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-cntn9"] Mar 19 17:00:55 crc kubenswrapper[4918]: I0319 17:00:55.980398 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 19 17:00:55 crc kubenswrapper[4918]: W0319 17:00:55.983045 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod178a9ae2_1774_4025_8951_93167e95f5d7.slice/crio-8d4cd2bf2da9f24b9204065572dc70094665bd2d473d0926a3f0a3754b766cc6 WatchSource:0}: Error finding container 8d4cd2bf2da9f24b9204065572dc70094665bd2d473d0926a3f0a3754b766cc6: Status 404 returned error can't find the container with id 8d4cd2bf2da9f24b9204065572dc70094665bd2d473d0926a3f0a3754b766cc6 Mar 19 17:00:56 crc kubenswrapper[4918]: I0319 17:00:56.199460 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5c845594d6-ndb6n" event={"ID":"0e6ac727-0f36-47d5-a77d-e21f590089e1","Type":"ContainerStarted","Data":"718d6c8bfc598e4fc39210c8f6bd110d0c5ca2ca218db0d02243b89e1f3a7913"} Mar 19 17:00:56 crc kubenswrapper[4918]: I0319 
17:00:56.201412 4918 generic.go:334] "Generic (PLEG): container finished" podID="d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80" containerID="99c2d1f491197f61b645b067fa426f7a683469dde234c51f1c053c3c18c32443" exitCode=0 Mar 19 17:00:56 crc kubenswrapper[4918]: I0319 17:00:56.201477 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-s4tth" event={"ID":"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80","Type":"ContainerDied","Data":"99c2d1f491197f61b645b067fa426f7a683469dde234c51f1c053c3c18c32443"} Mar 19 17:00:56 crc kubenswrapper[4918]: I0319 17:00:56.201507 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-s4tth" event={"ID":"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80","Type":"ContainerStarted","Data":"729082bb0b25c93a46a5e28ed439c77c0659d9057648851703a559c928d44f90"} Mar 19 17:00:56 crc kubenswrapper[4918]: I0319 17:00:56.203460 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"178a9ae2-1774-4025-8951-93167e95f5d7","Type":"ContainerStarted","Data":"8d4cd2bf2da9f24b9204065572dc70094665bd2d473d0926a3f0a3754b766cc6"} Mar 19 17:00:56 crc kubenswrapper[4918]: I0319 17:00:56.204903 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"23bad2c2-b869-49db-9a5b-9dc2ad887973","Type":"ContainerStarted","Data":"0ae331a3b7e514c3aee96909b2803ecc7e3bcd75ad542b444b37c5559f0a4497"} Mar 19 17:00:56 crc kubenswrapper[4918]: I0319 17:00:56.205181 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-x77xk"] Mar 19 17:00:56 crc kubenswrapper[4918]: I0319 17:00:56.206793 4918 generic.go:334] "Generic (PLEG): container finished" podID="71b991d8-dce5-4482-90c2-b904a5f6eb0e" containerID="a8dc59d0b0f4ac0b4166c0beef03bebc11a9a94e95da637e2eac609f2af93328" exitCode=0 Mar 19 17:00:56 crc kubenswrapper[4918]: I0319 17:00:56.206974 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" event={"ID":"71b991d8-dce5-4482-90c2-b904a5f6eb0e","Type":"ContainerDied","Data":"a8dc59d0b0f4ac0b4166c0beef03bebc11a9a94e95da637e2eac609f2af93328"} Mar 19 17:00:56 crc kubenswrapper[4918]: I0319 17:00:56.207015 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" event={"ID":"71b991d8-dce5-4482-90c2-b904a5f6eb0e","Type":"ContainerStarted","Data":"5ee5d73dcf5747ea0d10a83dd4185eda321f35d42b5feecffa6a78c342114ddc"} Mar 19 17:00:56 crc kubenswrapper[4918]: I0319 17:00:56.209421 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-846b889554-z7r6b" event={"ID":"bf259f2d-395e-4d36-bdc0-2c01310e24e8","Type":"ContainerStarted","Data":"009d1b9ceb70a06a2d6f926079d629eff34e33b0ac25dc5002da39fc9caaca7a"} Mar 19 17:00:56 crc kubenswrapper[4918]: I0319 17:00:56.209473 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-846b889554-z7r6b" event={"ID":"bf259f2d-395e-4d36-bdc0-2c01310e24e8","Type":"ContainerStarted","Data":"53a649fe1b19158a0ab503c85e356bb06b06b415d2ad5505b01bbd49b1b164d1"} Mar 19 17:00:56 crc kubenswrapper[4918]: I0319 17:00:56.209642 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-846b889554-z7r6b" Mar 19 17:00:56 crc kubenswrapper[4918]: I0319 17:00:56.209661 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-846b889554-z7r6b" Mar 19 17:00:56 crc kubenswrapper[4918]: W0319 17:00:56.239281 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84e41394_829e_4305_a4cf_0e35a37839a7.slice/crio-c2749d59e38821272f4da2004ce9bd89a3c912dfa7183db570d12a63633f896a WatchSource:0}: Error finding container c2749d59e38821272f4da2004ce9bd89a3c912dfa7183db570d12a63633f896a: Status 404 returned error can't find the container with id 
c2749d59e38821272f4da2004ce9bd89a3c912dfa7183db570d12a63633f896a Mar 19 17:00:56 crc kubenswrapper[4918]: I0319 17:00:56.372432 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-846b889554-z7r6b" podStartSLOduration=2.372414974 podStartE2EDuration="2.372414974s" podCreationTimestamp="2026-03-19 17:00:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:00:56.28935822 +0000 UTC m=+1268.411557488" watchObservedRunningTime="2026-03-19 17:00:56.372414974 +0000 UTC m=+1268.494614222" Mar 19 17:00:56 crc kubenswrapper[4918]: I0319 17:00:56.543029 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5bb7fd774d-vnxdq" Mar 19 17:00:56 crc kubenswrapper[4918]: I0319 17:00:56.867660 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-s4tth" Mar 19 17:00:56 crc kubenswrapper[4918]: I0319 17:00:56.956751 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-bd699b55c-ncb4d"] Mar 19 17:00:56 crc kubenswrapper[4918]: I0319 17:00:56.956980 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-bd699b55c-ncb4d" podUID="9ab5151a-6a64-47a2-8e0b-47455e4f66b0" containerName="neutron-api" containerID="cri-o://e0e61af886f8be875324ab3e9decf415ea43b9983c91df92048cb302351053e0" gracePeriod=30 Mar 19 17:00:56 crc kubenswrapper[4918]: I0319 17:00:56.957607 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-bd699b55c-ncb4d" podUID="9ab5151a-6a64-47a2-8e0b-47455e4f66b0" containerName="neutron-httpd" containerID="cri-o://a3edc838142999c283b25cfa30ca8d6c385132c53b6daac34e06c8a93bb03836" gracePeriod=30 Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:56.997908 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-api-0"] Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.001870 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-ovsdbserver-sb\") pod \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\" (UID: \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\") " Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.001921 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-ovsdbserver-nb\") pod \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\" (UID: \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\") " Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.001964 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-dns-swift-storage-0\") pod \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\" (UID: \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\") " Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.002045 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76f6v\" (UniqueName: \"kubernetes.io/projected/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-kube-api-access-76f6v\") pod \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\" (UID: \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\") " Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.002118 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-config\") pod \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\" (UID: \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\") " Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.002138 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-dns-svc\") pod \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\" (UID: \"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80\") " Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.019830 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-kube-api-access-76f6v" (OuterVolumeSpecName: "kube-api-access-76f6v") pod "d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80" (UID: "d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80"). InnerVolumeSpecName "kube-api-access-76f6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.031712 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80" (UID: "d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.042240 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-54486b455c-k7jwz"] Mar 19 17:00:57 crc kubenswrapper[4918]: E0319 17:00:57.042693 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80" containerName="init" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.042711 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80" containerName="init" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.042937 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80" containerName="init" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.046687 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-54486b455c-k7jwz" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.047463 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80" (UID: "d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.048949 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-54486b455c-k7jwz"] Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.099878 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80" (UID: "d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.100214 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-config" (OuterVolumeSpecName: "config") pod "d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80" (UID: "d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.105198 4918 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.105227 4918 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.105237 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76f6v\" (UniqueName: \"kubernetes.io/projected/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-kube-api-access-76f6v\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.105247 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-config\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.105255 4918 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.139170 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80" (UID: "d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.206448 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2211c01-104a-4847-8c59-bc11ff34169f-combined-ca-bundle\") pod \"neutron-54486b455c-k7jwz\" (UID: \"f2211c01-104a-4847-8c59-bc11ff34169f\") " pod="openstack/neutron-54486b455c-k7jwz" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.206496 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f2211c01-104a-4847-8c59-bc11ff34169f-config\") pod \"neutron-54486b455c-k7jwz\" (UID: \"f2211c01-104a-4847-8c59-bc11ff34169f\") " pod="openstack/neutron-54486b455c-k7jwz" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.206519 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f2211c01-104a-4847-8c59-bc11ff34169f-httpd-config\") pod \"neutron-54486b455c-k7jwz\" (UID: \"f2211c01-104a-4847-8c59-bc11ff34169f\") " pod="openstack/neutron-54486b455c-k7jwz" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.206571 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2211c01-104a-4847-8c59-bc11ff34169f-internal-tls-certs\") pod \"neutron-54486b455c-k7jwz\" (UID: \"f2211c01-104a-4847-8c59-bc11ff34169f\") " pod="openstack/neutron-54486b455c-k7jwz" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.206609 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwc48\" (UniqueName: \"kubernetes.io/projected/f2211c01-104a-4847-8c59-bc11ff34169f-kube-api-access-nwc48\") pod \"neutron-54486b455c-k7jwz\" (UID: 
\"f2211c01-104a-4847-8c59-bc11ff34169f\") " pod="openstack/neutron-54486b455c-k7jwz" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.206686 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2211c01-104a-4847-8c59-bc11ff34169f-ovndb-tls-certs\") pod \"neutron-54486b455c-k7jwz\" (UID: \"f2211c01-104a-4847-8c59-bc11ff34169f\") " pod="openstack/neutron-54486b455c-k7jwz" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.206722 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2211c01-104a-4847-8c59-bc11ff34169f-public-tls-certs\") pod \"neutron-54486b455c-k7jwz\" (UID: \"f2211c01-104a-4847-8c59-bc11ff34169f\") " pod="openstack/neutron-54486b455c-k7jwz" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.206790 4918 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.223204 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-846b889554-z7r6b" event={"ID":"bf259f2d-395e-4d36-bdc0-2c01310e24e8","Type":"ContainerStarted","Data":"f01037dad64377a391a2f4b86658f3f2bc0c282891b2d41251b93fab88d22206"} Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.226479 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-s4tth" event={"ID":"d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80","Type":"ContainerDied","Data":"729082bb0b25c93a46a5e28ed439c77c0659d9057648851703a559c928d44f90"} Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.226551 4918 scope.go:117] "RemoveContainer" containerID="99c2d1f491197f61b645b067fa426f7a683469dde234c51f1c053c3c18c32443" Mar 19 17:00:57 crc 
kubenswrapper[4918]: I0319 17:00:57.226720 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-s4tth" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.237729 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-x77xk" event={"ID":"84e41394-829e-4305-a4cf-0e35a37839a7","Type":"ContainerStarted","Data":"34b99b4915e9314909894d08437b2da10e33044f4dae5091a867bf7c7638986a"} Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.237777 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-x77xk" event={"ID":"84e41394-829e-4305-a4cf-0e35a37839a7","Type":"ContainerStarted","Data":"c2749d59e38821272f4da2004ce9bd89a3c912dfa7183db570d12a63633f896a"} Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.246374 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"178a9ae2-1774-4025-8951-93167e95f5d7","Type":"ContainerStarted","Data":"abfc046338e9ac7084231b22015e5bd9c6862373c4e995d47fd5e77fb3d0cf70"} Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.256635 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-x77xk" podStartSLOduration=2.256614505 podStartE2EDuration="2.256614505s" podCreationTimestamp="2026-03-19 17:00:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:00:57.253497149 +0000 UTC m=+1269.375696397" watchObservedRunningTime="2026-03-19 17:00:57.256614505 +0000 UTC m=+1269.378813743" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.262411 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" event={"ID":"71b991d8-dce5-4482-90c2-b904a5f6eb0e","Type":"ContainerStarted","Data":"29246a219f079b04c662a83e96f11fef94512295380083535a11df21ee713c1f"} Mar 19 17:00:57 
crc kubenswrapper[4918]: I0319 17:00:57.263213 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.312930 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwc48\" (UniqueName: \"kubernetes.io/projected/f2211c01-104a-4847-8c59-bc11ff34169f-kube-api-access-nwc48\") pod \"neutron-54486b455c-k7jwz\" (UID: \"f2211c01-104a-4847-8c59-bc11ff34169f\") " pod="openstack/neutron-54486b455c-k7jwz" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.313068 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2211c01-104a-4847-8c59-bc11ff34169f-ovndb-tls-certs\") pod \"neutron-54486b455c-k7jwz\" (UID: \"f2211c01-104a-4847-8c59-bc11ff34169f\") " pod="openstack/neutron-54486b455c-k7jwz" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.313117 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2211c01-104a-4847-8c59-bc11ff34169f-public-tls-certs\") pod \"neutron-54486b455c-k7jwz\" (UID: \"f2211c01-104a-4847-8c59-bc11ff34169f\") " pod="openstack/neutron-54486b455c-k7jwz" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.313197 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2211c01-104a-4847-8c59-bc11ff34169f-combined-ca-bundle\") pod \"neutron-54486b455c-k7jwz\" (UID: \"f2211c01-104a-4847-8c59-bc11ff34169f\") " pod="openstack/neutron-54486b455c-k7jwz" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.313229 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f2211c01-104a-4847-8c59-bc11ff34169f-config\") pod \"neutron-54486b455c-k7jwz\" (UID: 
\"f2211c01-104a-4847-8c59-bc11ff34169f\") " pod="openstack/neutron-54486b455c-k7jwz" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.313250 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f2211c01-104a-4847-8c59-bc11ff34169f-httpd-config\") pod \"neutron-54486b455c-k7jwz\" (UID: \"f2211c01-104a-4847-8c59-bc11ff34169f\") " pod="openstack/neutron-54486b455c-k7jwz" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.313309 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2211c01-104a-4847-8c59-bc11ff34169f-internal-tls-certs\") pod \"neutron-54486b455c-k7jwz\" (UID: \"f2211c01-104a-4847-8c59-bc11ff34169f\") " pod="openstack/neutron-54486b455c-k7jwz" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.317026 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2211c01-104a-4847-8c59-bc11ff34169f-internal-tls-certs\") pod \"neutron-54486b455c-k7jwz\" (UID: \"f2211c01-104a-4847-8c59-bc11ff34169f\") " pod="openstack/neutron-54486b455c-k7jwz" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.321022 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2211c01-104a-4847-8c59-bc11ff34169f-combined-ca-bundle\") pod \"neutron-54486b455c-k7jwz\" (UID: \"f2211c01-104a-4847-8c59-bc11ff34169f\") " pod="openstack/neutron-54486b455c-k7jwz" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.324376 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f2211c01-104a-4847-8c59-bc11ff34169f-httpd-config\") pod \"neutron-54486b455c-k7jwz\" (UID: \"f2211c01-104a-4847-8c59-bc11ff34169f\") " pod="openstack/neutron-54486b455c-k7jwz" Mar 19 17:00:57 crc 
kubenswrapper[4918]: I0319 17:00:57.328285 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f2211c01-104a-4847-8c59-bc11ff34169f-config\") pod \"neutron-54486b455c-k7jwz\" (UID: \"f2211c01-104a-4847-8c59-bc11ff34169f\") " pod="openstack/neutron-54486b455c-k7jwz" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.328413 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2211c01-104a-4847-8c59-bc11ff34169f-public-tls-certs\") pod \"neutron-54486b455c-k7jwz\" (UID: \"f2211c01-104a-4847-8c59-bc11ff34169f\") " pod="openstack/neutron-54486b455c-k7jwz" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.334083 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2211c01-104a-4847-8c59-bc11ff34169f-ovndb-tls-certs\") pod \"neutron-54486b455c-k7jwz\" (UID: \"f2211c01-104a-4847-8c59-bc11ff34169f\") " pod="openstack/neutron-54486b455c-k7jwz" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.350067 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-bd699b55c-ncb4d" podUID="9ab5151a-6a64-47a2-8e0b-47455e4f66b0" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.177:9696/\": read tcp 10.217.0.2:42608->10.217.0.177:9696: read: connection reset by peer" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.364751 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-s4tth"] Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.378432 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwc48\" (UniqueName: \"kubernetes.io/projected/f2211c01-104a-4847-8c59-bc11ff34169f-kube-api-access-nwc48\") pod \"neutron-54486b455c-k7jwz\" (UID: \"f2211c01-104a-4847-8c59-bc11ff34169f\") " 
pod="openstack/neutron-54486b455c-k7jwz" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.403580 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-s4tth"] Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.404381 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-54486b455c-k7jwz" Mar 19 17:00:57 crc kubenswrapper[4918]: I0319 17:00:57.423253 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" podStartSLOduration=3.423229699 podStartE2EDuration="3.423229699s" podCreationTimestamp="2026-03-19 17:00:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:00:57.337227473 +0000 UTC m=+1269.459426731" watchObservedRunningTime="2026-03-19 17:00:57.423229699 +0000 UTC m=+1269.545428947" Mar 19 17:00:58 crc kubenswrapper[4918]: I0319 17:00:58.287101 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"178a9ae2-1774-4025-8951-93167e95f5d7","Type":"ContainerStarted","Data":"d3cb418db6fdfe2f8e5fcd5661364f8f04b064e798a1a7006adf3e06922b7e9e"} Mar 19 17:00:58 crc kubenswrapper[4918]: I0319 17:00:58.287472 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="178a9ae2-1774-4025-8951-93167e95f5d7" containerName="cinder-api-log" containerID="cri-o://abfc046338e9ac7084231b22015e5bd9c6862373c4e995d47fd5e77fb3d0cf70" gracePeriod=30 Mar 19 17:00:58 crc kubenswrapper[4918]: I0319 17:00:58.287581 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 19 17:00:58 crc kubenswrapper[4918]: I0319 17:00:58.287876 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="178a9ae2-1774-4025-8951-93167e95f5d7" 
containerName="cinder-api" containerID="cri-o://d3cb418db6fdfe2f8e5fcd5661364f8f04b064e798a1a7006adf3e06922b7e9e" gracePeriod=30 Mar 19 17:00:58 crc kubenswrapper[4918]: I0319 17:00:58.333603 4918 generic.go:334] "Generic (PLEG): container finished" podID="5ce9dad6-2fa1-48f8-bd79-b114097ef3be" containerID="b9fc8758fc342d9ddd2492536ce75a17bb4e2d46aed7531a7ca328ee9aa7ab44" exitCode=0 Mar 19 17:00:58 crc kubenswrapper[4918]: I0319 17:00:58.333670 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ce9dad6-2fa1-48f8-bd79-b114097ef3be","Type":"ContainerDied","Data":"b9fc8758fc342d9ddd2492536ce75a17bb4e2d46aed7531a7ca328ee9aa7ab44"} Mar 19 17:00:58 crc kubenswrapper[4918]: I0319 17:00:58.334977 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.334950504 podStartE2EDuration="4.334950504s" podCreationTimestamp="2026-03-19 17:00:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:00:58.32716131 +0000 UTC m=+1270.449360558" watchObservedRunningTime="2026-03-19 17:00:58.334950504 +0000 UTC m=+1270.457149742" Mar 19 17:00:58 crc kubenswrapper[4918]: I0319 17:00:58.356802 4918 generic.go:334] "Generic (PLEG): container finished" podID="9ab5151a-6a64-47a2-8e0b-47455e4f66b0" containerID="a3edc838142999c283b25cfa30ca8d6c385132c53b6daac34e06c8a93bb03836" exitCode=0 Mar 19 17:00:58 crc kubenswrapper[4918]: I0319 17:00:58.356936 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bd699b55c-ncb4d" event={"ID":"9ab5151a-6a64-47a2-8e0b-47455e4f66b0","Type":"ContainerDied","Data":"a3edc838142999c283b25cfa30ca8d6c385132c53b6daac34e06c8a93bb03836"} Mar 19 17:00:58 crc kubenswrapper[4918]: I0319 17:00:58.632451 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80" 
path="/var/lib/kubelet/pods/d3f4b6a8-0b9c-42a6-a4eb-77ee64a59a80/volumes" Mar 19 17:00:58 crc kubenswrapper[4918]: I0319 17:00:58.811378 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-bd699b55c-ncb4d" podUID="9ab5151a-6a64-47a2-8e0b-47455e4f66b0" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.177:9696/\": dial tcp 10.217.0.177:9696: connect: connection refused" Mar 19 17:00:58 crc kubenswrapper[4918]: I0319 17:00:58.855318 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:00:58 crc kubenswrapper[4918]: I0319 17:00:58.956331 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-combined-ca-bundle\") pod \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\" (UID: \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\") " Mar 19 17:00:58 crc kubenswrapper[4918]: I0319 17:00:58.956409 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-log-httpd\") pod \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\" (UID: \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\") " Mar 19 17:00:58 crc kubenswrapper[4918]: I0319 17:00:58.956737 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-scripts\") pod \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\" (UID: \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\") " Mar 19 17:00:58 crc kubenswrapper[4918]: I0319 17:00:58.956857 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spf6p\" (UniqueName: \"kubernetes.io/projected/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-kube-api-access-spf6p\") pod \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\" (UID: 
\"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\") " Mar 19 17:00:58 crc kubenswrapper[4918]: I0319 17:00:58.956940 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-sg-core-conf-yaml\") pod \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\" (UID: \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\") " Mar 19 17:00:58 crc kubenswrapper[4918]: I0319 17:00:58.957001 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-config-data\") pod \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\" (UID: \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\") " Mar 19 17:00:58 crc kubenswrapper[4918]: I0319 17:00:58.957024 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-run-httpd\") pod \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\" (UID: \"5ce9dad6-2fa1-48f8-bd79-b114097ef3be\") " Mar 19 17:00:58 crc kubenswrapper[4918]: I0319 17:00:58.957959 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5ce9dad6-2fa1-48f8-bd79-b114097ef3be" (UID: "5ce9dad6-2fa1-48f8-bd79-b114097ef3be"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:00:58 crc kubenswrapper[4918]: I0319 17:00:58.959381 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5ce9dad6-2fa1-48f8-bd79-b114097ef3be" (UID: "5ce9dad6-2fa1-48f8-bd79-b114097ef3be"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:00:58 crc kubenswrapper[4918]: I0319 17:00:58.961893 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-scripts" (OuterVolumeSpecName: "scripts") pod "5ce9dad6-2fa1-48f8-bd79-b114097ef3be" (UID: "5ce9dad6-2fa1-48f8-bd79-b114097ef3be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:58 crc kubenswrapper[4918]: I0319 17:00:58.962376 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-kube-api-access-spf6p" (OuterVolumeSpecName: "kube-api-access-spf6p") pod "5ce9dad6-2fa1-48f8-bd79-b114097ef3be" (UID: "5ce9dad6-2fa1-48f8-bd79-b114097ef3be"). InnerVolumeSpecName "kube-api-access-spf6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:00:58 crc kubenswrapper[4918]: I0319 17:00:58.991407 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-54486b455c-k7jwz"] Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.008651 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5ce9dad6-2fa1-48f8-bd79-b114097ef3be" (UID: "5ce9dad6-2fa1-48f8-bd79-b114097ef3be"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.060402 4918 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.060447 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.060460 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spf6p\" (UniqueName: \"kubernetes.io/projected/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-kube-api-access-spf6p\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.060476 4918 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.060487 4918 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.070354 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ce9dad6-2fa1-48f8-bd79-b114097ef3be" (UID: "5ce9dad6-2fa1-48f8-bd79-b114097ef3be"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.099937 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-config-data" (OuterVolumeSpecName: "config-data") pod "5ce9dad6-2fa1-48f8-bd79-b114097ef3be" (UID: "5ce9dad6-2fa1-48f8-bd79-b114097ef3be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.161712 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.162022 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce9dad6-2fa1-48f8-bd79-b114097ef3be-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.387561 4918 generic.go:334] "Generic (PLEG): container finished" podID="178a9ae2-1774-4025-8951-93167e95f5d7" containerID="abfc046338e9ac7084231b22015e5bd9c6862373c4e995d47fd5e77fb3d0cf70" exitCode=143 Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.387611 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"178a9ae2-1774-4025-8951-93167e95f5d7","Type":"ContainerDied","Data":"abfc046338e9ac7084231b22015e5bd9c6862373c4e995d47fd5e77fb3d0cf70"} Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.389411 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54486b455c-k7jwz" event={"ID":"f2211c01-104a-4847-8c59-bc11ff34169f","Type":"ContainerStarted","Data":"51c5a2036b52f226eb189a1d99b11076f5168f860be1e8f6115ed090f9bb3d38"} Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.389434 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-54486b455c-k7jwz" event={"ID":"f2211c01-104a-4847-8c59-bc11ff34169f","Type":"ContainerStarted","Data":"d0453ece802887330249944fc6a8d5dbe82bf2ef8a8fc050b34676bd88339e49"} Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.398589 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ce9dad6-2fa1-48f8-bd79-b114097ef3be","Type":"ContainerDied","Data":"e721531ee49299413f095db8fee5d8290b032ebb86a864dda12f6c349aeeb03e"} Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.398652 4918 scope.go:117] "RemoveContainer" containerID="a01d3d35daa0cb9badd3c202d3b6dc2d71a3eb216dbf83b77e4282abdb743fb1" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.398812 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.419058 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5c845594d6-ndb6n" event={"ID":"0e6ac727-0f36-47d5-a77d-e21f590089e1","Type":"ContainerStarted","Data":"8c27781bffb3e420050d459671ec756792fda065c783ba1128771835271a0df9"} Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.419123 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5c845594d6-ndb6n" event={"ID":"0e6ac727-0f36-47d5-a77d-e21f590089e1","Type":"ContainerStarted","Data":"a4fc2a68066749c190256fb2c3656f23d2dba072007eabfeac97c5e68dc031c3"} Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.474101 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.510356 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.535172 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:00:59 crc kubenswrapper[4918]: E0319 
17:00:59.535579 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ce9dad6-2fa1-48f8-bd79-b114097ef3be" containerName="ceilometer-notification-agent" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.535596 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ce9dad6-2fa1-48f8-bd79-b114097ef3be" containerName="ceilometer-notification-agent" Mar 19 17:00:59 crc kubenswrapper[4918]: E0319 17:00:59.535622 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ce9dad6-2fa1-48f8-bd79-b114097ef3be" containerName="sg-core" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.535628 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ce9dad6-2fa1-48f8-bd79-b114097ef3be" containerName="sg-core" Mar 19 17:00:59 crc kubenswrapper[4918]: E0319 17:00:59.535650 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ce9dad6-2fa1-48f8-bd79-b114097ef3be" containerName="proxy-httpd" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.535656 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ce9dad6-2fa1-48f8-bd79-b114097ef3be" containerName="proxy-httpd" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.535829 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ce9dad6-2fa1-48f8-bd79-b114097ef3be" containerName="ceilometer-notification-agent" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.535839 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ce9dad6-2fa1-48f8-bd79-b114097ef3be" containerName="proxy-httpd" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.535847 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ce9dad6-2fa1-48f8-bd79-b114097ef3be" containerName="sg-core" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.537760 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.542552 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5c845594d6-ndb6n" podStartSLOduration=3.652583278 podStartE2EDuration="6.542517231s" podCreationTimestamp="2026-03-19 17:00:53 +0000 UTC" firstStartedPulling="2026-03-19 17:00:55.342651347 +0000 UTC m=+1267.464850595" lastFinishedPulling="2026-03-19 17:00:58.2325853 +0000 UTC m=+1270.354784548" observedRunningTime="2026-03-19 17:00:59.484569034 +0000 UTC m=+1271.606768282" watchObservedRunningTime="2026-03-19 17:00:59.542517231 +0000 UTC m=+1271.664716479" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.545752 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.545960 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.556757 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.672089 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78d493df-532a-4203-9264-86e66bf964f0-scripts\") pod \"ceilometer-0\" (UID: \"78d493df-532a-4203-9264-86e66bf964f0\") " pod="openstack/ceilometer-0" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.672598 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78d493df-532a-4203-9264-86e66bf964f0-log-httpd\") pod \"ceilometer-0\" (UID: \"78d493df-532a-4203-9264-86e66bf964f0\") " pod="openstack/ceilometer-0" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.672652 4918 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78d493df-532a-4203-9264-86e66bf964f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"78d493df-532a-4203-9264-86e66bf964f0\") " pod="openstack/ceilometer-0" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.672780 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zqjc\" (UniqueName: \"kubernetes.io/projected/78d493df-532a-4203-9264-86e66bf964f0-kube-api-access-8zqjc\") pod \"ceilometer-0\" (UID: \"78d493df-532a-4203-9264-86e66bf964f0\") " pod="openstack/ceilometer-0" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.672884 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78d493df-532a-4203-9264-86e66bf964f0-config-data\") pod \"ceilometer-0\" (UID: \"78d493df-532a-4203-9264-86e66bf964f0\") " pod="openstack/ceilometer-0" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.672951 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78d493df-532a-4203-9264-86e66bf964f0-run-httpd\") pod \"ceilometer-0\" (UID: \"78d493df-532a-4203-9264-86e66bf964f0\") " pod="openstack/ceilometer-0" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.673265 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d493df-532a-4203-9264-86e66bf964f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"78d493df-532a-4203-9264-86e66bf964f0\") " pod="openstack/ceilometer-0" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.775090 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/78d493df-532a-4203-9264-86e66bf964f0-run-httpd\") pod \"ceilometer-0\" (UID: \"78d493df-532a-4203-9264-86e66bf964f0\") " pod="openstack/ceilometer-0" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.775235 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d493df-532a-4203-9264-86e66bf964f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"78d493df-532a-4203-9264-86e66bf964f0\") " pod="openstack/ceilometer-0" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.775285 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78d493df-532a-4203-9264-86e66bf964f0-scripts\") pod \"ceilometer-0\" (UID: \"78d493df-532a-4203-9264-86e66bf964f0\") " pod="openstack/ceilometer-0" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.775344 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78d493df-532a-4203-9264-86e66bf964f0-log-httpd\") pod \"ceilometer-0\" (UID: \"78d493df-532a-4203-9264-86e66bf964f0\") " pod="openstack/ceilometer-0" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.775399 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78d493df-532a-4203-9264-86e66bf964f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"78d493df-532a-4203-9264-86e66bf964f0\") " pod="openstack/ceilometer-0" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.775432 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zqjc\" (UniqueName: \"kubernetes.io/projected/78d493df-532a-4203-9264-86e66bf964f0-kube-api-access-8zqjc\") pod \"ceilometer-0\" (UID: \"78d493df-532a-4203-9264-86e66bf964f0\") " pod="openstack/ceilometer-0" Mar 19 17:00:59 crc kubenswrapper[4918]: 
I0319 17:00:59.775466 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78d493df-532a-4203-9264-86e66bf964f0-config-data\") pod \"ceilometer-0\" (UID: \"78d493df-532a-4203-9264-86e66bf964f0\") " pod="openstack/ceilometer-0" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.775553 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78d493df-532a-4203-9264-86e66bf964f0-run-httpd\") pod \"ceilometer-0\" (UID: \"78d493df-532a-4203-9264-86e66bf964f0\") " pod="openstack/ceilometer-0" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.775841 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78d493df-532a-4203-9264-86e66bf964f0-log-httpd\") pod \"ceilometer-0\" (UID: \"78d493df-532a-4203-9264-86e66bf964f0\") " pod="openstack/ceilometer-0" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.780206 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d493df-532a-4203-9264-86e66bf964f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"78d493df-532a-4203-9264-86e66bf964f0\") " pod="openstack/ceilometer-0" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.780555 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78d493df-532a-4203-9264-86e66bf964f0-config-data\") pod \"ceilometer-0\" (UID: \"78d493df-532a-4203-9264-86e66bf964f0\") " pod="openstack/ceilometer-0" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.781448 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78d493df-532a-4203-9264-86e66bf964f0-scripts\") pod \"ceilometer-0\" (UID: \"78d493df-532a-4203-9264-86e66bf964f0\") " 
pod="openstack/ceilometer-0" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.783835 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78d493df-532a-4203-9264-86e66bf964f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"78d493df-532a-4203-9264-86e66bf964f0\") " pod="openstack/ceilometer-0" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.793034 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zqjc\" (UniqueName: \"kubernetes.io/projected/78d493df-532a-4203-9264-86e66bf964f0-kube-api-access-8zqjc\") pod \"ceilometer-0\" (UID: \"78d493df-532a-4203-9264-86e66bf964f0\") " pod="openstack/ceilometer-0" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.865042 4918 scope.go:117] "RemoveContainer" containerID="997a5f2cff248f4eab80aca914467bbd3c1604c4bce69e8ec57cffbc9a266a79" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.944708 4918 scope.go:117] "RemoveContainer" containerID="b9fc8758fc342d9ddd2492536ce75a17bb4e2d46aed7531a7ca328ee9aa7ab44" Mar 19 17:00:59 crc kubenswrapper[4918]: I0319 17:00:59.983259 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:01:00 crc kubenswrapper[4918]: I0319 17:01:00.134821 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29565661-7qtw4"] Mar 19 17:01:00 crc kubenswrapper[4918]: I0319 17:01:00.136137 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29565661-7qtw4" Mar 19 17:01:00 crc kubenswrapper[4918]: I0319 17:01:00.157236 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29565661-7qtw4"] Mar 19 17:01:00 crc kubenswrapper[4918]: I0319 17:01:00.285903 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6d4f01d1-2728-4858-a39d-6b44e675aca5-fernet-keys\") pod \"keystone-cron-29565661-7qtw4\" (UID: \"6d4f01d1-2728-4858-a39d-6b44e675aca5\") " pod="openstack/keystone-cron-29565661-7qtw4" Mar 19 17:01:00 crc kubenswrapper[4918]: I0319 17:01:00.286045 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d4f01d1-2728-4858-a39d-6b44e675aca5-config-data\") pod \"keystone-cron-29565661-7qtw4\" (UID: \"6d4f01d1-2728-4858-a39d-6b44e675aca5\") " pod="openstack/keystone-cron-29565661-7qtw4" Mar 19 17:01:00 crc kubenswrapper[4918]: I0319 17:01:00.286111 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7ptf\" (UniqueName: \"kubernetes.io/projected/6d4f01d1-2728-4858-a39d-6b44e675aca5-kube-api-access-w7ptf\") pod \"keystone-cron-29565661-7qtw4\" (UID: \"6d4f01d1-2728-4858-a39d-6b44e675aca5\") " pod="openstack/keystone-cron-29565661-7qtw4" Mar 19 17:01:00 crc kubenswrapper[4918]: I0319 17:01:00.286165 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d4f01d1-2728-4858-a39d-6b44e675aca5-combined-ca-bundle\") pod \"keystone-cron-29565661-7qtw4\" (UID: \"6d4f01d1-2728-4858-a39d-6b44e675aca5\") " pod="openstack/keystone-cron-29565661-7qtw4" Mar 19 17:01:00 crc kubenswrapper[4918]: I0319 17:01:00.388119 4918 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6d4f01d1-2728-4858-a39d-6b44e675aca5-fernet-keys\") pod \"keystone-cron-29565661-7qtw4\" (UID: \"6d4f01d1-2728-4858-a39d-6b44e675aca5\") " pod="openstack/keystone-cron-29565661-7qtw4" Mar 19 17:01:00 crc kubenswrapper[4918]: I0319 17:01:00.388462 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d4f01d1-2728-4858-a39d-6b44e675aca5-config-data\") pod \"keystone-cron-29565661-7qtw4\" (UID: \"6d4f01d1-2728-4858-a39d-6b44e675aca5\") " pod="openstack/keystone-cron-29565661-7qtw4" Mar 19 17:01:00 crc kubenswrapper[4918]: I0319 17:01:00.388522 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7ptf\" (UniqueName: \"kubernetes.io/projected/6d4f01d1-2728-4858-a39d-6b44e675aca5-kube-api-access-w7ptf\") pod \"keystone-cron-29565661-7qtw4\" (UID: \"6d4f01d1-2728-4858-a39d-6b44e675aca5\") " pod="openstack/keystone-cron-29565661-7qtw4" Mar 19 17:01:00 crc kubenswrapper[4918]: I0319 17:01:00.388593 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d4f01d1-2728-4858-a39d-6b44e675aca5-combined-ca-bundle\") pod \"keystone-cron-29565661-7qtw4\" (UID: \"6d4f01d1-2728-4858-a39d-6b44e675aca5\") " pod="openstack/keystone-cron-29565661-7qtw4" Mar 19 17:01:00 crc kubenswrapper[4918]: I0319 17:01:00.391829 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d4f01d1-2728-4858-a39d-6b44e675aca5-config-data\") pod \"keystone-cron-29565661-7qtw4\" (UID: \"6d4f01d1-2728-4858-a39d-6b44e675aca5\") " pod="openstack/keystone-cron-29565661-7qtw4" Mar 19 17:01:00 crc kubenswrapper[4918]: I0319 17:01:00.392705 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6d4f01d1-2728-4858-a39d-6b44e675aca5-combined-ca-bundle\") pod \"keystone-cron-29565661-7qtw4\" (UID: \"6d4f01d1-2728-4858-a39d-6b44e675aca5\") " pod="openstack/keystone-cron-29565661-7qtw4" Mar 19 17:01:00 crc kubenswrapper[4918]: I0319 17:01:00.398334 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6d4f01d1-2728-4858-a39d-6b44e675aca5-fernet-keys\") pod \"keystone-cron-29565661-7qtw4\" (UID: \"6d4f01d1-2728-4858-a39d-6b44e675aca5\") " pod="openstack/keystone-cron-29565661-7qtw4" Mar 19 17:01:00 crc kubenswrapper[4918]: I0319 17:01:00.416719 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7ptf\" (UniqueName: \"kubernetes.io/projected/6d4f01d1-2728-4858-a39d-6b44e675aca5-kube-api-access-w7ptf\") pod \"keystone-cron-29565661-7qtw4\" (UID: \"6d4f01d1-2728-4858-a39d-6b44e675aca5\") " pod="openstack/keystone-cron-29565661-7qtw4" Mar 19 17:01:00 crc kubenswrapper[4918]: I0319 17:01:00.446411 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6649d8ff59-m8z7j" event={"ID":"1552f8f6-8143-42f2-882b-acead175ae14","Type":"ContainerStarted","Data":"b7e32583d246998dc24c3153b0b7e43f5fa86b02a6b5161f2e0645085724943f"} Mar 19 17:01:00 crc kubenswrapper[4918]: I0319 17:01:00.455478 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29565661-7qtw4" Mar 19 17:01:00 crc kubenswrapper[4918]: I0319 17:01:00.524507 4918 generic.go:334] "Generic (PLEG): container finished" podID="84e41394-829e-4305-a4cf-0e35a37839a7" containerID="34b99b4915e9314909894d08437b2da10e33044f4dae5091a867bf7c7638986a" exitCode=0 Mar 19 17:01:00 crc kubenswrapper[4918]: I0319 17:01:00.524657 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-x77xk" event={"ID":"84e41394-829e-4305-a4cf-0e35a37839a7","Type":"ContainerDied","Data":"34b99b4915e9314909894d08437b2da10e33044f4dae5091a867bf7c7638986a"} Mar 19 17:01:00 crc kubenswrapper[4918]: I0319 17:01:00.530199 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"23bad2c2-b869-49db-9a5b-9dc2ad887973","Type":"ContainerStarted","Data":"dc38e09dba0d954bea7077363b80b4145236cf72634a16a252e6cd11e21b8a57"} Mar 19 17:01:00 crc kubenswrapper[4918]: I0319 17:01:00.535672 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54486b455c-k7jwz" event={"ID":"f2211c01-104a-4847-8c59-bc11ff34169f","Type":"ContainerStarted","Data":"5c4fe2c7712cc76cd86426a865bc2b06252ed60b56f4320556e9883cbb3ba9b5"} Mar 19 17:01:00 crc kubenswrapper[4918]: I0319 17:01:00.536393 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-54486b455c-k7jwz" Mar 19 17:01:00 crc kubenswrapper[4918]: I0319 17:01:00.579850 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-54486b455c-k7jwz" podStartSLOduration=4.579834636 podStartE2EDuration="4.579834636s" podCreationTimestamp="2026-03-19 17:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:01:00.567303863 +0000 UTC m=+1272.689503111" watchObservedRunningTime="2026-03-19 17:01:00.579834636 +0000 UTC m=+1272.702033874" Mar 19 
17:01:00 crc kubenswrapper[4918]: I0319 17:01:00.651696 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ce9dad6-2fa1-48f8-bd79-b114097ef3be" path="/var/lib/kubelet/pods/5ce9dad6-2fa1-48f8-bd79-b114097ef3be/volumes" Mar 19 17:01:00 crc kubenswrapper[4918]: I0319 17:01:00.699663 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:01:00 crc kubenswrapper[4918]: W0319 17:01:00.779715 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78d493df_532a_4203_9264_86e66bf964f0.slice/crio-c70effcf5aac39751cda7b09a0403f7cee0a40998c7f6608e9eae8045c4df02b WatchSource:0}: Error finding container c70effcf5aac39751cda7b09a0403f7cee0a40998c7f6608e9eae8045c4df02b: Status 404 returned error can't find the container with id c70effcf5aac39751cda7b09a0403f7cee0a40998c7f6608e9eae8045c4df02b Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.085277 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29565661-7qtw4"] Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.417582 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-678fb97f86-hlhbk"] Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.419326 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-678fb97f86-hlhbk" Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.424833 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.427149 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.457180 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-678fb97f86-hlhbk"] Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.518744 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af512bdc-58dc-481d-a454-821bcb84d090-logs\") pod \"barbican-api-678fb97f86-hlhbk\" (UID: \"af512bdc-58dc-481d-a454-821bcb84d090\") " pod="openstack/barbican-api-678fb97f86-hlhbk" Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.518870 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98szr\" (UniqueName: \"kubernetes.io/projected/af512bdc-58dc-481d-a454-821bcb84d090-kube-api-access-98szr\") pod \"barbican-api-678fb97f86-hlhbk\" (UID: \"af512bdc-58dc-481d-a454-821bcb84d090\") " pod="openstack/barbican-api-678fb97f86-hlhbk" Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.518910 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af512bdc-58dc-481d-a454-821bcb84d090-config-data\") pod \"barbican-api-678fb97f86-hlhbk\" (UID: \"af512bdc-58dc-481d-a454-821bcb84d090\") " pod="openstack/barbican-api-678fb97f86-hlhbk" Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.518944 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/af512bdc-58dc-481d-a454-821bcb84d090-internal-tls-certs\") pod \"barbican-api-678fb97f86-hlhbk\" (UID: \"af512bdc-58dc-481d-a454-821bcb84d090\") " pod="openstack/barbican-api-678fb97f86-hlhbk" Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.518979 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af512bdc-58dc-481d-a454-821bcb84d090-config-data-custom\") pod \"barbican-api-678fb97f86-hlhbk\" (UID: \"af512bdc-58dc-481d-a454-821bcb84d090\") " pod="openstack/barbican-api-678fb97f86-hlhbk" Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.519206 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af512bdc-58dc-481d-a454-821bcb84d090-public-tls-certs\") pod \"barbican-api-678fb97f86-hlhbk\" (UID: \"af512bdc-58dc-481d-a454-821bcb84d090\") " pod="openstack/barbican-api-678fb97f86-hlhbk" Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.519339 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af512bdc-58dc-481d-a454-821bcb84d090-combined-ca-bundle\") pod \"barbican-api-678fb97f86-hlhbk\" (UID: \"af512bdc-58dc-481d-a454-821bcb84d090\") " pod="openstack/barbican-api-678fb97f86-hlhbk" Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.566691 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565661-7qtw4" event={"ID":"6d4f01d1-2728-4858-a39d-6b44e675aca5","Type":"ContainerStarted","Data":"9e0fb5566c2d720b298bd6fde67d815fe3114d24dd3c95cc15ec2ad377f9d048"} Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.566747 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565661-7qtw4" 
event={"ID":"6d4f01d1-2728-4858-a39d-6b44e675aca5","Type":"ContainerStarted","Data":"2c70071f43906b0592e1dc113d4df09a9c4e71ea1a89b37da1cd8156e6282b6d"} Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.576467 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"23bad2c2-b869-49db-9a5b-9dc2ad887973","Type":"ContainerStarted","Data":"371703b226ab0c29770c6348a64cbb1a277d7e09ef5ff2c9833674f0424c4554"} Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.583646 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6649d8ff59-m8z7j" event={"ID":"1552f8f6-8143-42f2-882b-acead175ae14","Type":"ContainerStarted","Data":"969383409a73565316028fc0ed29e3a185ac925172dc55c68fc78e58b629a9af"} Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.595457 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78d493df-532a-4203-9264-86e66bf964f0","Type":"ContainerStarted","Data":"c70effcf5aac39751cda7b09a0403f7cee0a40998c7f6608e9eae8045c4df02b"} Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.596920 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29565661-7qtw4" podStartSLOduration=1.596901476 podStartE2EDuration="1.596901476s" podCreationTimestamp="2026-03-19 17:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:01:01.581796743 +0000 UTC m=+1273.703996001" watchObservedRunningTime="2026-03-19 17:01:01.596901476 +0000 UTC m=+1273.719100724" Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.619929 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.592836918 podStartE2EDuration="7.619904857s" podCreationTimestamp="2026-03-19 17:00:54 +0000 UTC" firstStartedPulling="2026-03-19 17:00:55.679828992 +0000 UTC 
m=+1267.802028240" lastFinishedPulling="2026-03-19 17:00:58.706896931 +0000 UTC m=+1270.829096179" observedRunningTime="2026-03-19 17:01:01.601446081 +0000 UTC m=+1273.723645329" watchObservedRunningTime="2026-03-19 17:01:01.619904857 +0000 UTC m=+1273.742104105" Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.620729 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af512bdc-58dc-481d-a454-821bcb84d090-public-tls-certs\") pod \"barbican-api-678fb97f86-hlhbk\" (UID: \"af512bdc-58dc-481d-a454-821bcb84d090\") " pod="openstack/barbican-api-678fb97f86-hlhbk" Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.620794 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af512bdc-58dc-481d-a454-821bcb84d090-combined-ca-bundle\") pod \"barbican-api-678fb97f86-hlhbk\" (UID: \"af512bdc-58dc-481d-a454-821bcb84d090\") " pod="openstack/barbican-api-678fb97f86-hlhbk" Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.620852 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af512bdc-58dc-481d-a454-821bcb84d090-logs\") pod \"barbican-api-678fb97f86-hlhbk\" (UID: \"af512bdc-58dc-481d-a454-821bcb84d090\") " pod="openstack/barbican-api-678fb97f86-hlhbk" Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.620928 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98szr\" (UniqueName: \"kubernetes.io/projected/af512bdc-58dc-481d-a454-821bcb84d090-kube-api-access-98szr\") pod \"barbican-api-678fb97f86-hlhbk\" (UID: \"af512bdc-58dc-481d-a454-821bcb84d090\") " pod="openstack/barbican-api-678fb97f86-hlhbk" Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.620956 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/af512bdc-58dc-481d-a454-821bcb84d090-config-data\") pod \"barbican-api-678fb97f86-hlhbk\" (UID: \"af512bdc-58dc-481d-a454-821bcb84d090\") " pod="openstack/barbican-api-678fb97f86-hlhbk" Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.620975 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af512bdc-58dc-481d-a454-821bcb84d090-internal-tls-certs\") pod \"barbican-api-678fb97f86-hlhbk\" (UID: \"af512bdc-58dc-481d-a454-821bcb84d090\") " pod="openstack/barbican-api-678fb97f86-hlhbk" Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.621008 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af512bdc-58dc-481d-a454-821bcb84d090-config-data-custom\") pod \"barbican-api-678fb97f86-hlhbk\" (UID: \"af512bdc-58dc-481d-a454-821bcb84d090\") " pod="openstack/barbican-api-678fb97f86-hlhbk" Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.626892 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af512bdc-58dc-481d-a454-821bcb84d090-public-tls-certs\") pod \"barbican-api-678fb97f86-hlhbk\" (UID: \"af512bdc-58dc-481d-a454-821bcb84d090\") " pod="openstack/barbican-api-678fb97f86-hlhbk" Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.626982 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af512bdc-58dc-481d-a454-821bcb84d090-logs\") pod \"barbican-api-678fb97f86-hlhbk\" (UID: \"af512bdc-58dc-481d-a454-821bcb84d090\") " pod="openstack/barbican-api-678fb97f86-hlhbk" Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.632832 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/af512bdc-58dc-481d-a454-821bcb84d090-config-data\") pod \"barbican-api-678fb97f86-hlhbk\" (UID: \"af512bdc-58dc-481d-a454-821bcb84d090\") " pod="openstack/barbican-api-678fb97f86-hlhbk" Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.637335 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af512bdc-58dc-481d-a454-821bcb84d090-internal-tls-certs\") pod \"barbican-api-678fb97f86-hlhbk\" (UID: \"af512bdc-58dc-481d-a454-821bcb84d090\") " pod="openstack/barbican-api-678fb97f86-hlhbk" Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.639599 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af512bdc-58dc-481d-a454-821bcb84d090-config-data-custom\") pod \"barbican-api-678fb97f86-hlhbk\" (UID: \"af512bdc-58dc-481d-a454-821bcb84d090\") " pod="openstack/barbican-api-678fb97f86-hlhbk" Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.640182 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af512bdc-58dc-481d-a454-821bcb84d090-combined-ca-bundle\") pod \"barbican-api-678fb97f86-hlhbk\" (UID: \"af512bdc-58dc-481d-a454-821bcb84d090\") " pod="openstack/barbican-api-678fb97f86-hlhbk" Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.657846 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6649d8ff59-m8z7j" podStartSLOduration=3.6719676100000003 podStartE2EDuration="8.657811155s" podCreationTimestamp="2026-03-19 17:00:53 +0000 UTC" firstStartedPulling="2026-03-19 17:00:54.963919192 +0000 UTC m=+1267.086118440" lastFinishedPulling="2026-03-19 17:00:59.949762737 +0000 UTC m=+1272.071961985" observedRunningTime="2026-03-19 17:01:01.651301476 +0000 UTC m=+1273.773500724" watchObservedRunningTime="2026-03-19 17:01:01.657811155 +0000 UTC 
m=+1273.780010393" Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.660262 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98szr\" (UniqueName: \"kubernetes.io/projected/af512bdc-58dc-481d-a454-821bcb84d090-kube-api-access-98szr\") pod \"barbican-api-678fb97f86-hlhbk\" (UID: \"af512bdc-58dc-481d-a454-821bcb84d090\") " pod="openstack/barbican-api-678fb97f86-hlhbk" Mar 19 17:01:01 crc kubenswrapper[4918]: I0319 17:01:01.741684 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-678fb97f86-hlhbk" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.153249 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-x77xk" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.253967 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/84e41394-829e-4305-a4cf-0e35a37839a7-certs\") pod \"84e41394-829e-4305-a4cf-0e35a37839a7\" (UID: \"84e41394-829e-4305-a4cf-0e35a37839a7\") " Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.254058 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8xgh\" (UniqueName: \"kubernetes.io/projected/84e41394-829e-4305-a4cf-0e35a37839a7-kube-api-access-f8xgh\") pod \"84e41394-829e-4305-a4cf-0e35a37839a7\" (UID: \"84e41394-829e-4305-a4cf-0e35a37839a7\") " Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.254089 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84e41394-829e-4305-a4cf-0e35a37839a7-scripts\") pod \"84e41394-829e-4305-a4cf-0e35a37839a7\" (UID: \"84e41394-829e-4305-a4cf-0e35a37839a7\") " Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.254109 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84e41394-829e-4305-a4cf-0e35a37839a7-combined-ca-bundle\") pod \"84e41394-829e-4305-a4cf-0e35a37839a7\" (UID: \"84e41394-829e-4305-a4cf-0e35a37839a7\") " Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.254130 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84e41394-829e-4305-a4cf-0e35a37839a7-config-data\") pod \"84e41394-829e-4305-a4cf-0e35a37839a7\" (UID: \"84e41394-829e-4305-a4cf-0e35a37839a7\") " Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.292826 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84e41394-829e-4305-a4cf-0e35a37839a7-scripts" (OuterVolumeSpecName: "scripts") pod "84e41394-829e-4305-a4cf-0e35a37839a7" (UID: "84e41394-829e-4305-a4cf-0e35a37839a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.293862 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84e41394-829e-4305-a4cf-0e35a37839a7-kube-api-access-f8xgh" (OuterVolumeSpecName: "kube-api-access-f8xgh") pod "84e41394-829e-4305-a4cf-0e35a37839a7" (UID: "84e41394-829e-4305-a4cf-0e35a37839a7"). InnerVolumeSpecName "kube-api-access-f8xgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.296187 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84e41394-829e-4305-a4cf-0e35a37839a7-certs" (OuterVolumeSpecName: "certs") pod "84e41394-829e-4305-a4cf-0e35a37839a7" (UID: "84e41394-829e-4305-a4cf-0e35a37839a7"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.298461 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84e41394-829e-4305-a4cf-0e35a37839a7-config-data" (OuterVolumeSpecName: "config-data") pod "84e41394-829e-4305-a4cf-0e35a37839a7" (UID: "84e41394-829e-4305-a4cf-0e35a37839a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.306809 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84e41394-829e-4305-a4cf-0e35a37839a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84e41394-829e-4305-a4cf-0e35a37839a7" (UID: "84e41394-829e-4305-a4cf-0e35a37839a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.358094 4918 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/84e41394-829e-4305-a4cf-0e35a37839a7-certs\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.358458 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8xgh\" (UniqueName: \"kubernetes.io/projected/84e41394-829e-4305-a4cf-0e35a37839a7-kube-api-access-f8xgh\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.358477 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84e41394-829e-4305-a4cf-0e35a37839a7-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.358490 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84e41394-829e-4305-a4cf-0e35a37839a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 
17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.358505 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84e41394-829e-4305-a4cf-0e35a37839a7-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:02 crc kubenswrapper[4918]: W0319 17:01:02.547648 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf512bdc_58dc_481d_a454_821bcb84d090.slice/crio-b062cae839fbc865c7ba29f88395dfc23e79e9654120e7f2831dae39e009f84c WatchSource:0}: Error finding container b062cae839fbc865c7ba29f88395dfc23e79e9654120e7f2831dae39e009f84c: Status 404 returned error can't find the container with id b062cae839fbc865c7ba29f88395dfc23e79e9654120e7f2831dae39e009f84c Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.559859 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-678fb97f86-hlhbk"] Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.722038 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-x77xk" event={"ID":"84e41394-829e-4305-a4cf-0e35a37839a7","Type":"ContainerDied","Data":"c2749d59e38821272f4da2004ce9bd89a3c912dfa7183db570d12a63633f896a"} Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.722079 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2749d59e38821272f4da2004ce9bd89a3c912dfa7183db570d12a63633f896a" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.722169 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-x77xk" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.761757 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-678fb97f86-hlhbk" event={"ID":"af512bdc-58dc-481d-a454-821bcb84d090","Type":"ContainerStarted","Data":"b062cae839fbc865c7ba29f88395dfc23e79e9654120e7f2831dae39e009f84c"} Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.792129 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78d493df-532a-4203-9264-86e66bf964f0","Type":"ContainerStarted","Data":"d5c22a25ed50a8f736348edc2e40564c0f01803e56ab5de8d3a634c6669ce834"} Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.821776 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-cntn9"] Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.822006 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" podUID="71b991d8-dce5-4482-90c2-b904a5f6eb0e" containerName="dnsmasq-dns" containerID="cri-o://29246a219f079b04c662a83e96f11fef94512295380083535a11df21ee713c1f" gracePeriod=10 Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.822883 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.891034 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 17:01:02 crc kubenswrapper[4918]: E0319 17:01:02.891496 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84e41394-829e-4305-a4cf-0e35a37839a7" containerName="cloudkitty-storageinit" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.891511 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e41394-829e-4305-a4cf-0e35a37839a7" containerName="cloudkitty-storageinit" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 
17:01:02.891735 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="84e41394-829e-4305-a4cf-0e35a37839a7" containerName="cloudkitty-storageinit" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.892442 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.904904 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.904978 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.905165 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-kxbs6" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.905281 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.905290 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.919113 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-l85q9"] Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.920944 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-l85q9" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.927575 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.956987 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-l85q9"] Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.983624 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8de72dd4-40e4-4d8e-b820-383e3c8e3734-certs\") pod \"cloudkitty-proc-0\" (UID: \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.983697 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de72dd4-40e4-4d8e-b820-383e3c8e3734-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.983770 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-l85q9\" (UID: \"04163635-8e0b-4bf7-abf5-3504d0e391a8\") " pod="openstack/dnsmasq-dns-67bdc55879-l85q9" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.983859 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-l85q9\" (UID: \"04163635-8e0b-4bf7-abf5-3504d0e391a8\") " pod="openstack/dnsmasq-dns-67bdc55879-l85q9" Mar 19 17:01:02 crc 
kubenswrapper[4918]: I0319 17:01:02.983932 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-l85q9\" (UID: \"04163635-8e0b-4bf7-abf5-3504d0e391a8\") " pod="openstack/dnsmasq-dns-67bdc55879-l85q9" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.983986 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8de72dd4-40e4-4d8e-b820-383e3c8e3734-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.984047 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de72dd4-40e4-4d8e-b820-383e3c8e3734-config-data\") pod \"cloudkitty-proc-0\" (UID: \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.984088 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8de72dd4-40e4-4d8e-b820-383e3c8e3734-scripts\") pod \"cloudkitty-proc-0\" (UID: \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.984111 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdv88\" (UniqueName: \"kubernetes.io/projected/8de72dd4-40e4-4d8e-b820-383e3c8e3734-kube-api-access-tdv88\") pod \"cloudkitty-proc-0\" (UID: \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.984170 4918 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-dns-svc\") pod \"dnsmasq-dns-67bdc55879-l85q9\" (UID: \"04163635-8e0b-4bf7-abf5-3504d0e391a8\") " pod="openstack/dnsmasq-dns-67bdc55879-l85q9" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.984243 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-config\") pod \"dnsmasq-dns-67bdc55879-l85q9\" (UID: \"04163635-8e0b-4bf7-abf5-3504d0e391a8\") " pod="openstack/dnsmasq-dns-67bdc55879-l85q9" Mar 19 17:01:02 crc kubenswrapper[4918]: I0319 17:01:02.984262 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlsms\" (UniqueName: \"kubernetes.io/projected/04163635-8e0b-4bf7-abf5-3504d0e391a8-kube-api-access-dlsms\") pod \"dnsmasq-dns-67bdc55879-l85q9\" (UID: \"04163635-8e0b-4bf7-abf5-3504d0e391a8\") " pod="openstack/dnsmasq-dns-67bdc55879-l85q9" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.086051 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-config\") pod \"dnsmasq-dns-67bdc55879-l85q9\" (UID: \"04163635-8e0b-4bf7-abf5-3504d0e391a8\") " pod="openstack/dnsmasq-dns-67bdc55879-l85q9" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.086097 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlsms\" (UniqueName: \"kubernetes.io/projected/04163635-8e0b-4bf7-abf5-3504d0e391a8-kube-api-access-dlsms\") pod \"dnsmasq-dns-67bdc55879-l85q9\" (UID: \"04163635-8e0b-4bf7-abf5-3504d0e391a8\") " pod="openstack/dnsmasq-dns-67bdc55879-l85q9" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.086161 4918 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8de72dd4-40e4-4d8e-b820-383e3c8e3734-certs\") pod \"cloudkitty-proc-0\" (UID: \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.086178 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de72dd4-40e4-4d8e-b820-383e3c8e3734-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.086199 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-l85q9\" (UID: \"04163635-8e0b-4bf7-abf5-3504d0e391a8\") " pod="openstack/dnsmasq-dns-67bdc55879-l85q9" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.086228 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-l85q9\" (UID: \"04163635-8e0b-4bf7-abf5-3504d0e391a8\") " pod="openstack/dnsmasq-dns-67bdc55879-l85q9" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.086256 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-l85q9\" (UID: \"04163635-8e0b-4bf7-abf5-3504d0e391a8\") " pod="openstack/dnsmasq-dns-67bdc55879-l85q9" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.086286 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/8de72dd4-40e4-4d8e-b820-383e3c8e3734-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.086311 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de72dd4-40e4-4d8e-b820-383e3c8e3734-config-data\") pod \"cloudkitty-proc-0\" (UID: \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.086326 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8de72dd4-40e4-4d8e-b820-383e3c8e3734-scripts\") pod \"cloudkitty-proc-0\" (UID: \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.086346 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdv88\" (UniqueName: \"kubernetes.io/projected/8de72dd4-40e4-4d8e-b820-383e3c8e3734-kube-api-access-tdv88\") pod \"cloudkitty-proc-0\" (UID: \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.086380 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-dns-svc\") pod \"dnsmasq-dns-67bdc55879-l85q9\" (UID: \"04163635-8e0b-4bf7-abf5-3504d0e391a8\") " pod="openstack/dnsmasq-dns-67bdc55879-l85q9" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.087905 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-dns-svc\") pod \"dnsmasq-dns-67bdc55879-l85q9\" (UID: \"04163635-8e0b-4bf7-abf5-3504d0e391a8\") " 
pod="openstack/dnsmasq-dns-67bdc55879-l85q9" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.089561 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-config\") pod \"dnsmasq-dns-67bdc55879-l85q9\" (UID: \"04163635-8e0b-4bf7-abf5-3504d0e391a8\") " pod="openstack/dnsmasq-dns-67bdc55879-l85q9" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.094596 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.094660 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8de72dd4-40e4-4d8e-b820-383e3c8e3734-certs\") pod \"cloudkitty-proc-0\" (UID: \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.096374 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.098590 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de72dd4-40e4-4d8e-b820-383e3c8e3734-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.099994 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-l85q9\" (UID: \"04163635-8e0b-4bf7-abf5-3504d0e391a8\") " pod="openstack/dnsmasq-dns-67bdc55879-l85q9" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.100550 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-l85q9\" (UID: \"04163635-8e0b-4bf7-abf5-3504d0e391a8\") " pod="openstack/dnsmasq-dns-67bdc55879-l85q9" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.101048 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-l85q9\" (UID: \"04163635-8e0b-4bf7-abf5-3504d0e391a8\") " pod="openstack/dnsmasq-dns-67bdc55879-l85q9" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.109848 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.113881 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de72dd4-40e4-4d8e-b820-383e3c8e3734-config-data\") pod \"cloudkitty-proc-0\" (UID: 
\"8de72dd4-40e4-4d8e-b820-383e3c8e3734\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.116915 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8de72dd4-40e4-4d8e-b820-383e3c8e3734-scripts\") pod \"cloudkitty-proc-0\" (UID: \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.136140 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8de72dd4-40e4-4d8e-b820-383e3c8e3734-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.162150 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.188475 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44aad665-696e-4084-a457-309f4dd4c68d-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"44aad665-696e-4084-a457-309f4dd4c68d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.188886 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44aad665-696e-4084-a457-309f4dd4c68d-scripts\") pod \"cloudkitty-api-0\" (UID: \"44aad665-696e-4084-a457-309f4dd4c68d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.188911 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44aad665-696e-4084-a457-309f4dd4c68d-combined-ca-bundle\") pod 
\"cloudkitty-api-0\" (UID: \"44aad665-696e-4084-a457-309f4dd4c68d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.188971 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44aad665-696e-4084-a457-309f4dd4c68d-config-data\") pod \"cloudkitty-api-0\" (UID: \"44aad665-696e-4084-a457-309f4dd4c68d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.189075 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44aad665-696e-4084-a457-309f4dd4c68d-logs\") pod \"cloudkitty-api-0\" (UID: \"44aad665-696e-4084-a457-309f4dd4c68d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.189097 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgzlx\" (UniqueName: \"kubernetes.io/projected/44aad665-696e-4084-a457-309f4dd4c68d-kube-api-access-hgzlx\") pod \"cloudkitty-api-0\" (UID: \"44aad665-696e-4084-a457-309f4dd4c68d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.189118 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/44aad665-696e-4084-a457-309f4dd4c68d-certs\") pod \"cloudkitty-api-0\" (UID: \"44aad665-696e-4084-a457-309f4dd4c68d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.204315 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlsms\" (UniqueName: \"kubernetes.io/projected/04163635-8e0b-4bf7-abf5-3504d0e391a8-kube-api-access-dlsms\") pod \"dnsmasq-dns-67bdc55879-l85q9\" (UID: \"04163635-8e0b-4bf7-abf5-3504d0e391a8\") " 
pod="openstack/dnsmasq-dns-67bdc55879-l85q9" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.208067 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdv88\" (UniqueName: \"kubernetes.io/projected/8de72dd4-40e4-4d8e-b820-383e3c8e3734-kube-api-access-tdv88\") pod \"cloudkitty-proc-0\" (UID: \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.291883 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44aad665-696e-4084-a457-309f4dd4c68d-logs\") pod \"cloudkitty-api-0\" (UID: \"44aad665-696e-4084-a457-309f4dd4c68d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.291930 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/44aad665-696e-4084-a457-309f4dd4c68d-certs\") pod \"cloudkitty-api-0\" (UID: \"44aad665-696e-4084-a457-309f4dd4c68d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.291947 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgzlx\" (UniqueName: \"kubernetes.io/projected/44aad665-696e-4084-a457-309f4dd4c68d-kube-api-access-hgzlx\") pod \"cloudkitty-api-0\" (UID: \"44aad665-696e-4084-a457-309f4dd4c68d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.292009 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44aad665-696e-4084-a457-309f4dd4c68d-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"44aad665-696e-4084-a457-309f4dd4c68d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.292050 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44aad665-696e-4084-a457-309f4dd4c68d-scripts\") pod \"cloudkitty-api-0\" (UID: \"44aad665-696e-4084-a457-309f4dd4c68d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.292067 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44aad665-696e-4084-a457-309f4dd4c68d-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"44aad665-696e-4084-a457-309f4dd4c68d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.292113 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44aad665-696e-4084-a457-309f4dd4c68d-config-data\") pod \"cloudkitty-api-0\" (UID: \"44aad665-696e-4084-a457-309f4dd4c68d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.297396 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44aad665-696e-4084-a457-309f4dd4c68d-logs\") pod \"cloudkitty-api-0\" (UID: \"44aad665-696e-4084-a457-309f4dd4c68d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.300350 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44aad665-696e-4084-a457-309f4dd4c68d-scripts\") pod \"cloudkitty-api-0\" (UID: \"44aad665-696e-4084-a457-309f4dd4c68d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.303187 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44aad665-696e-4084-a457-309f4dd4c68d-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"44aad665-696e-4084-a457-309f4dd4c68d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:03 crc 
kubenswrapper[4918]: I0319 17:01:03.304229 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44aad665-696e-4084-a457-309f4dd4c68d-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"44aad665-696e-4084-a457-309f4dd4c68d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.304710 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/44aad665-696e-4084-a457-309f4dd4c68d-certs\") pod \"cloudkitty-api-0\" (UID: \"44aad665-696e-4084-a457-309f4dd4c68d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.316241 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44aad665-696e-4084-a457-309f4dd4c68d-config-data\") pod \"cloudkitty-api-0\" (UID: \"44aad665-696e-4084-a457-309f4dd4c68d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.317962 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.344920 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-l85q9" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.345309 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgzlx\" (UniqueName: \"kubernetes.io/projected/44aad665-696e-4084-a457-309f4dd4c68d-kube-api-access-hgzlx\") pod \"cloudkitty-api-0\" (UID: \"44aad665-696e-4084-a457-309f4dd4c68d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.471969 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.863027 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-678fb97f86-hlhbk" event={"ID":"af512bdc-58dc-481d-a454-821bcb84d090","Type":"ContainerStarted","Data":"8f4878d88a8cc7b803e9b8e67e5c0d3da03139eff8927fdddacbe7f2eefa0216"} Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.884622 4918 generic.go:334] "Generic (PLEG): container finished" podID="71b991d8-dce5-4482-90c2-b904a5f6eb0e" containerID="29246a219f079b04c662a83e96f11fef94512295380083535a11df21ee713c1f" exitCode=0 Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.884679 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" event={"ID":"71b991d8-dce5-4482-90c2-b904a5f6eb0e","Type":"ContainerDied","Data":"29246a219f079b04c662a83e96f11fef94512295380083535a11df21ee713c1f"} Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.884705 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" event={"ID":"71b991d8-dce5-4482-90c2-b904a5f6eb0e","Type":"ContainerDied","Data":"5ee5d73dcf5747ea0d10a83dd4185eda321f35d42b5feecffa6a78c342114ddc"} Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.884718 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ee5d73dcf5747ea0d10a83dd4185eda321f35d42b5feecffa6a78c342114ddc" Mar 19 17:01:03 crc kubenswrapper[4918]: I0319 17:01:03.886172 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78d493df-532a-4203-9264-86e66bf964f0","Type":"ContainerStarted","Data":"ec9fe3e21f4df9bdfaa42364eed81825c8d6dc7a894fa20c56160687357261c2"} Mar 19 17:01:04 crc kubenswrapper[4918]: I0319 17:01:04.093360 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 17:01:04 crc kubenswrapper[4918]: I0319 17:01:04.096217 4918 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" Mar 19 17:01:04 crc kubenswrapper[4918]: I0319 17:01:04.111049 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-dns-svc\") pod \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\" (UID: \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\") " Mar 19 17:01:04 crc kubenswrapper[4918]: I0319 17:01:04.111375 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dddwb\" (UniqueName: \"kubernetes.io/projected/71b991d8-dce5-4482-90c2-b904a5f6eb0e-kube-api-access-dddwb\") pod \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\" (UID: \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\") " Mar 19 17:01:04 crc kubenswrapper[4918]: I0319 17:01:04.111396 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-ovsdbserver-sb\") pod \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\" (UID: \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\") " Mar 19 17:01:04 crc kubenswrapper[4918]: I0319 17:01:04.111416 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-ovsdbserver-nb\") pod \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\" (UID: \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\") " Mar 19 17:01:04 crc kubenswrapper[4918]: I0319 17:01:04.111459 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-config\") pod \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\" (UID: \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\") " Mar 19 17:01:04 crc kubenswrapper[4918]: I0319 17:01:04.111478 4918 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-dns-swift-storage-0\") pod \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\" (UID: \"71b991d8-dce5-4482-90c2-b904a5f6eb0e\") " Mar 19 17:01:04 crc kubenswrapper[4918]: I0319 17:01:04.211543 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71b991d8-dce5-4482-90c2-b904a5f6eb0e-kube-api-access-dddwb" (OuterVolumeSpecName: "kube-api-access-dddwb") pod "71b991d8-dce5-4482-90c2-b904a5f6eb0e" (UID: "71b991d8-dce5-4482-90c2-b904a5f6eb0e"). InnerVolumeSpecName "kube-api-access-dddwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:01:04 crc kubenswrapper[4918]: I0319 17:01:04.219414 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dddwb\" (UniqueName: \"kubernetes.io/projected/71b991d8-dce5-4482-90c2-b904a5f6eb0e-kube-api-access-dddwb\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:04 crc kubenswrapper[4918]: I0319 17:01:04.327903 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-l85q9"] Mar 19 17:01:04 crc kubenswrapper[4918]: I0319 17:01:04.354392 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "71b991d8-dce5-4482-90c2-b904a5f6eb0e" (UID: "71b991d8-dce5-4482-90c2-b904a5f6eb0e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:01:04 crc kubenswrapper[4918]: I0319 17:01:04.424279 4918 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:04 crc kubenswrapper[4918]: I0319 17:01:04.448691 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 17:01:04 crc kubenswrapper[4918]: I0319 17:01:04.508609 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "71b991d8-dce5-4482-90c2-b904a5f6eb0e" (UID: "71b991d8-dce5-4482-90c2-b904a5f6eb0e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:01:04 crc kubenswrapper[4918]: I0319 17:01:04.526306 4918 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:04 crc kubenswrapper[4918]: I0319 17:01:04.528055 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "71b991d8-dce5-4482-90c2-b904a5f6eb0e" (UID: "71b991d8-dce5-4482-90c2-b904a5f6eb0e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:01:04 crc kubenswrapper[4918]: I0319 17:01:04.560361 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "71b991d8-dce5-4482-90c2-b904a5f6eb0e" (UID: "71b991d8-dce5-4482-90c2-b904a5f6eb0e"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:01:04 crc kubenswrapper[4918]: I0319 17:01:04.568004 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-config" (OuterVolumeSpecName: "config") pod "71b991d8-dce5-4482-90c2-b904a5f6eb0e" (UID: "71b991d8-dce5-4482-90c2-b904a5f6eb0e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:01:04 crc kubenswrapper[4918]: I0319 17:01:04.631604 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-config\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:04 crc kubenswrapper[4918]: I0319 17:01:04.631671 4918 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:04 crc kubenswrapper[4918]: I0319 17:01:04.631688 4918 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71b991d8-dce5-4482-90c2-b904a5f6eb0e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:04 crc kubenswrapper[4918]: I0319 17:01:04.705744 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 19 17:01:04 crc kubenswrapper[4918]: I0319 17:01:04.963316 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-l85q9" event={"ID":"04163635-8e0b-4bf7-abf5-3504d0e391a8","Type":"ContainerStarted","Data":"aca1fac95bd48f10c168b69e293e5b406f71eb72d9b1b0b22c1b0f3e325e2817"} Mar 19 17:01:04 crc kubenswrapper[4918]: I0319 17:01:04.963620 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-l85q9" 
event={"ID":"04163635-8e0b-4bf7-abf5-3504d0e391a8","Type":"ContainerStarted","Data":"149c32acb9b9918f417eb460413778840c8e230113e72b6bd42202f1620b25bf"} Mar 19 17:01:05 crc kubenswrapper[4918]: I0319 17:01:05.003992 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"44aad665-696e-4084-a457-309f4dd4c68d","Type":"ContainerStarted","Data":"4c4273ce082d17d7ccd707f33e6bba717153015203a3926665865dbab8b8b0b0"} Mar 19 17:01:05 crc kubenswrapper[4918]: I0319 17:01:05.004039 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"44aad665-696e-4084-a457-309f4dd4c68d","Type":"ContainerStarted","Data":"7de738613db636673c3e8a264eaeda9ffa2c910972a998c35b16fc51a4de93bf"} Mar 19 17:01:05 crc kubenswrapper[4918]: I0319 17:01:05.026997 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"8de72dd4-40e4-4d8e-b820-383e3c8e3734","Type":"ContainerStarted","Data":"a877a24105b8c7861df2ef57719149f3bc4b21c943b05e1abc05be1e48cd29c0"} Mar 19 17:01:05 crc kubenswrapper[4918]: I0319 17:01:05.047478 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-678fb97f86-hlhbk" event={"ID":"af512bdc-58dc-481d-a454-821bcb84d090","Type":"ContainerStarted","Data":"a09fb9dc40cd69a33561b76026783f0b33e4f1a38d69da6eddebbd1080e1afba"} Mar 19 17:01:05 crc kubenswrapper[4918]: I0319 17:01:05.048343 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-678fb97f86-hlhbk" Mar 19 17:01:05 crc kubenswrapper[4918]: I0319 17:01:05.048381 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-678fb97f86-hlhbk" Mar 19 17:01:05 crc kubenswrapper[4918]: I0319 17:01:05.067327 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-cntn9" Mar 19 17:01:05 crc kubenswrapper[4918]: I0319 17:01:05.068152 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78d493df-532a-4203-9264-86e66bf964f0","Type":"ContainerStarted","Data":"09cf7a30a722680444026776007f6b7f843254948f2473f31360c0f51c8d7dc4"} Mar 19 17:01:05 crc kubenswrapper[4918]: I0319 17:01:05.109912 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-678fb97f86-hlhbk" podStartSLOduration=4.109889936 podStartE2EDuration="4.109889936s" podCreationTimestamp="2026-03-19 17:01:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:01:05.087197304 +0000 UTC m=+1277.209396552" watchObservedRunningTime="2026-03-19 17:01:05.109889936 +0000 UTC m=+1277.232089184" Mar 19 17:01:05 crc kubenswrapper[4918]: I0319 17:01:05.120450 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="23bad2c2-b869-49db-9a5b-9dc2ad887973" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 17:01:05 crc kubenswrapper[4918]: I0319 17:01:05.199640 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-cntn9"] Mar 19 17:01:05 crc kubenswrapper[4918]: I0319 17:01:05.219219 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-cntn9"] Mar 19 17:01:05 crc kubenswrapper[4918]: I0319 17:01:05.475770 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-846b889554-z7r6b" podUID="bf259f2d-395e-4d36-bdc0-2c01310e24e8" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.183:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 17:01:06 crc 
kubenswrapper[4918]: I0319 17:01:06.091294 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"44aad665-696e-4084-a457-309f4dd4c68d","Type":"ContainerStarted","Data":"4928b99933bd7627e1128ca65b2b4c1ee656d4a4c49bd0d4f9b24708af4f6083"} Mar 19 17:01:06 crc kubenswrapper[4918]: I0319 17:01:06.092705 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Mar 19 17:01:06 crc kubenswrapper[4918]: I0319 17:01:06.107892 4918 generic.go:334] "Generic (PLEG): container finished" podID="04163635-8e0b-4bf7-abf5-3504d0e391a8" containerID="aca1fac95bd48f10c168b69e293e5b406f71eb72d9b1b0b22c1b0f3e325e2817" exitCode=0 Mar 19 17:01:06 crc kubenswrapper[4918]: I0319 17:01:06.108948 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-l85q9" event={"ID":"04163635-8e0b-4bf7-abf5-3504d0e391a8","Type":"ContainerDied","Data":"aca1fac95bd48f10c168b69e293e5b406f71eb72d9b1b0b22c1b0f3e325e2817"} Mar 19 17:01:06 crc kubenswrapper[4918]: I0319 17:01:06.108984 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-l85q9" event={"ID":"04163635-8e0b-4bf7-abf5-3504d0e391a8","Type":"ContainerStarted","Data":"4c45522f586492cd2568e2d7c8ce8793fed76539da03d3301382bcf2a5feb13e"} Mar 19 17:01:06 crc kubenswrapper[4918]: I0319 17:01:06.109037 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67bdc55879-l85q9" Mar 19 17:01:06 crc kubenswrapper[4918]: I0319 17:01:06.128911 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=3.128889949 podStartE2EDuration="3.128889949s" podCreationTimestamp="2026-03-19 17:01:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:01:06.11797659 +0000 UTC m=+1278.240175848" 
watchObservedRunningTime="2026-03-19 17:01:06.128889949 +0000 UTC m=+1278.251089197" Mar 19 17:01:06 crc kubenswrapper[4918]: I0319 17:01:06.165755 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67bdc55879-l85q9" podStartSLOduration=4.165729999 podStartE2EDuration="4.165729999s" podCreationTimestamp="2026-03-19 17:01:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:01:06.152618439 +0000 UTC m=+1278.274817687" watchObservedRunningTime="2026-03-19 17:01:06.165729999 +0000 UTC m=+1278.287929247" Mar 19 17:01:06 crc kubenswrapper[4918]: I0319 17:01:06.473068 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 17:01:06 crc kubenswrapper[4918]: I0319 17:01:06.596127 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71b991d8-dce5-4482-90c2-b904a5f6eb0e" path="/var/lib/kubelet/pods/71b991d8-dce5-4482-90c2-b904a5f6eb0e/volumes" Mar 19 17:01:06 crc kubenswrapper[4918]: I0319 17:01:06.803269 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-846b889554-z7r6b" Mar 19 17:01:07 crc kubenswrapper[4918]: I0319 17:01:07.140038 4918 generic.go:334] "Generic (PLEG): container finished" podID="6d4f01d1-2728-4858-a39d-6b44e675aca5" containerID="9e0fb5566c2d720b298bd6fde67d815fe3114d24dd3c95cc15ec2ad377f9d048" exitCode=0 Mar 19 17:01:07 crc kubenswrapper[4918]: I0319 17:01:07.140131 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565661-7qtw4" event={"ID":"6d4f01d1-2728-4858-a39d-6b44e675aca5","Type":"ContainerDied","Data":"9e0fb5566c2d720b298bd6fde67d815fe3114d24dd3c95cc15ec2ad377f9d048"} Mar 19 17:01:07 crc kubenswrapper[4918]: I0319 17:01:07.142749 4918 generic.go:334] "Generic (PLEG): container finished" podID="9ab5151a-6a64-47a2-8e0b-47455e4f66b0" 
containerID="e0e61af886f8be875324ab3e9decf415ea43b9983c91df92048cb302351053e0" exitCode=0 Mar 19 17:01:07 crc kubenswrapper[4918]: I0319 17:01:07.142812 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bd699b55c-ncb4d" event={"ID":"9ab5151a-6a64-47a2-8e0b-47455e4f66b0","Type":"ContainerDied","Data":"e0e61af886f8be875324ab3e9decf415ea43b9983c91df92048cb302351053e0"} Mar 19 17:01:07 crc kubenswrapper[4918]: I0319 17:01:07.365224 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-846b889554-z7r6b" Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.223017 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78d493df-532a-4203-9264-86e66bf964f0","Type":"ContainerStarted","Data":"738ad7ddff566c1bcf4fae1c9adc62034094f6f32e93bf3022b7ccefa9f282b0"} Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.225320 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.227762 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"8de72dd4-40e4-4d8e-b820-383e3c8e3734","Type":"ContainerStarted","Data":"9d862b61eaac5d1b760bbd841ff4e44cc492f12b65475b983dbb54405e23f7d2"} Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.227896 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="44aad665-696e-4084-a457-309f4dd4c68d" containerName="cloudkitty-api-log" containerID="cri-o://4c4273ce082d17d7ccd707f33e6bba717153015203a3926665865dbab8b8b0b0" gracePeriod=30 Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.228105 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="44aad665-696e-4084-a457-309f4dd4c68d" containerName="cloudkitty-api" 
containerID="cri-o://4928b99933bd7627e1128ca65b2b4c1ee656d4a4c49bd0d4f9b24708af4f6083" gracePeriod=30 Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.262222 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.2865867570000002 podStartE2EDuration="9.262200776s" podCreationTimestamp="2026-03-19 17:00:59 +0000 UTC" firstStartedPulling="2026-03-19 17:01:00.791066153 +0000 UTC m=+1272.913265411" lastFinishedPulling="2026-03-19 17:01:07.766680182 +0000 UTC m=+1279.888879430" observedRunningTime="2026-03-19 17:01:08.258269188 +0000 UTC m=+1280.380468436" watchObservedRunningTime="2026-03-19 17:01:08.262200776 +0000 UTC m=+1280.384400014" Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.308761 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.716682025 podStartE2EDuration="6.30874098s" podCreationTimestamp="2026-03-19 17:01:02 +0000 UTC" firstStartedPulling="2026-03-19 17:01:04.102051859 +0000 UTC m=+1276.224251097" lastFinishedPulling="2026-03-19 17:01:07.694110804 +0000 UTC m=+1279.816310052" observedRunningTime="2026-03-19 17:01:08.298441359 +0000 UTC m=+1280.420640597" watchObservedRunningTime="2026-03-19 17:01:08.30874098 +0000 UTC m=+1280.430940228" Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.321401 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.498321 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-bd699b55c-ncb4d" Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.574048 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vb8f\" (UniqueName: \"kubernetes.io/projected/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-kube-api-access-4vb8f\") pod \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\" (UID: \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\") " Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.574330 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-ovndb-tls-certs\") pod \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\" (UID: \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\") " Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.574378 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-config\") pod \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\" (UID: \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\") " Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.574414 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-combined-ca-bundle\") pod \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\" (UID: \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\") " Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.574509 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-httpd-config\") pod \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\" (UID: \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\") " Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.592341 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-public-tls-certs\") pod \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\" (UID: \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\") " Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.592424 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-internal-tls-certs\") pod \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\" (UID: \"9ab5151a-6a64-47a2-8e0b-47455e4f66b0\") " Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.634734 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-kube-api-access-4vb8f" (OuterVolumeSpecName: "kube-api-access-4vb8f") pod "9ab5151a-6a64-47a2-8e0b-47455e4f66b0" (UID: "9ab5151a-6a64-47a2-8e0b-47455e4f66b0"). InnerVolumeSpecName "kube-api-access-4vb8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.646603 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "9ab5151a-6a64-47a2-8e0b-47455e4f66b0" (UID: "9ab5151a-6a64-47a2-8e0b-47455e4f66b0"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.696176 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vb8f\" (UniqueName: \"kubernetes.io/projected/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-kube-api-access-4vb8f\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.696203 4918 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.745293 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ab5151a-6a64-47a2-8e0b-47455e4f66b0" (UID: "9ab5151a-6a64-47a2-8e0b-47455e4f66b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.793673 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9ab5151a-6a64-47a2-8e0b-47455e4f66b0" (UID: "9ab5151a-6a64-47a2-8e0b-47455e4f66b0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.801849 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-config" (OuterVolumeSpecName: "config") pod "9ab5151a-6a64-47a2-8e0b-47455e4f66b0" (UID: "9ab5151a-6a64-47a2-8e0b-47455e4f66b0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.824455 4918 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.824480 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.829890 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9ab5151a-6a64-47a2-8e0b-47455e4f66b0" (UID: "9ab5151a-6a64-47a2-8e0b-47455e4f66b0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.832808 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "9ab5151a-6a64-47a2-8e0b-47455e4f66b0" (UID: "9ab5151a-6a64-47a2-8e0b-47455e4f66b0"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.897804 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29565661-7qtw4" Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.926323 4918 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.926357 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-config\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:08 crc kubenswrapper[4918]: I0319 17:01:08.926367 4918 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ab5151a-6a64-47a2-8e0b-47455e4f66b0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:09 crc kubenswrapper[4918]: I0319 17:01:09.026736 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d4f01d1-2728-4858-a39d-6b44e675aca5-config-data\") pod \"6d4f01d1-2728-4858-a39d-6b44e675aca5\" (UID: \"6d4f01d1-2728-4858-a39d-6b44e675aca5\") " Mar 19 17:01:09 crc kubenswrapper[4918]: I0319 17:01:09.026841 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7ptf\" (UniqueName: \"kubernetes.io/projected/6d4f01d1-2728-4858-a39d-6b44e675aca5-kube-api-access-w7ptf\") pod \"6d4f01d1-2728-4858-a39d-6b44e675aca5\" (UID: \"6d4f01d1-2728-4858-a39d-6b44e675aca5\") " Mar 19 17:01:09 crc kubenswrapper[4918]: I0319 17:01:09.026862 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6d4f01d1-2728-4858-a39d-6b44e675aca5-fernet-keys\") pod \"6d4f01d1-2728-4858-a39d-6b44e675aca5\" (UID: \"6d4f01d1-2728-4858-a39d-6b44e675aca5\") " Mar 19 17:01:09 crc kubenswrapper[4918]: I0319 17:01:09.026941 4918 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d4f01d1-2728-4858-a39d-6b44e675aca5-combined-ca-bundle\") pod \"6d4f01d1-2728-4858-a39d-6b44e675aca5\" (UID: \"6d4f01d1-2728-4858-a39d-6b44e675aca5\") " Mar 19 17:01:09 crc kubenswrapper[4918]: I0319 17:01:09.031948 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d4f01d1-2728-4858-a39d-6b44e675aca5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6d4f01d1-2728-4858-a39d-6b44e675aca5" (UID: "6d4f01d1-2728-4858-a39d-6b44e675aca5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:09 crc kubenswrapper[4918]: I0319 17:01:09.034771 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d4f01d1-2728-4858-a39d-6b44e675aca5-kube-api-access-w7ptf" (OuterVolumeSpecName: "kube-api-access-w7ptf") pod "6d4f01d1-2728-4858-a39d-6b44e675aca5" (UID: "6d4f01d1-2728-4858-a39d-6b44e675aca5"). InnerVolumeSpecName "kube-api-access-w7ptf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:01:09 crc kubenswrapper[4918]: I0319 17:01:09.062024 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d4f01d1-2728-4858-a39d-6b44e675aca5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d4f01d1-2728-4858-a39d-6b44e675aca5" (UID: "6d4f01d1-2728-4858-a39d-6b44e675aca5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:09 crc kubenswrapper[4918]: I0319 17:01:09.089720 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d4f01d1-2728-4858-a39d-6b44e675aca5-config-data" (OuterVolumeSpecName: "config-data") pod "6d4f01d1-2728-4858-a39d-6b44e675aca5" (UID: "6d4f01d1-2728-4858-a39d-6b44e675aca5"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:09 crc kubenswrapper[4918]: I0319 17:01:09.129731 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d4f01d1-2728-4858-a39d-6b44e675aca5-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:09 crc kubenswrapper[4918]: I0319 17:01:09.129768 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7ptf\" (UniqueName: \"kubernetes.io/projected/6d4f01d1-2728-4858-a39d-6b44e675aca5-kube-api-access-w7ptf\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:09 crc kubenswrapper[4918]: I0319 17:01:09.129782 4918 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6d4f01d1-2728-4858-a39d-6b44e675aca5-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:09 crc kubenswrapper[4918]: I0319 17:01:09.129791 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d4f01d1-2728-4858-a39d-6b44e675aca5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:09 crc kubenswrapper[4918]: I0319 17:01:09.248167 4918 generic.go:334] "Generic (PLEG): container finished" podID="44aad665-696e-4084-a457-309f4dd4c68d" containerID="4c4273ce082d17d7ccd707f33e6bba717153015203a3926665865dbab8b8b0b0" exitCode=143 Mar 19 17:01:09 crc kubenswrapper[4918]: I0319 17:01:09.248255 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"44aad665-696e-4084-a457-309f4dd4c68d","Type":"ContainerDied","Data":"4c4273ce082d17d7ccd707f33e6bba717153015203a3926665865dbab8b8b0b0"} Mar 19 17:01:09 crc kubenswrapper[4918]: I0319 17:01:09.253496 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565661-7qtw4" 
event={"ID":"6d4f01d1-2728-4858-a39d-6b44e675aca5","Type":"ContainerDied","Data":"2c70071f43906b0592e1dc113d4df09a9c4e71ea1a89b37da1cd8156e6282b6d"} Mar 19 17:01:09 crc kubenswrapper[4918]: I0319 17:01:09.253536 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c70071f43906b0592e1dc113d4df09a9c4e71ea1a89b37da1cd8156e6282b6d" Mar 19 17:01:09 crc kubenswrapper[4918]: I0319 17:01:09.253592 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29565661-7qtw4" Mar 19 17:01:09 crc kubenswrapper[4918]: I0319 17:01:09.275958 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bd699b55c-ncb4d" event={"ID":"9ab5151a-6a64-47a2-8e0b-47455e4f66b0","Type":"ContainerDied","Data":"86444fbf288072cbc7605672ab34c69a514a468427d17d4440d9af0adb1909ed"} Mar 19 17:01:09 crc kubenswrapper[4918]: I0319 17:01:09.276011 4918 scope.go:117] "RemoveContainer" containerID="a3edc838142999c283b25cfa30ca8d6c385132c53b6daac34e06c8a93bb03836" Mar 19 17:01:09 crc kubenswrapper[4918]: I0319 17:01:09.276176 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-bd699b55c-ncb4d" Mar 19 17:01:09 crc kubenswrapper[4918]: I0319 17:01:09.316864 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-bd699b55c-ncb4d"] Mar 19 17:01:09 crc kubenswrapper[4918]: I0319 17:01:09.317165 4918 scope.go:117] "RemoveContainer" containerID="e0e61af886f8be875324ab3e9decf415ea43b9983c91df92048cb302351053e0" Mar 19 17:01:09 crc kubenswrapper[4918]: I0319 17:01:09.352842 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-bd699b55c-ncb4d"] Mar 19 17:01:09 crc kubenswrapper[4918]: I0319 17:01:09.539305 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 19 17:01:09 crc kubenswrapper[4918]: I0319 17:01:09.581895 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 19 17:01:09 crc kubenswrapper[4918]: I0319 17:01:09.614456 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 19 17:01:09 crc kubenswrapper[4918]: I0319 17:01:09.714418 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 19 17:01:09 crc kubenswrapper[4918]: I0319 17:01:09.827704 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.284486 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.287903 4918 generic.go:334] "Generic (PLEG): container finished" podID="44aad665-696e-4084-a457-309f4dd4c68d" containerID="4928b99933bd7627e1128ca65b2b4c1ee656d4a4c49bd0d4f9b24708af4f6083" exitCode=0 Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.287990 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.288059 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"44aad665-696e-4084-a457-309f4dd4c68d","Type":"ContainerDied","Data":"4928b99933bd7627e1128ca65b2b4c1ee656d4a4c49bd0d4f9b24708af4f6083"} Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.288092 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"44aad665-696e-4084-a457-309f4dd4c68d","Type":"ContainerDied","Data":"7de738613db636673c3e8a264eaeda9ffa2c910972a998c35b16fc51a4de93bf"} Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.288108 4918 scope.go:117] "RemoveContainer" containerID="4928b99933bd7627e1128ca65b2b4c1ee656d4a4c49bd0d4f9b24708af4f6083" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.288597 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="23bad2c2-b869-49db-9a5b-9dc2ad887973" containerName="cinder-scheduler" containerID="cri-o://dc38e09dba0d954bea7077363b80b4145236cf72634a16a252e6cd11e21b8a57" gracePeriod=30 Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.288634 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="23bad2c2-b869-49db-9a5b-9dc2ad887973" containerName="probe" containerID="cri-o://371703b226ab0c29770c6348a64cbb1a277d7e09ef5ff2c9833674f0424c4554" gracePeriod=30 Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.288834 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="8de72dd4-40e4-4d8e-b820-383e3c8e3734" containerName="cloudkitty-proc" containerID="cri-o://9d862b61eaac5d1b760bbd841ff4e44cc492f12b65475b983dbb54405e23f7d2" gracePeriod=30 Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.311557 4918 scope.go:117] "RemoveContainer" 
containerID="4c4273ce082d17d7ccd707f33e6bba717153015203a3926665865dbab8b8b0b0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.353032 4918 scope.go:117] "RemoveContainer" containerID="4928b99933bd7627e1128ca65b2b4c1ee656d4a4c49bd0d4f9b24708af4f6083" Mar 19 17:01:10 crc kubenswrapper[4918]: E0319 17:01:10.353607 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4928b99933bd7627e1128ca65b2b4c1ee656d4a4c49bd0d4f9b24708af4f6083\": container with ID starting with 4928b99933bd7627e1128ca65b2b4c1ee656d4a4c49bd0d4f9b24708af4f6083 not found: ID does not exist" containerID="4928b99933bd7627e1128ca65b2b4c1ee656d4a4c49bd0d4f9b24708af4f6083" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.353646 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4928b99933bd7627e1128ca65b2b4c1ee656d4a4c49bd0d4f9b24708af4f6083"} err="failed to get container status \"4928b99933bd7627e1128ca65b2b4c1ee656d4a4c49bd0d4f9b24708af4f6083\": rpc error: code = NotFound desc = could not find container \"4928b99933bd7627e1128ca65b2b4c1ee656d4a4c49bd0d4f9b24708af4f6083\": container with ID starting with 4928b99933bd7627e1128ca65b2b4c1ee656d4a4c49bd0d4f9b24708af4f6083 not found: ID does not exist" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.353676 4918 scope.go:117] "RemoveContainer" containerID="4c4273ce082d17d7ccd707f33e6bba717153015203a3926665865dbab8b8b0b0" Mar 19 17:01:10 crc kubenswrapper[4918]: E0319 17:01:10.353981 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c4273ce082d17d7ccd707f33e6bba717153015203a3926665865dbab8b8b0b0\": container with ID starting with 4c4273ce082d17d7ccd707f33e6bba717153015203a3926665865dbab8b8b0b0 not found: ID does not exist" containerID="4c4273ce082d17d7ccd707f33e6bba717153015203a3926665865dbab8b8b0b0" Mar 19 17:01:10 crc 
kubenswrapper[4918]: I0319 17:01:10.354001 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c4273ce082d17d7ccd707f33e6bba717153015203a3926665865dbab8b8b0b0"} err="failed to get container status \"4c4273ce082d17d7ccd707f33e6bba717153015203a3926665865dbab8b8b0b0\": rpc error: code = NotFound desc = could not find container \"4c4273ce082d17d7ccd707f33e6bba717153015203a3926665865dbab8b8b0b0\": container with ID starting with 4c4273ce082d17d7ccd707f33e6bba717153015203a3926665865dbab8b8b0b0 not found: ID does not exist" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.465199 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44aad665-696e-4084-a457-309f4dd4c68d-combined-ca-bundle\") pod \"44aad665-696e-4084-a457-309f4dd4c68d\" (UID: \"44aad665-696e-4084-a457-309f4dd4c68d\") " Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.465396 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44aad665-696e-4084-a457-309f4dd4c68d-logs\") pod \"44aad665-696e-4084-a457-309f4dd4c68d\" (UID: \"44aad665-696e-4084-a457-309f4dd4c68d\") " Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.465441 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44aad665-696e-4084-a457-309f4dd4c68d-scripts\") pod \"44aad665-696e-4084-a457-309f4dd4c68d\" (UID: \"44aad665-696e-4084-a457-309f4dd4c68d\") " Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.465512 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44aad665-696e-4084-a457-309f4dd4c68d-config-data-custom\") pod \"44aad665-696e-4084-a457-309f4dd4c68d\" (UID: \"44aad665-696e-4084-a457-309f4dd4c68d\") " Mar 19 17:01:10 crc 
kubenswrapper[4918]: I0319 17:01:10.465553 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44aad665-696e-4084-a457-309f4dd4c68d-config-data\") pod \"44aad665-696e-4084-a457-309f4dd4c68d\" (UID: \"44aad665-696e-4084-a457-309f4dd4c68d\") " Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.465604 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgzlx\" (UniqueName: \"kubernetes.io/projected/44aad665-696e-4084-a457-309f4dd4c68d-kube-api-access-hgzlx\") pod \"44aad665-696e-4084-a457-309f4dd4c68d\" (UID: \"44aad665-696e-4084-a457-309f4dd4c68d\") " Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.465749 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/44aad665-696e-4084-a457-309f4dd4c68d-certs\") pod \"44aad665-696e-4084-a457-309f4dd4c68d\" (UID: \"44aad665-696e-4084-a457-309f4dd4c68d\") " Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.466148 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44aad665-696e-4084-a457-309f4dd4c68d-logs" (OuterVolumeSpecName: "logs") pod "44aad665-696e-4084-a457-309f4dd4c68d" (UID: "44aad665-696e-4084-a457-309f4dd4c68d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.466657 4918 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44aad665-696e-4084-a457-309f4dd4c68d-logs\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.472403 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44aad665-696e-4084-a457-309f4dd4c68d-scripts" (OuterVolumeSpecName: "scripts") pod "44aad665-696e-4084-a457-309f4dd4c68d" (UID: "44aad665-696e-4084-a457-309f4dd4c68d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.472583 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44aad665-696e-4084-a457-309f4dd4c68d-certs" (OuterVolumeSpecName: "certs") pod "44aad665-696e-4084-a457-309f4dd4c68d" (UID: "44aad665-696e-4084-a457-309f4dd4c68d"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.487745 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44aad665-696e-4084-a457-309f4dd4c68d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "44aad665-696e-4084-a457-309f4dd4c68d" (UID: "44aad665-696e-4084-a457-309f4dd4c68d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.487767 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44aad665-696e-4084-a457-309f4dd4c68d-kube-api-access-hgzlx" (OuterVolumeSpecName: "kube-api-access-hgzlx") pod "44aad665-696e-4084-a457-309f4dd4c68d" (UID: "44aad665-696e-4084-a457-309f4dd4c68d"). InnerVolumeSpecName "kube-api-access-hgzlx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.499281 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44aad665-696e-4084-a457-309f4dd4c68d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44aad665-696e-4084-a457-309f4dd4c68d" (UID: "44aad665-696e-4084-a457-309f4dd4c68d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.506480 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44aad665-696e-4084-a457-309f4dd4c68d-config-data" (OuterVolumeSpecName: "config-data") pod "44aad665-696e-4084-a457-309f4dd4c68d" (UID: "44aad665-696e-4084-a457-309f4dd4c68d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.568154 4918 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/44aad665-696e-4084-a457-309f4dd4c68d-certs\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.568564 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44aad665-696e-4084-a457-309f4dd4c68d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.568672 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44aad665-696e-4084-a457-309f4dd4c68d-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.568772 4918 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44aad665-696e-4084-a457-309f4dd4c68d-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:10 crc 
kubenswrapper[4918]: I0319 17:01:10.568852 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44aad665-696e-4084-a457-309f4dd4c68d-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.568940 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgzlx\" (UniqueName: \"kubernetes.io/projected/44aad665-696e-4084-a457-309f4dd4c68d-kube-api-access-hgzlx\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.605292 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ab5151a-6a64-47a2-8e0b-47455e4f66b0" path="/var/lib/kubelet/pods/9ab5151a-6a64-47a2-8e0b-47455e4f66b0/volumes" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.632749 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.642370 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.657769 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 17:01:10 crc kubenswrapper[4918]: E0319 17:01:10.658354 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71b991d8-dce5-4482-90c2-b904a5f6eb0e" containerName="dnsmasq-dns" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.658430 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b991d8-dce5-4482-90c2-b904a5f6eb0e" containerName="dnsmasq-dns" Mar 19 17:01:10 crc kubenswrapper[4918]: E0319 17:01:10.658493 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71b991d8-dce5-4482-90c2-b904a5f6eb0e" containerName="init" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.658572 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b991d8-dce5-4482-90c2-b904a5f6eb0e" containerName="init" Mar 19 
17:01:10 crc kubenswrapper[4918]: E0319 17:01:10.658637 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ab5151a-6a64-47a2-8e0b-47455e4f66b0" containerName="neutron-api" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.658693 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ab5151a-6a64-47a2-8e0b-47455e4f66b0" containerName="neutron-api" Mar 19 17:01:10 crc kubenswrapper[4918]: E0319 17:01:10.658755 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44aad665-696e-4084-a457-309f4dd4c68d" containerName="cloudkitty-api" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.658812 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="44aad665-696e-4084-a457-309f4dd4c68d" containerName="cloudkitty-api" Mar 19 17:01:10 crc kubenswrapper[4918]: E0319 17:01:10.658872 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d4f01d1-2728-4858-a39d-6b44e675aca5" containerName="keystone-cron" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.658928 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d4f01d1-2728-4858-a39d-6b44e675aca5" containerName="keystone-cron" Mar 19 17:01:10 crc kubenswrapper[4918]: E0319 17:01:10.660076 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44aad665-696e-4084-a457-309f4dd4c68d" containerName="cloudkitty-api-log" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.660151 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="44aad665-696e-4084-a457-309f4dd4c68d" containerName="cloudkitty-api-log" Mar 19 17:01:10 crc kubenswrapper[4918]: E0319 17:01:10.660219 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ab5151a-6a64-47a2-8e0b-47455e4f66b0" containerName="neutron-httpd" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.660272 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ab5151a-6a64-47a2-8e0b-47455e4f66b0" containerName="neutron-httpd" Mar 19 17:01:10 crc 
kubenswrapper[4918]: I0319 17:01:10.660554 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ab5151a-6a64-47a2-8e0b-47455e4f66b0" containerName="neutron-httpd" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.660622 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="44aad665-696e-4084-a457-309f4dd4c68d" containerName="cloudkitty-api-log" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.660681 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ab5151a-6a64-47a2-8e0b-47455e4f66b0" containerName="neutron-api" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.660746 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d4f01d1-2728-4858-a39d-6b44e675aca5" containerName="keystone-cron" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.660813 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="71b991d8-dce5-4482-90c2-b904a5f6eb0e" containerName="dnsmasq-dns" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.660875 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="44aad665-696e-4084-a457-309f4dd4c68d" containerName="cloudkitty-api" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.661994 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.664418 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.664672 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.667601 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.669488 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.669617 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.669653 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9jjh\" (UniqueName: \"kubernetes.io/projected/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-kube-api-access-x9jjh\") pod \"cloudkitty-api-0\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.669679 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.669708 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.669738 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-scripts\") pod \"cloudkitty-api-0\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.669783 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-logs\") pod \"cloudkitty-api-0\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.669813 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-certs\") pod \"cloudkitty-api-0\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.669836 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-config-data\") pod \"cloudkitty-api-0\" (UID: 
\"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.675502 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.772899 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.772969 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.773010 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-scripts\") pod \"cloudkitty-api-0\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.773046 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-logs\") pod \"cloudkitty-api-0\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.773071 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-certs\") pod \"cloudkitty-api-0\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " 
pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.773100 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-config-data\") pod \"cloudkitty-api-0\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.773185 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.773214 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.773248 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9jjh\" (UniqueName: \"kubernetes.io/projected/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-kube-api-access-x9jjh\") pod \"cloudkitty-api-0\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.774092 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-logs\") pod \"cloudkitty-api-0\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.778033 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-config-data\") pod \"cloudkitty-api-0\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.781679 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.781868 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.784277 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.785092 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.785476 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-certs\") pod \"cloudkitty-api-0\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc 
kubenswrapper[4918]: I0319 17:01:10.787045 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-scripts\") pod \"cloudkitty-api-0\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:10 crc kubenswrapper[4918]: I0319 17:01:10.796047 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9jjh\" (UniqueName: \"kubernetes.io/projected/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-kube-api-access-x9jjh\") pod \"cloudkitty-api-0\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " pod="openstack/cloudkitty-api-0" Mar 19 17:01:11 crc kubenswrapper[4918]: I0319 17:01:11.010361 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 19 17:01:11 crc kubenswrapper[4918]: I0319 17:01:11.685707 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 17:01:12 crc kubenswrapper[4918]: I0319 17:01:12.492205 4918 generic.go:334] "Generic (PLEG): container finished" podID="23bad2c2-b869-49db-9a5b-9dc2ad887973" containerID="371703b226ab0c29770c6348a64cbb1a277d7e09ef5ff2c9833674f0424c4554" exitCode=0 Mar 19 17:01:12 crc kubenswrapper[4918]: I0319 17:01:12.492521 4918 generic.go:334] "Generic (PLEG): container finished" podID="23bad2c2-b869-49db-9a5b-9dc2ad887973" containerID="dc38e09dba0d954bea7077363b80b4145236cf72634a16a252e6cd11e21b8a57" exitCode=0 Mar 19 17:01:12 crc kubenswrapper[4918]: I0319 17:01:12.492266 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"23bad2c2-b869-49db-9a5b-9dc2ad887973","Type":"ContainerDied","Data":"371703b226ab0c29770c6348a64cbb1a277d7e09ef5ff2c9833674f0424c4554"} Mar 19 17:01:12 crc kubenswrapper[4918]: I0319 17:01:12.492634 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"23bad2c2-b869-49db-9a5b-9dc2ad887973","Type":"ContainerDied","Data":"dc38e09dba0d954bea7077363b80b4145236cf72634a16a252e6cd11e21b8a57"} Mar 19 17:01:12 crc kubenswrapper[4918]: I0319 17:01:12.510373 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f","Type":"ContainerStarted","Data":"aaeca65a1801fb6477b370ed8514a87bf222d85460413a9055d3fa78c502c3d2"} Mar 19 17:01:12 crc kubenswrapper[4918]: I0319 17:01:12.510415 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f","Type":"ContainerStarted","Data":"e5e4aff2621423e71599e294893a770435002224f4a12d3fa4f54e0a577eb764"} Mar 19 17:01:12 crc kubenswrapper[4918]: I0319 17:01:12.646328 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44aad665-696e-4084-a457-309f4dd4c68d" path="/var/lib/kubelet/pods/44aad665-696e-4084-a457-309f4dd4c68d/volumes" Mar 19 17:01:12 crc kubenswrapper[4918]: I0319 17:01:12.673047 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 17:01:12 crc kubenswrapper[4918]: I0319 17:01:12.757203 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8prb\" (UniqueName: \"kubernetes.io/projected/23bad2c2-b869-49db-9a5b-9dc2ad887973-kube-api-access-m8prb\") pod \"23bad2c2-b869-49db-9a5b-9dc2ad887973\" (UID: \"23bad2c2-b869-49db-9a5b-9dc2ad887973\") " Mar 19 17:01:12 crc kubenswrapper[4918]: I0319 17:01:12.758440 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23bad2c2-b869-49db-9a5b-9dc2ad887973-scripts\") pod \"23bad2c2-b869-49db-9a5b-9dc2ad887973\" (UID: \"23bad2c2-b869-49db-9a5b-9dc2ad887973\") " Mar 19 17:01:12 crc kubenswrapper[4918]: I0319 17:01:12.758603 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23bad2c2-b869-49db-9a5b-9dc2ad887973-config-data\") pod \"23bad2c2-b869-49db-9a5b-9dc2ad887973\" (UID: \"23bad2c2-b869-49db-9a5b-9dc2ad887973\") " Mar 19 17:01:12 crc kubenswrapper[4918]: I0319 17:01:12.758937 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23bad2c2-b869-49db-9a5b-9dc2ad887973-combined-ca-bundle\") pod \"23bad2c2-b869-49db-9a5b-9dc2ad887973\" (UID: \"23bad2c2-b869-49db-9a5b-9dc2ad887973\") " Mar 19 17:01:12 crc kubenswrapper[4918]: I0319 17:01:12.759048 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23bad2c2-b869-49db-9a5b-9dc2ad887973-config-data-custom\") pod \"23bad2c2-b869-49db-9a5b-9dc2ad887973\" (UID: \"23bad2c2-b869-49db-9a5b-9dc2ad887973\") " Mar 19 17:01:12 crc kubenswrapper[4918]: I0319 17:01:12.759127 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/23bad2c2-b869-49db-9a5b-9dc2ad887973-etc-machine-id\") pod \"23bad2c2-b869-49db-9a5b-9dc2ad887973\" (UID: \"23bad2c2-b869-49db-9a5b-9dc2ad887973\") " Mar 19 17:01:12 crc kubenswrapper[4918]: I0319 17:01:12.759981 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23bad2c2-b869-49db-9a5b-9dc2ad887973-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "23bad2c2-b869-49db-9a5b-9dc2ad887973" (UID: "23bad2c2-b869-49db-9a5b-9dc2ad887973"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 17:01:12 crc kubenswrapper[4918]: I0319 17:01:12.786825 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23bad2c2-b869-49db-9a5b-9dc2ad887973-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "23bad2c2-b869-49db-9a5b-9dc2ad887973" (UID: "23bad2c2-b869-49db-9a5b-9dc2ad887973"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:12 crc kubenswrapper[4918]: I0319 17:01:12.790676 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23bad2c2-b869-49db-9a5b-9dc2ad887973-scripts" (OuterVolumeSpecName: "scripts") pod "23bad2c2-b869-49db-9a5b-9dc2ad887973" (UID: "23bad2c2-b869-49db-9a5b-9dc2ad887973"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:12 crc kubenswrapper[4918]: I0319 17:01:12.794908 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23bad2c2-b869-49db-9a5b-9dc2ad887973-kube-api-access-m8prb" (OuterVolumeSpecName: "kube-api-access-m8prb") pod "23bad2c2-b869-49db-9a5b-9dc2ad887973" (UID: "23bad2c2-b869-49db-9a5b-9dc2ad887973"). InnerVolumeSpecName "kube-api-access-m8prb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:01:12 crc kubenswrapper[4918]: I0319 17:01:12.854557 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23bad2c2-b869-49db-9a5b-9dc2ad887973-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23bad2c2-b869-49db-9a5b-9dc2ad887973" (UID: "23bad2c2-b869-49db-9a5b-9dc2ad887973"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:12 crc kubenswrapper[4918]: I0319 17:01:12.862120 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23bad2c2-b869-49db-9a5b-9dc2ad887973-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:12 crc kubenswrapper[4918]: I0319 17:01:12.862155 4918 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23bad2c2-b869-49db-9a5b-9dc2ad887973-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:12 crc kubenswrapper[4918]: I0319 17:01:12.862166 4918 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23bad2c2-b869-49db-9a5b-9dc2ad887973-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:12 crc kubenswrapper[4918]: I0319 17:01:12.862180 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8prb\" (UniqueName: \"kubernetes.io/projected/23bad2c2-b869-49db-9a5b-9dc2ad887973-kube-api-access-m8prb\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:12 crc kubenswrapper[4918]: I0319 17:01:12.862197 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23bad2c2-b869-49db-9a5b-9dc2ad887973-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:12 crc kubenswrapper[4918]: I0319 17:01:12.961947 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/23bad2c2-b869-49db-9a5b-9dc2ad887973-config-data" (OuterVolumeSpecName: "config-data") pod "23bad2c2-b869-49db-9a5b-9dc2ad887973" (UID: "23bad2c2-b869-49db-9a5b-9dc2ad887973"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:12 crc kubenswrapper[4918]: I0319 17:01:12.965337 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23bad2c2-b869-49db-9a5b-9dc2ad887973-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.150212 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.347785 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67bdc55879-l85q9" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.423951 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-frrgq"] Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.424334 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-frrgq" podUID="46b337a2-f3eb-48c0-8e66-bd5ce8bc4927" containerName="dnsmasq-dns" containerID="cri-o://50805efe2ce135dd9eb04f030d96ca6fb2832afc302ba9ce91defb099798ca5f" gracePeriod=10 Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.541102 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"23bad2c2-b869-49db-9a5b-9dc2ad887973","Type":"ContainerDied","Data":"0ae331a3b7e514c3aee96909b2803ecc7e3bcd75ad542b444b37c5559f0a4497"} Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.541632 4918 scope.go:117] "RemoveContainer" containerID="371703b226ab0c29770c6348a64cbb1a277d7e09ef5ff2c9833674f0424c4554" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.541663 4918 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.543550 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.570915 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f","Type":"ContainerStarted","Data":"3fe92d05181a2da3a3dab7268975c01c3cfb08b09628954b39e89f2e0631ba2a"} Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.571994 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.606126 4918 scope.go:117] "RemoveContainer" containerID="dc38e09dba0d954bea7077363b80b4145236cf72634a16a252e6cd11e21b8a57" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.639608 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=3.639572715 podStartE2EDuration="3.639572715s" podCreationTimestamp="2026-03-19 17:01:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:01:13.635402261 +0000 UTC m=+1285.757601509" watchObservedRunningTime="2026-03-19 17:01:13.639572715 +0000 UTC m=+1285.761771963" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.701585 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.729830 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.746655 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 17:01:13 crc kubenswrapper[4918]: E0319 
17:01:13.747202 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23bad2c2-b869-49db-9a5b-9dc2ad887973" containerName="probe" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.747267 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="23bad2c2-b869-49db-9a5b-9dc2ad887973" containerName="probe" Mar 19 17:01:13 crc kubenswrapper[4918]: E0319 17:01:13.747340 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23bad2c2-b869-49db-9a5b-9dc2ad887973" containerName="cinder-scheduler" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.747401 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="23bad2c2-b869-49db-9a5b-9dc2ad887973" containerName="cinder-scheduler" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.747661 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="23bad2c2-b869-49db-9a5b-9dc2ad887973" containerName="probe" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.747722 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="23bad2c2-b869-49db-9a5b-9dc2ad887973" containerName="cinder-scheduler" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.753946 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.774267 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.774466 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.813396 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqzzg\" (UniqueName: \"kubernetes.io/projected/c7303214-658d-4763-a3d6-cffd5025d9d4-kube-api-access-bqzzg\") pod \"cinder-scheduler-0\" (UID: \"c7303214-658d-4763-a3d6-cffd5025d9d4\") " pod="openstack/cinder-scheduler-0" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.813549 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7303214-658d-4763-a3d6-cffd5025d9d4-scripts\") pod \"cinder-scheduler-0\" (UID: \"c7303214-658d-4763-a3d6-cffd5025d9d4\") " pod="openstack/cinder-scheduler-0" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.813666 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7303214-658d-4763-a3d6-cffd5025d9d4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c7303214-658d-4763-a3d6-cffd5025d9d4\") " pod="openstack/cinder-scheduler-0" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.813738 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7303214-658d-4763-a3d6-cffd5025d9d4-config-data\") pod \"cinder-scheduler-0\" (UID: \"c7303214-658d-4763-a3d6-cffd5025d9d4\") " pod="openstack/cinder-scheduler-0" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.813816 4918 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7303214-658d-4763-a3d6-cffd5025d9d4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c7303214-658d-4763-a3d6-cffd5025d9d4\") " pod="openstack/cinder-scheduler-0" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.813885 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7303214-658d-4763-a3d6-cffd5025d9d4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c7303214-658d-4763-a3d6-cffd5025d9d4\") " pod="openstack/cinder-scheduler-0" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.872649 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-76f5474f44-brjsr"] Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.874770 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-76f5474f44-brjsr" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.904735 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-76f5474f44-brjsr"] Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.915375 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7303214-658d-4763-a3d6-cffd5025d9d4-config-data\") pod \"cinder-scheduler-0\" (UID: \"c7303214-658d-4763-a3d6-cffd5025d9d4\") " pod="openstack/cinder-scheduler-0" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.915440 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7303214-658d-4763-a3d6-cffd5025d9d4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c7303214-658d-4763-a3d6-cffd5025d9d4\") " pod="openstack/cinder-scheduler-0" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 
17:01:13.915481 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a5f53b-d94a-405f-9ead-9d519f30a3dc-config-data\") pod \"placement-76f5474f44-brjsr\" (UID: \"79a5f53b-d94a-405f-9ead-9d519f30a3dc\") " pod="openstack/placement-76f5474f44-brjsr" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.915544 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79a5f53b-d94a-405f-9ead-9d519f30a3dc-combined-ca-bundle\") pod \"placement-76f5474f44-brjsr\" (UID: \"79a5f53b-d94a-405f-9ead-9d519f30a3dc\") " pod="openstack/placement-76f5474f44-brjsr" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.915572 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7303214-658d-4763-a3d6-cffd5025d9d4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c7303214-658d-4763-a3d6-cffd5025d9d4\") " pod="openstack/cinder-scheduler-0" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.915604 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79a5f53b-d94a-405f-9ead-9d519f30a3dc-public-tls-certs\") pod \"placement-76f5474f44-brjsr\" (UID: \"79a5f53b-d94a-405f-9ead-9d519f30a3dc\") " pod="openstack/placement-76f5474f44-brjsr" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.915651 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tppbb\" (UniqueName: \"kubernetes.io/projected/79a5f53b-d94a-405f-9ead-9d519f30a3dc-kube-api-access-tppbb\") pod \"placement-76f5474f44-brjsr\" (UID: \"79a5f53b-d94a-405f-9ead-9d519f30a3dc\") " pod="openstack/placement-76f5474f44-brjsr" Mar 19 17:01:13 crc kubenswrapper[4918]: 
I0319 17:01:13.915677 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79a5f53b-d94a-405f-9ead-9d519f30a3dc-logs\") pod \"placement-76f5474f44-brjsr\" (UID: \"79a5f53b-d94a-405f-9ead-9d519f30a3dc\") " pod="openstack/placement-76f5474f44-brjsr" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.915698 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79a5f53b-d94a-405f-9ead-9d519f30a3dc-scripts\") pod \"placement-76f5474f44-brjsr\" (UID: \"79a5f53b-d94a-405f-9ead-9d519f30a3dc\") " pod="openstack/placement-76f5474f44-brjsr" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.915731 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqzzg\" (UniqueName: \"kubernetes.io/projected/c7303214-658d-4763-a3d6-cffd5025d9d4-kube-api-access-bqzzg\") pod \"cinder-scheduler-0\" (UID: \"c7303214-658d-4763-a3d6-cffd5025d9d4\") " pod="openstack/cinder-scheduler-0" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.915788 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79a5f53b-d94a-405f-9ead-9d519f30a3dc-internal-tls-certs\") pod \"placement-76f5474f44-brjsr\" (UID: \"79a5f53b-d94a-405f-9ead-9d519f30a3dc\") " pod="openstack/placement-76f5474f44-brjsr" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.915825 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7303214-658d-4763-a3d6-cffd5025d9d4-scripts\") pod \"cinder-scheduler-0\" (UID: \"c7303214-658d-4763-a3d6-cffd5025d9d4\") " pod="openstack/cinder-scheduler-0" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.915903 4918 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7303214-658d-4763-a3d6-cffd5025d9d4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c7303214-658d-4763-a3d6-cffd5025d9d4\") " pod="openstack/cinder-scheduler-0" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.916412 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7303214-658d-4763-a3d6-cffd5025d9d4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c7303214-658d-4763-a3d6-cffd5025d9d4\") " pod="openstack/cinder-scheduler-0" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.947075 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqzzg\" (UniqueName: \"kubernetes.io/projected/c7303214-658d-4763-a3d6-cffd5025d9d4-kube-api-access-bqzzg\") pod \"cinder-scheduler-0\" (UID: \"c7303214-658d-4763-a3d6-cffd5025d9d4\") " pod="openstack/cinder-scheduler-0" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.964718 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7303214-658d-4763-a3d6-cffd5025d9d4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c7303214-658d-4763-a3d6-cffd5025d9d4\") " pod="openstack/cinder-scheduler-0" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.965204 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7303214-658d-4763-a3d6-cffd5025d9d4-scripts\") pod \"cinder-scheduler-0\" (UID: \"c7303214-658d-4763-a3d6-cffd5025d9d4\") " pod="openstack/cinder-scheduler-0" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.966033 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7303214-658d-4763-a3d6-cffd5025d9d4-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"c7303214-658d-4763-a3d6-cffd5025d9d4\") " pod="openstack/cinder-scheduler-0" Mar 19 17:01:13 crc kubenswrapper[4918]: I0319 17:01:13.979058 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7303214-658d-4763-a3d6-cffd5025d9d4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c7303214-658d-4763-a3d6-cffd5025d9d4\") " pod="openstack/cinder-scheduler-0" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.023168 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tppbb\" (UniqueName: \"kubernetes.io/projected/79a5f53b-d94a-405f-9ead-9d519f30a3dc-kube-api-access-tppbb\") pod \"placement-76f5474f44-brjsr\" (UID: \"79a5f53b-d94a-405f-9ead-9d519f30a3dc\") " pod="openstack/placement-76f5474f44-brjsr" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.023217 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79a5f53b-d94a-405f-9ead-9d519f30a3dc-logs\") pod \"placement-76f5474f44-brjsr\" (UID: \"79a5f53b-d94a-405f-9ead-9d519f30a3dc\") " pod="openstack/placement-76f5474f44-brjsr" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.023244 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79a5f53b-d94a-405f-9ead-9d519f30a3dc-scripts\") pod \"placement-76f5474f44-brjsr\" (UID: \"79a5f53b-d94a-405f-9ead-9d519f30a3dc\") " pod="openstack/placement-76f5474f44-brjsr" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.023313 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79a5f53b-d94a-405f-9ead-9d519f30a3dc-internal-tls-certs\") pod \"placement-76f5474f44-brjsr\" (UID: \"79a5f53b-d94a-405f-9ead-9d519f30a3dc\") " pod="openstack/placement-76f5474f44-brjsr" Mar 19 17:01:14 crc 
kubenswrapper[4918]: I0319 17:01:14.023447 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a5f53b-d94a-405f-9ead-9d519f30a3dc-config-data\") pod \"placement-76f5474f44-brjsr\" (UID: \"79a5f53b-d94a-405f-9ead-9d519f30a3dc\") " pod="openstack/placement-76f5474f44-brjsr" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.023487 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79a5f53b-d94a-405f-9ead-9d519f30a3dc-combined-ca-bundle\") pod \"placement-76f5474f44-brjsr\" (UID: \"79a5f53b-d94a-405f-9ead-9d519f30a3dc\") " pod="openstack/placement-76f5474f44-brjsr" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.023538 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79a5f53b-d94a-405f-9ead-9d519f30a3dc-public-tls-certs\") pod \"placement-76f5474f44-brjsr\" (UID: \"79a5f53b-d94a-405f-9ead-9d519f30a3dc\") " pod="openstack/placement-76f5474f44-brjsr" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.036315 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79a5f53b-d94a-405f-9ead-9d519f30a3dc-logs\") pod \"placement-76f5474f44-brjsr\" (UID: \"79a5f53b-d94a-405f-9ead-9d519f30a3dc\") " pod="openstack/placement-76f5474f44-brjsr" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.037140 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79a5f53b-d94a-405f-9ead-9d519f30a3dc-public-tls-certs\") pod \"placement-76f5474f44-brjsr\" (UID: \"79a5f53b-d94a-405f-9ead-9d519f30a3dc\") " pod="openstack/placement-76f5474f44-brjsr" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.040412 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79a5f53b-d94a-405f-9ead-9d519f30a3dc-internal-tls-certs\") pod \"placement-76f5474f44-brjsr\" (UID: \"79a5f53b-d94a-405f-9ead-9d519f30a3dc\") " pod="openstack/placement-76f5474f44-brjsr" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.048379 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79a5f53b-d94a-405f-9ead-9d519f30a3dc-combined-ca-bundle\") pod \"placement-76f5474f44-brjsr\" (UID: \"79a5f53b-d94a-405f-9ead-9d519f30a3dc\") " pod="openstack/placement-76f5474f44-brjsr" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.049143 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a5f53b-d94a-405f-9ead-9d519f30a3dc-config-data\") pod \"placement-76f5474f44-brjsr\" (UID: \"79a5f53b-d94a-405f-9ead-9d519f30a3dc\") " pod="openstack/placement-76f5474f44-brjsr" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.050779 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79a5f53b-d94a-405f-9ead-9d519f30a3dc-scripts\") pod \"placement-76f5474f44-brjsr\" (UID: \"79a5f53b-d94a-405f-9ead-9d519f30a3dc\") " pod="openstack/placement-76f5474f44-brjsr" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.055805 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tppbb\" (UniqueName: \"kubernetes.io/projected/79a5f53b-d94a-405f-9ead-9d519f30a3dc-kube-api-access-tppbb\") pod \"placement-76f5474f44-brjsr\" (UID: \"79a5f53b-d94a-405f-9ead-9d519f30a3dc\") " pod="openstack/placement-76f5474f44-brjsr" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.130766 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.223158 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-76f5474f44-brjsr" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.525966 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-frrgq" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.606771 4918 generic.go:334] "Generic (PLEG): container finished" podID="46b337a2-f3eb-48c0-8e66-bd5ce8bc4927" containerID="50805efe2ce135dd9eb04f030d96ca6fb2832afc302ba9ce91defb099798ca5f" exitCode=0 Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.606941 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23bad2c2-b869-49db-9a5b-9dc2ad887973" path="/var/lib/kubelet/pods/23bad2c2-b869-49db-9a5b-9dc2ad887973/volumes" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.607912 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-frrgq" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.611143 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-frrgq" event={"ID":"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927","Type":"ContainerDied","Data":"50805efe2ce135dd9eb04f030d96ca6fb2832afc302ba9ce91defb099798ca5f"} Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.611184 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-frrgq" event={"ID":"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927","Type":"ContainerDied","Data":"e195ebdc40227373c433e88d8a467c0b606ef63cf17581b323f84fad4aebd42b"} Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.611205 4918 scope.go:117] "RemoveContainer" containerID="50805efe2ce135dd9eb04f030d96ca6fb2832afc302ba9ce91defb099798ca5f" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.658096 4918 scope.go:117] "RemoveContainer" containerID="04dca1c2f0729c514c69aa2cc8c24d4ed9a77f2d56bbeec392ed324b8420f084" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.666011 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-ovsdbserver-nb\") pod \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\" (UID: \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\") " Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.666074 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7dbl\" (UniqueName: \"kubernetes.io/projected/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-kube-api-access-s7dbl\") pod \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\" (UID: \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\") " Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.666143 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-dns-svc\") pod \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\" (UID: \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\") " Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.666269 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-dns-swift-storage-0\") pod \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\" (UID: \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\") " Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.666296 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-config\") pod \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\" (UID: \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\") " Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.666433 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-ovsdbserver-sb\") pod \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\" (UID: \"46b337a2-f3eb-48c0-8e66-bd5ce8bc4927\") " Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.695142 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-kube-api-access-s7dbl" (OuterVolumeSpecName: "kube-api-access-s7dbl") pod "46b337a2-f3eb-48c0-8e66-bd5ce8bc4927" (UID: "46b337a2-f3eb-48c0-8e66-bd5ce8bc4927"). InnerVolumeSpecName "kube-api-access-s7dbl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.765689 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "46b337a2-f3eb-48c0-8e66-bd5ce8bc4927" (UID: "46b337a2-f3eb-48c0-8e66-bd5ce8bc4927"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.769910 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7dbl\" (UniqueName: \"kubernetes.io/projected/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-kube-api-access-s7dbl\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.769950 4918 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.789730 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "46b337a2-f3eb-48c0-8e66-bd5ce8bc4927" (UID: "46b337a2-f3eb-48c0-8e66-bd5ce8bc4927"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.796034 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-config" (OuterVolumeSpecName: "config") pod "46b337a2-f3eb-48c0-8e66-bd5ce8bc4927" (UID: "46b337a2-f3eb-48c0-8e66-bd5ce8bc4927"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.818220 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "46b337a2-f3eb-48c0-8e66-bd5ce8bc4927" (UID: "46b337a2-f3eb-48c0-8e66-bd5ce8bc4927"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.839979 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "46b337a2-f3eb-48c0-8e66-bd5ce8bc4927" (UID: "46b337a2-f3eb-48c0-8e66-bd5ce8bc4927"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.872486 4918 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.872513 4918 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.872536 4918 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:14 crc kubenswrapper[4918]: I0319 17:01:14.872546 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927-config\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:15 crc 
kubenswrapper[4918]: I0319 17:01:15.003592 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-frrgq"] Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.006330 4918 scope.go:117] "RemoveContainer" containerID="50805efe2ce135dd9eb04f030d96ca6fb2832afc302ba9ce91defb099798ca5f" Mar 19 17:01:15 crc kubenswrapper[4918]: E0319 17:01:15.009915 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50805efe2ce135dd9eb04f030d96ca6fb2832afc302ba9ce91defb099798ca5f\": container with ID starting with 50805efe2ce135dd9eb04f030d96ca6fb2832afc302ba9ce91defb099798ca5f not found: ID does not exist" containerID="50805efe2ce135dd9eb04f030d96ca6fb2832afc302ba9ce91defb099798ca5f" Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.009977 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50805efe2ce135dd9eb04f030d96ca6fb2832afc302ba9ce91defb099798ca5f"} err="failed to get container status \"50805efe2ce135dd9eb04f030d96ca6fb2832afc302ba9ce91defb099798ca5f\": rpc error: code = NotFound desc = could not find container \"50805efe2ce135dd9eb04f030d96ca6fb2832afc302ba9ce91defb099798ca5f\": container with ID starting with 50805efe2ce135dd9eb04f030d96ca6fb2832afc302ba9ce91defb099798ca5f not found: ID does not exist" Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.010000 4918 scope.go:117] "RemoveContainer" containerID="04dca1c2f0729c514c69aa2cc8c24d4ed9a77f2d56bbeec392ed324b8420f084" Mar 19 17:01:15 crc kubenswrapper[4918]: E0319 17:01:15.012902 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04dca1c2f0729c514c69aa2cc8c24d4ed9a77f2d56bbeec392ed324b8420f084\": container with ID starting with 04dca1c2f0729c514c69aa2cc8c24d4ed9a77f2d56bbeec392ed324b8420f084 not found: ID does not exist" 
containerID="04dca1c2f0729c514c69aa2cc8c24d4ed9a77f2d56bbeec392ed324b8420f084" Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.012936 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04dca1c2f0729c514c69aa2cc8c24d4ed9a77f2d56bbeec392ed324b8420f084"} err="failed to get container status \"04dca1c2f0729c514c69aa2cc8c24d4ed9a77f2d56bbeec392ed324b8420f084\": rpc error: code = NotFound desc = could not find container \"04dca1c2f0729c514c69aa2cc8c24d4ed9a77f2d56bbeec392ed324b8420f084\": container with ID starting with 04dca1c2f0729c514c69aa2cc8c24d4ed9a77f2d56bbeec392ed324b8420f084 not found: ID does not exist" Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.017630 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-frrgq"] Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.127621 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-678fb97f86-hlhbk" Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.169468 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-76f5474f44-brjsr"] Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.176727 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="178a9ae2-1774-4025-8951-93167e95f5d7" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.186:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.182633 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.455843 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-568c4fd78c-t5k2q" Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.593885 4918 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/barbican-api-678fb97f86-hlhbk" Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.654899 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.678807 4918 generic.go:334] "Generic (PLEG): container finished" podID="8de72dd4-40e4-4d8e-b820-383e3c8e3734" containerID="9d862b61eaac5d1b760bbd841ff4e44cc492f12b65475b983dbb54405e23f7d2" exitCode=0 Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.678903 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"8de72dd4-40e4-4d8e-b820-383e3c8e3734","Type":"ContainerDied","Data":"9d862b61eaac5d1b760bbd841ff4e44cc492f12b65475b983dbb54405e23f7d2"} Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.678930 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"8de72dd4-40e4-4d8e-b820-383e3c8e3734","Type":"ContainerDied","Data":"a877a24105b8c7861df2ef57719149f3bc4b21c943b05e1abc05be1e48cd29c0"} Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.678947 4918 scope.go:117] "RemoveContainer" containerID="9d862b61eaac5d1b760bbd841ff4e44cc492f12b65475b983dbb54405e23f7d2" Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.679068 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-846b889554-z7r6b"] Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.679277 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-846b889554-z7r6b" podUID="bf259f2d-395e-4d36-bdc0-2c01310e24e8" containerName="barbican-api-log" containerID="cri-o://009d1b9ceb70a06a2d6f926079d629eff34e33b0ac25dc5002da39fc9caaca7a" gracePeriod=30 Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.679564 4918 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/barbican-api-846b889554-z7r6b" podUID="bf259f2d-395e-4d36-bdc0-2c01310e24e8" containerName="barbican-api" containerID="cri-o://f01037dad64377a391a2f4b86658f3f2bc0c282891b2d41251b93fab88d22206" gracePeriod=30 Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.715908 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76f5474f44-brjsr" event={"ID":"79a5f53b-d94a-405f-9ead-9d519f30a3dc","Type":"ContainerStarted","Data":"72bdedda1b5bbe2b0001eb0e0ce5c0e8a5ca0a36832825ba21ee1b5ebcd64bc1"} Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.715953 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76f5474f44-brjsr" event={"ID":"79a5f53b-d94a-405f-9ead-9d519f30a3dc","Type":"ContainerStarted","Data":"ef1d0ee8bf7b9a33c99a758673824e30b1966f2a7fdac7e4c48af426bcda4fff"} Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.716563 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdv88\" (UniqueName: \"kubernetes.io/projected/8de72dd4-40e4-4d8e-b820-383e3c8e3734-kube-api-access-tdv88\") pod \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\" (UID: \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\") " Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.716650 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de72dd4-40e4-4d8e-b820-383e3c8e3734-config-data\") pod \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\" (UID: \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\") " Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.716688 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8de72dd4-40e4-4d8e-b820-383e3c8e3734-scripts\") pod \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\" (UID: \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\") " Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.716783 4918 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8de72dd4-40e4-4d8e-b820-383e3c8e3734-config-data-custom\") pod \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\" (UID: \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\") " Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.716870 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8de72dd4-40e4-4d8e-b820-383e3c8e3734-certs\") pod \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\" (UID: \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\") " Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.716896 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de72dd4-40e4-4d8e-b820-383e3c8e3734-combined-ca-bundle\") pod \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\" (UID: \"8de72dd4-40e4-4d8e-b820-383e3c8e3734\") " Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.733637 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de72dd4-40e4-4d8e-b820-383e3c8e3734-scripts" (OuterVolumeSpecName: "scripts") pod "8de72dd4-40e4-4d8e-b820-383e3c8e3734" (UID: "8de72dd4-40e4-4d8e-b820-383e3c8e3734"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.736658 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8de72dd4-40e4-4d8e-b820-383e3c8e3734-certs" (OuterVolumeSpecName: "certs") pod "8de72dd4-40e4-4d8e-b820-383e3c8e3734" (UID: "8de72dd4-40e4-4d8e-b820-383e3c8e3734"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.738951 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8de72dd4-40e4-4d8e-b820-383e3c8e3734-kube-api-access-tdv88" (OuterVolumeSpecName: "kube-api-access-tdv88") pod "8de72dd4-40e4-4d8e-b820-383e3c8e3734" (UID: "8de72dd4-40e4-4d8e-b820-383e3c8e3734"). InnerVolumeSpecName "kube-api-access-tdv88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.740929 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de72dd4-40e4-4d8e-b820-383e3c8e3734-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8de72dd4-40e4-4d8e-b820-383e3c8e3734" (UID: "8de72dd4-40e4-4d8e-b820-383e3c8e3734"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.753804 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c7303214-658d-4763-a3d6-cffd5025d9d4","Type":"ContainerStarted","Data":"ac5f8885006ad8f9da29d2080251a59db9c851c118149fc4db408e711fd6bceb"} Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.819236 4918 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8de72dd4-40e4-4d8e-b820-383e3c8e3734-certs\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.819268 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdv88\" (UniqueName: \"kubernetes.io/projected/8de72dd4-40e4-4d8e-b820-383e3c8e3734-kube-api-access-tdv88\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.819280 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8de72dd4-40e4-4d8e-b820-383e3c8e3734-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.819289 4918 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8de72dd4-40e4-4d8e-b820-383e3c8e3734-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.845563 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de72dd4-40e4-4d8e-b820-383e3c8e3734-config-data" (OuterVolumeSpecName: "config-data") pod "8de72dd4-40e4-4d8e-b820-383e3c8e3734" (UID: "8de72dd4-40e4-4d8e-b820-383e3c8e3734"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.854192 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de72dd4-40e4-4d8e-b820-383e3c8e3734-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8de72dd4-40e4-4d8e-b820-383e3c8e3734" (UID: "8de72dd4-40e4-4d8e-b820-383e3c8e3734"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.920804 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de72dd4-40e4-4d8e-b820-383e3c8e3734-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.920838 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de72dd4-40e4-4d8e-b820-383e3c8e3734-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.954416 4918 scope.go:117] "RemoveContainer" containerID="9d862b61eaac5d1b760bbd841ff4e44cc492f12b65475b983dbb54405e23f7d2" Mar 19 17:01:15 crc kubenswrapper[4918]: E0319 17:01:15.955129 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d862b61eaac5d1b760bbd841ff4e44cc492f12b65475b983dbb54405e23f7d2\": container with ID starting with 9d862b61eaac5d1b760bbd841ff4e44cc492f12b65475b983dbb54405e23f7d2 not found: ID does not exist" containerID="9d862b61eaac5d1b760bbd841ff4e44cc492f12b65475b983dbb54405e23f7d2" Mar 19 17:01:15 crc kubenswrapper[4918]: I0319 17:01:15.955165 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d862b61eaac5d1b760bbd841ff4e44cc492f12b65475b983dbb54405e23f7d2"} err="failed to get container status \"9d862b61eaac5d1b760bbd841ff4e44cc492f12b65475b983dbb54405e23f7d2\": rpc error: code = NotFound desc = could not find container \"9d862b61eaac5d1b760bbd841ff4e44cc492f12b65475b983dbb54405e23f7d2\": container with ID starting with 9d862b61eaac5d1b760bbd841ff4e44cc492f12b65475b983dbb54405e23f7d2 not found: ID does not exist" Mar 19 17:01:16 crc kubenswrapper[4918]: I0319 17:01:16.600758 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46b337a2-f3eb-48c0-8e66-bd5ce8bc4927" 
path="/var/lib/kubelet/pods/46b337a2-f3eb-48c0-8e66-bd5ce8bc4927/volumes" Mar 19 17:01:16 crc kubenswrapper[4918]: I0319 17:01:16.816804 4918 generic.go:334] "Generic (PLEG): container finished" podID="bf259f2d-395e-4d36-bdc0-2c01310e24e8" containerID="009d1b9ceb70a06a2d6f926079d629eff34e33b0ac25dc5002da39fc9caaca7a" exitCode=143 Mar 19 17:01:16 crc kubenswrapper[4918]: I0319 17:01:16.816864 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-846b889554-z7r6b" event={"ID":"bf259f2d-395e-4d36-bdc0-2c01310e24e8","Type":"ContainerDied","Data":"009d1b9ceb70a06a2d6f926079d629eff34e33b0ac25dc5002da39fc9caaca7a"} Mar 19 17:01:16 crc kubenswrapper[4918]: I0319 17:01:16.820293 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 19 17:01:16 crc kubenswrapper[4918]: I0319 17:01:16.827840 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76f5474f44-brjsr" event={"ID":"79a5f53b-d94a-405f-9ead-9d519f30a3dc","Type":"ContainerStarted","Data":"04ec0e2692dc274983ac9b26d85d5824e4da1827827d88e7d1133892352c39cf"} Mar 19 17:01:16 crc kubenswrapper[4918]: I0319 17:01:16.828598 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-76f5474f44-brjsr" Mar 19 17:01:16 crc kubenswrapper[4918]: I0319 17:01:16.828621 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-76f5474f44-brjsr" Mar 19 17:01:16 crc kubenswrapper[4918]: I0319 17:01:16.829927 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c7303214-658d-4763-a3d6-cffd5025d9d4","Type":"ContainerStarted","Data":"b3f383de72ff5839b650f7519ead60113c7651aea7e28f7371bf76f6cdf8bc63"} Mar 19 17:01:16 crc kubenswrapper[4918]: I0319 17:01:16.844308 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 17:01:16 crc kubenswrapper[4918]: I0319 
17:01:16.855187 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 17:01:16 crc kubenswrapper[4918]: I0319 17:01:16.877437 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-76f5474f44-brjsr" podStartSLOduration=3.877420818 podStartE2EDuration="3.877420818s" podCreationTimestamp="2026-03-19 17:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:01:16.869854871 +0000 UTC m=+1288.992054119" watchObservedRunningTime="2026-03-19 17:01:16.877420818 +0000 UTC m=+1288.999620056" Mar 19 17:01:16 crc kubenswrapper[4918]: I0319 17:01:16.882142 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 17:01:16 crc kubenswrapper[4918]: E0319 17:01:16.882612 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8de72dd4-40e4-4d8e-b820-383e3c8e3734" containerName="cloudkitty-proc" Mar 19 17:01:16 crc kubenswrapper[4918]: I0319 17:01:16.882631 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="8de72dd4-40e4-4d8e-b820-383e3c8e3734" containerName="cloudkitty-proc" Mar 19 17:01:16 crc kubenswrapper[4918]: E0319 17:01:16.882661 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46b337a2-f3eb-48c0-8e66-bd5ce8bc4927" containerName="init" Mar 19 17:01:16 crc kubenswrapper[4918]: I0319 17:01:16.882668 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="46b337a2-f3eb-48c0-8e66-bd5ce8bc4927" containerName="init" Mar 19 17:01:16 crc kubenswrapper[4918]: E0319 17:01:16.882688 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46b337a2-f3eb-48c0-8e66-bd5ce8bc4927" containerName="dnsmasq-dns" Mar 19 17:01:16 crc kubenswrapper[4918]: I0319 17:01:16.882694 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="46b337a2-f3eb-48c0-8e66-bd5ce8bc4927" containerName="dnsmasq-dns" Mar 19 
17:01:16 crc kubenswrapper[4918]: I0319 17:01:16.882856 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="8de72dd4-40e4-4d8e-b820-383e3c8e3734" containerName="cloudkitty-proc" Mar 19 17:01:16 crc kubenswrapper[4918]: I0319 17:01:16.882879 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="46b337a2-f3eb-48c0-8e66-bd5ce8bc4927" containerName="dnsmasq-dns" Mar 19 17:01:16 crc kubenswrapper[4918]: I0319 17:01:16.883567 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 19 17:01:16 crc kubenswrapper[4918]: I0319 17:01:16.890019 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Mar 19 17:01:16 crc kubenswrapper[4918]: I0319 17:01:16.908537 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 17:01:16 crc kubenswrapper[4918]: I0319 17:01:16.947236 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bb4650a-c965-49da-b13a-9e998a165c45-config-data\") pod \"cloudkitty-proc-0\" (UID: \"9bb4650a-c965-49da-b13a-9e998a165c45\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:16 crc kubenswrapper[4918]: I0319 17:01:16.947328 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bb4650a-c965-49da-b13a-9e998a165c45-scripts\") pod \"cloudkitty-proc-0\" (UID: \"9bb4650a-c965-49da-b13a-9e998a165c45\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:16 crc kubenswrapper[4918]: I0319 17:01:16.947364 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9bb4650a-c965-49da-b13a-9e998a165c45-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"9bb4650a-c965-49da-b13a-9e998a165c45\") " 
pod="openstack/cloudkitty-proc-0" Mar 19 17:01:16 crc kubenswrapper[4918]: I0319 17:01:16.947410 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/9bb4650a-c965-49da-b13a-9e998a165c45-certs\") pod \"cloudkitty-proc-0\" (UID: \"9bb4650a-c965-49da-b13a-9e998a165c45\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:16 crc kubenswrapper[4918]: I0319 17:01:16.947441 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bb4650a-c965-49da-b13a-9e998a165c45-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"9bb4650a-c965-49da-b13a-9e998a165c45\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:16 crc kubenswrapper[4918]: I0319 17:01:16.947602 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dc5x\" (UniqueName: \"kubernetes.io/projected/9bb4650a-c965-49da-b13a-9e998a165c45-kube-api-access-2dc5x\") pod \"cloudkitty-proc-0\" (UID: \"9bb4650a-c965-49da-b13a-9e998a165c45\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.052007 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bb4650a-c965-49da-b13a-9e998a165c45-scripts\") pod \"cloudkitty-proc-0\" (UID: \"9bb4650a-c965-49da-b13a-9e998a165c45\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.052058 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9bb4650a-c965-49da-b13a-9e998a165c45-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"9bb4650a-c965-49da-b13a-9e998a165c45\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.052108 4918 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/9bb4650a-c965-49da-b13a-9e998a165c45-certs\") pod \"cloudkitty-proc-0\" (UID: \"9bb4650a-c965-49da-b13a-9e998a165c45\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.052130 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bb4650a-c965-49da-b13a-9e998a165c45-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"9bb4650a-c965-49da-b13a-9e998a165c45\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.052215 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dc5x\" (UniqueName: \"kubernetes.io/projected/9bb4650a-c965-49da-b13a-9e998a165c45-kube-api-access-2dc5x\") pod \"cloudkitty-proc-0\" (UID: \"9bb4650a-c965-49da-b13a-9e998a165c45\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.052239 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bb4650a-c965-49da-b13a-9e998a165c45-config-data\") pod \"cloudkitty-proc-0\" (UID: \"9bb4650a-c965-49da-b13a-9e998a165c45\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.060149 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bb4650a-c965-49da-b13a-9e998a165c45-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"9bb4650a-c965-49da-b13a-9e998a165c45\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.064002 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/9bb4650a-c965-49da-b13a-9e998a165c45-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"9bb4650a-c965-49da-b13a-9e998a165c45\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.064192 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bb4650a-c965-49da-b13a-9e998a165c45-config-data\") pod \"cloudkitty-proc-0\" (UID: \"9bb4650a-c965-49da-b13a-9e998a165c45\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.064555 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/9bb4650a-c965-49da-b13a-9e998a165c45-certs\") pod \"cloudkitty-proc-0\" (UID: \"9bb4650a-c965-49da-b13a-9e998a165c45\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.072603 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dc5x\" (UniqueName: \"kubernetes.io/projected/9bb4650a-c965-49da-b13a-9e998a165c45-kube-api-access-2dc5x\") pod \"cloudkitty-proc-0\" (UID: \"9bb4650a-c965-49da-b13a-9e998a165c45\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.073798 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bb4650a-c965-49da-b13a-9e998a165c45-scripts\") pod \"cloudkitty-proc-0\" (UID: \"9bb4650a-c965-49da-b13a-9e998a165c45\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.247962 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.321518 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.322873 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.325761 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-lv82m" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.326040 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.328694 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.346793 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.359588 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/789c957f-feb3-4f8c-83fa-3524740a2c8d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"789c957f-feb3-4f8c-83fa-3524740a2c8d\") " pod="openstack/openstackclient" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.359650 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/789c957f-feb3-4f8c-83fa-3524740a2c8d-openstack-config-secret\") pod \"openstackclient\" (UID: \"789c957f-feb3-4f8c-83fa-3524740a2c8d\") " pod="openstack/openstackclient" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.359791 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/789c957f-feb3-4f8c-83fa-3524740a2c8d-openstack-config\") pod \"openstackclient\" (UID: \"789c957f-feb3-4f8c-83fa-3524740a2c8d\") " pod="openstack/openstackclient" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.359813 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tkb2\" (UniqueName: \"kubernetes.io/projected/789c957f-feb3-4f8c-83fa-3524740a2c8d-kube-api-access-8tkb2\") pod \"openstackclient\" (UID: \"789c957f-feb3-4f8c-83fa-3524740a2c8d\") " pod="openstack/openstackclient" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.462727 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/789c957f-feb3-4f8c-83fa-3524740a2c8d-openstack-config\") pod \"openstackclient\" (UID: \"789c957f-feb3-4f8c-83fa-3524740a2c8d\") " pod="openstack/openstackclient" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.463095 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tkb2\" (UniqueName: \"kubernetes.io/projected/789c957f-feb3-4f8c-83fa-3524740a2c8d-kube-api-access-8tkb2\") pod \"openstackclient\" (UID: \"789c957f-feb3-4f8c-83fa-3524740a2c8d\") " pod="openstack/openstackclient" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.463141 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/789c957f-feb3-4f8c-83fa-3524740a2c8d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"789c957f-feb3-4f8c-83fa-3524740a2c8d\") " pod="openstack/openstackclient" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.463182 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/789c957f-feb3-4f8c-83fa-3524740a2c8d-openstack-config-secret\") pod \"openstackclient\" (UID: \"789c957f-feb3-4f8c-83fa-3524740a2c8d\") " pod="openstack/openstackclient" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.463959 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/789c957f-feb3-4f8c-83fa-3524740a2c8d-openstack-config\") pod \"openstackclient\" (UID: \"789c957f-feb3-4f8c-83fa-3524740a2c8d\") " pod="openstack/openstackclient" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.472132 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/789c957f-feb3-4f8c-83fa-3524740a2c8d-openstack-config-secret\") pod \"openstackclient\" (UID: \"789c957f-feb3-4f8c-83fa-3524740a2c8d\") " pod="openstack/openstackclient" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.472334 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/789c957f-feb3-4f8c-83fa-3524740a2c8d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"789c957f-feb3-4f8c-83fa-3524740a2c8d\") " pod="openstack/openstackclient" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.484569 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tkb2\" (UniqueName: \"kubernetes.io/projected/789c957f-feb3-4f8c-83fa-3524740a2c8d-kube-api-access-8tkb2\") pod \"openstackclient\" (UID: \"789c957f-feb3-4f8c-83fa-3524740a2c8d\") " pod="openstack/openstackclient" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.707444 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.839114 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 17:01:17 crc kubenswrapper[4918]: W0319 17:01:17.846124 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bb4650a_c965_49da_b13a_9e998a165c45.slice/crio-5c11d4e8950988eb8aa750281f37e0c83d87888f716aaae835ce79d511697c64 WatchSource:0}: Error finding container 5c11d4e8950988eb8aa750281f37e0c83d87888f716aaae835ce79d511697c64: Status 404 returned error can't find the container with id 5c11d4e8950988eb8aa750281f37e0c83d87888f716aaae835ce79d511697c64 Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.893806 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c7303214-658d-4763-a3d6-cffd5025d9d4","Type":"ContainerStarted","Data":"a4caf437e015e51d24596027592de51ef3fd60ea92e60c87bfc03095afef650a"} Mar 19 17:01:17 crc kubenswrapper[4918]: I0319 17:01:17.952508 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.952490297 podStartE2EDuration="4.952490297s" podCreationTimestamp="2026-03-19 17:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:01:17.92705013 +0000 UTC m=+1290.049249378" watchObservedRunningTime="2026-03-19 17:01:17.952490297 +0000 UTC m=+1290.074689545" Mar 19 17:01:18 crc kubenswrapper[4918]: W0319 17:01:18.276116 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod789c957f_feb3_4f8c_83fa_3524740a2c8d.slice/crio-085b35582e21dfcf933a9505177feb0ce12004dea08dc05df692326898ced531 WatchSource:0}: Error finding container 
085b35582e21dfcf933a9505177feb0ce12004dea08dc05df692326898ced531: Status 404 returned error can't find the container with id 085b35582e21dfcf933a9505177feb0ce12004dea08dc05df692326898ced531 Mar 19 17:01:18 crc kubenswrapper[4918]: I0319 17:01:18.293644 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 19 17:01:18 crc kubenswrapper[4918]: I0319 17:01:18.623098 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8de72dd4-40e4-4d8e-b820-383e3c8e3734" path="/var/lib/kubelet/pods/8de72dd4-40e4-4d8e-b820-383e3c8e3734/volumes" Mar 19 17:01:18 crc kubenswrapper[4918]: I0319 17:01:18.903646 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"789c957f-feb3-4f8c-83fa-3524740a2c8d","Type":"ContainerStarted","Data":"085b35582e21dfcf933a9505177feb0ce12004dea08dc05df692326898ced531"} Mar 19 17:01:18 crc kubenswrapper[4918]: I0319 17:01:18.905606 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"9bb4650a-c965-49da-b13a-9e998a165c45","Type":"ContainerStarted","Data":"fd476c48bb042b4ff2ca594afe44e17b85b66ad715939f92688ab8a3bc41e97b"} Mar 19 17:01:18 crc kubenswrapper[4918]: I0319 17:01:18.905632 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"9bb4650a-c965-49da-b13a-9e998a165c45","Type":"ContainerStarted","Data":"5c11d4e8950988eb8aa750281f37e0c83d87888f716aaae835ce79d511697c64"} Mar 19 17:01:18 crc kubenswrapper[4918]: I0319 17:01:18.924481 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.924464891 podStartE2EDuration="2.924464891s" podCreationTimestamp="2026-03-19 17:01:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:01:18.921162341 +0000 UTC m=+1291.043361589" 
watchObservedRunningTime="2026-03-19 17:01:18.924464891 +0000 UTC m=+1291.046664139" Mar 19 17:01:19 crc kubenswrapper[4918]: I0319 17:01:19.131958 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 19 17:01:19 crc kubenswrapper[4918]: I0319 17:01:19.751673 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-846b889554-z7r6b" Mar 19 17:01:19 crc kubenswrapper[4918]: I0319 17:01:19.818247 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf259f2d-395e-4d36-bdc0-2c01310e24e8-combined-ca-bundle\") pod \"bf259f2d-395e-4d36-bdc0-2c01310e24e8\" (UID: \"bf259f2d-395e-4d36-bdc0-2c01310e24e8\") " Mar 19 17:01:19 crc kubenswrapper[4918]: I0319 17:01:19.818506 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf259f2d-395e-4d36-bdc0-2c01310e24e8-config-data\") pod \"bf259f2d-395e-4d36-bdc0-2c01310e24e8\" (UID: \"bf259f2d-395e-4d36-bdc0-2c01310e24e8\") " Mar 19 17:01:19 crc kubenswrapper[4918]: I0319 17:01:19.818614 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf259f2d-395e-4d36-bdc0-2c01310e24e8-config-data-custom\") pod \"bf259f2d-395e-4d36-bdc0-2c01310e24e8\" (UID: \"bf259f2d-395e-4d36-bdc0-2c01310e24e8\") " Mar 19 17:01:19 crc kubenswrapper[4918]: I0319 17:01:19.818660 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf259f2d-395e-4d36-bdc0-2c01310e24e8-logs\") pod \"bf259f2d-395e-4d36-bdc0-2c01310e24e8\" (UID: \"bf259f2d-395e-4d36-bdc0-2c01310e24e8\") " Mar 19 17:01:19 crc kubenswrapper[4918]: I0319 17:01:19.818738 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-lkqq7\" (UniqueName: \"kubernetes.io/projected/bf259f2d-395e-4d36-bdc0-2c01310e24e8-kube-api-access-lkqq7\") pod \"bf259f2d-395e-4d36-bdc0-2c01310e24e8\" (UID: \"bf259f2d-395e-4d36-bdc0-2c01310e24e8\") " Mar 19 17:01:19 crc kubenswrapper[4918]: I0319 17:01:19.826372 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf259f2d-395e-4d36-bdc0-2c01310e24e8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bf259f2d-395e-4d36-bdc0-2c01310e24e8" (UID: "bf259f2d-395e-4d36-bdc0-2c01310e24e8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:19 crc kubenswrapper[4918]: I0319 17:01:19.826924 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf259f2d-395e-4d36-bdc0-2c01310e24e8-logs" (OuterVolumeSpecName: "logs") pod "bf259f2d-395e-4d36-bdc0-2c01310e24e8" (UID: "bf259f2d-395e-4d36-bdc0-2c01310e24e8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:01:19 crc kubenswrapper[4918]: I0319 17:01:19.829987 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf259f2d-395e-4d36-bdc0-2c01310e24e8-kube-api-access-lkqq7" (OuterVolumeSpecName: "kube-api-access-lkqq7") pod "bf259f2d-395e-4d36-bdc0-2c01310e24e8" (UID: "bf259f2d-395e-4d36-bdc0-2c01310e24e8"). InnerVolumeSpecName "kube-api-access-lkqq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:01:19 crc kubenswrapper[4918]: I0319 17:01:19.872732 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf259f2d-395e-4d36-bdc0-2c01310e24e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf259f2d-395e-4d36-bdc0-2c01310e24e8" (UID: "bf259f2d-395e-4d36-bdc0-2c01310e24e8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:19 crc kubenswrapper[4918]: I0319 17:01:19.916417 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf259f2d-395e-4d36-bdc0-2c01310e24e8-config-data" (OuterVolumeSpecName: "config-data") pod "bf259f2d-395e-4d36-bdc0-2c01310e24e8" (UID: "bf259f2d-395e-4d36-bdc0-2c01310e24e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:19 crc kubenswrapper[4918]: I0319 17:01:19.918984 4918 generic.go:334] "Generic (PLEG): container finished" podID="bf259f2d-395e-4d36-bdc0-2c01310e24e8" containerID="f01037dad64377a391a2f4b86658f3f2bc0c282891b2d41251b93fab88d22206" exitCode=0 Mar 19 17:01:19 crc kubenswrapper[4918]: I0319 17:01:19.919058 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-846b889554-z7r6b" event={"ID":"bf259f2d-395e-4d36-bdc0-2c01310e24e8","Type":"ContainerDied","Data":"f01037dad64377a391a2f4b86658f3f2bc0c282891b2d41251b93fab88d22206"} Mar 19 17:01:19 crc kubenswrapper[4918]: I0319 17:01:19.919084 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-846b889554-z7r6b" Mar 19 17:01:19 crc kubenswrapper[4918]: I0319 17:01:19.919107 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-846b889554-z7r6b" event={"ID":"bf259f2d-395e-4d36-bdc0-2c01310e24e8","Type":"ContainerDied","Data":"53a649fe1b19158a0ab503c85e356bb06b06b415d2ad5505b01bbd49b1b164d1"} Mar 19 17:01:19 crc kubenswrapper[4918]: I0319 17:01:19.919130 4918 scope.go:117] "RemoveContainer" containerID="f01037dad64377a391a2f4b86658f3f2bc0c282891b2d41251b93fab88d22206" Mar 19 17:01:19 crc kubenswrapper[4918]: I0319 17:01:19.924499 4918 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf259f2d-395e-4d36-bdc0-2c01310e24e8-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:19 crc kubenswrapper[4918]: I0319 17:01:19.924551 4918 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf259f2d-395e-4d36-bdc0-2c01310e24e8-logs\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:19 crc kubenswrapper[4918]: I0319 17:01:19.924567 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkqq7\" (UniqueName: \"kubernetes.io/projected/bf259f2d-395e-4d36-bdc0-2c01310e24e8-kube-api-access-lkqq7\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:19 crc kubenswrapper[4918]: I0319 17:01:19.924576 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf259f2d-395e-4d36-bdc0-2c01310e24e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:19 crc kubenswrapper[4918]: I0319 17:01:19.924585 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf259f2d-395e-4d36-bdc0-2c01310e24e8-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:20 crc kubenswrapper[4918]: I0319 17:01:20.013316 4918 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/barbican-api-846b889554-z7r6b"] Mar 19 17:01:20 crc kubenswrapper[4918]: I0319 17:01:20.031880 4918 scope.go:117] "RemoveContainer" containerID="009d1b9ceb70a06a2d6f926079d629eff34e33b0ac25dc5002da39fc9caaca7a" Mar 19 17:01:20 crc kubenswrapper[4918]: I0319 17:01:20.035159 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-846b889554-z7r6b"] Mar 19 17:01:20 crc kubenswrapper[4918]: I0319 17:01:20.073939 4918 scope.go:117] "RemoveContainer" containerID="f01037dad64377a391a2f4b86658f3f2bc0c282891b2d41251b93fab88d22206" Mar 19 17:01:20 crc kubenswrapper[4918]: E0319 17:01:20.074652 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f01037dad64377a391a2f4b86658f3f2bc0c282891b2d41251b93fab88d22206\": container with ID starting with f01037dad64377a391a2f4b86658f3f2bc0c282891b2d41251b93fab88d22206 not found: ID does not exist" containerID="f01037dad64377a391a2f4b86658f3f2bc0c282891b2d41251b93fab88d22206" Mar 19 17:01:20 crc kubenswrapper[4918]: I0319 17:01:20.074814 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f01037dad64377a391a2f4b86658f3f2bc0c282891b2d41251b93fab88d22206"} err="failed to get container status \"f01037dad64377a391a2f4b86658f3f2bc0c282891b2d41251b93fab88d22206\": rpc error: code = NotFound desc = could not find container \"f01037dad64377a391a2f4b86658f3f2bc0c282891b2d41251b93fab88d22206\": container with ID starting with f01037dad64377a391a2f4b86658f3f2bc0c282891b2d41251b93fab88d22206 not found: ID does not exist" Mar 19 17:01:20 crc kubenswrapper[4918]: I0319 17:01:20.074922 4918 scope.go:117] "RemoveContainer" containerID="009d1b9ceb70a06a2d6f926079d629eff34e33b0ac25dc5002da39fc9caaca7a" Mar 19 17:01:20 crc kubenswrapper[4918]: E0319 17:01:20.075269 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"009d1b9ceb70a06a2d6f926079d629eff34e33b0ac25dc5002da39fc9caaca7a\": container with ID starting with 009d1b9ceb70a06a2d6f926079d629eff34e33b0ac25dc5002da39fc9caaca7a not found: ID does not exist" containerID="009d1b9ceb70a06a2d6f926079d629eff34e33b0ac25dc5002da39fc9caaca7a" Mar 19 17:01:20 crc kubenswrapper[4918]: I0319 17:01:20.075372 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"009d1b9ceb70a06a2d6f926079d629eff34e33b0ac25dc5002da39fc9caaca7a"} err="failed to get container status \"009d1b9ceb70a06a2d6f926079d629eff34e33b0ac25dc5002da39fc9caaca7a\": rpc error: code = NotFound desc = could not find container \"009d1b9ceb70a06a2d6f926079d629eff34e33b0ac25dc5002da39fc9caaca7a\": container with ID starting with 009d1b9ceb70a06a2d6f926079d629eff34e33b0ac25dc5002da39fc9caaca7a not found: ID does not exist" Mar 19 17:01:20 crc kubenswrapper[4918]: E0319 17:01:20.352336 4918 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf259f2d_395e_4d36_bdc0_2c01310e24e8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf259f2d_395e_4d36_bdc0_2c01310e24e8.slice/crio-53a649fe1b19158a0ab503c85e356bb06b06b415d2ad5505b01bbd49b1b164d1\": RecentStats: unable to find data in memory cache]" Mar 19 17:01:20 crc kubenswrapper[4918]: I0319 17:01:20.600272 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf259f2d-395e-4d36-bdc0-2c01310e24e8" path="/var/lib/kubelet/pods/bf259f2d-395e-4d36-bdc0-2c01310e24e8/volumes" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.382872 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-559c86bbbc-4zd54"] Mar 19 17:01:24 crc kubenswrapper[4918]: E0319 17:01:24.383944 4918 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bf259f2d-395e-4d36-bdc0-2c01310e24e8" containerName="barbican-api-log" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.383958 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf259f2d-395e-4d36-bdc0-2c01310e24e8" containerName="barbican-api-log" Mar 19 17:01:24 crc kubenswrapper[4918]: E0319 17:01:24.383981 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf259f2d-395e-4d36-bdc0-2c01310e24e8" containerName="barbican-api" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.383988 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf259f2d-395e-4d36-bdc0-2c01310e24e8" containerName="barbican-api" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.384347 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf259f2d-395e-4d36-bdc0-2c01310e24e8" containerName="barbican-api-log" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.384369 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf259f2d-395e-4d36-bdc0-2c01310e24e8" containerName="barbican-api" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.402329 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-559c86bbbc-4zd54" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.409687 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.409899 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.419834 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.427623 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-846b889554-z7r6b" podUID="bf259f2d-395e-4d36-bdc0-2c01310e24e8" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.183:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.429734 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-846b889554-z7r6b" podUID="bf259f2d-395e-4d36-bdc0-2c01310e24e8" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.183:9311/healthcheck\": dial tcp 10.217.0.183:9311: i/o timeout (Client.Timeout exceeded while awaiting headers)" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.447919 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/867ef803-7e55-4f6e-83ff-94f3534387a9-log-httpd\") pod \"swift-proxy-559c86bbbc-4zd54\" (UID: \"867ef803-7e55-4f6e-83ff-94f3534387a9\") " pod="openstack/swift-proxy-559c86bbbc-4zd54" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.448181 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/867ef803-7e55-4f6e-83ff-94f3534387a9-run-httpd\") pod \"swift-proxy-559c86bbbc-4zd54\" (UID: \"867ef803-7e55-4f6e-83ff-94f3534387a9\") " pod="openstack/swift-proxy-559c86bbbc-4zd54" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.448303 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/867ef803-7e55-4f6e-83ff-94f3534387a9-public-tls-certs\") pod \"swift-proxy-559c86bbbc-4zd54\" (UID: \"867ef803-7e55-4f6e-83ff-94f3534387a9\") " pod="openstack/swift-proxy-559c86bbbc-4zd54" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.448415 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/867ef803-7e55-4f6e-83ff-94f3534387a9-combined-ca-bundle\") pod \"swift-proxy-559c86bbbc-4zd54\" (UID: \"867ef803-7e55-4f6e-83ff-94f3534387a9\") " pod="openstack/swift-proxy-559c86bbbc-4zd54" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.448562 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/867ef803-7e55-4f6e-83ff-94f3534387a9-internal-tls-certs\") pod \"swift-proxy-559c86bbbc-4zd54\" (UID: \"867ef803-7e55-4f6e-83ff-94f3534387a9\") " pod="openstack/swift-proxy-559c86bbbc-4zd54" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.448711 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/867ef803-7e55-4f6e-83ff-94f3534387a9-etc-swift\") pod \"swift-proxy-559c86bbbc-4zd54\" (UID: \"867ef803-7e55-4f6e-83ff-94f3534387a9\") " pod="openstack/swift-proxy-559c86bbbc-4zd54" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.460675 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-c8kpq\" (UniqueName: \"kubernetes.io/projected/867ef803-7e55-4f6e-83ff-94f3534387a9-kube-api-access-c8kpq\") pod \"swift-proxy-559c86bbbc-4zd54\" (UID: \"867ef803-7e55-4f6e-83ff-94f3534387a9\") " pod="openstack/swift-proxy-559c86bbbc-4zd54" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.461099 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/867ef803-7e55-4f6e-83ff-94f3534387a9-config-data\") pod \"swift-proxy-559c86bbbc-4zd54\" (UID: \"867ef803-7e55-4f6e-83ff-94f3534387a9\") " pod="openstack/swift-proxy-559c86bbbc-4zd54" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.484711 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-559c86bbbc-4zd54"] Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.563681 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/867ef803-7e55-4f6e-83ff-94f3534387a9-public-tls-certs\") pod \"swift-proxy-559c86bbbc-4zd54\" (UID: \"867ef803-7e55-4f6e-83ff-94f3534387a9\") " pod="openstack/swift-proxy-559c86bbbc-4zd54" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.563980 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/867ef803-7e55-4f6e-83ff-94f3534387a9-combined-ca-bundle\") pod \"swift-proxy-559c86bbbc-4zd54\" (UID: \"867ef803-7e55-4f6e-83ff-94f3534387a9\") " pod="openstack/swift-proxy-559c86bbbc-4zd54" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.564902 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/867ef803-7e55-4f6e-83ff-94f3534387a9-internal-tls-certs\") pod \"swift-proxy-559c86bbbc-4zd54\" (UID: \"867ef803-7e55-4f6e-83ff-94f3534387a9\") " 
pod="openstack/swift-proxy-559c86bbbc-4zd54" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.565703 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/867ef803-7e55-4f6e-83ff-94f3534387a9-etc-swift\") pod \"swift-proxy-559c86bbbc-4zd54\" (UID: \"867ef803-7e55-4f6e-83ff-94f3534387a9\") " pod="openstack/swift-proxy-559c86bbbc-4zd54" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.565776 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8kpq\" (UniqueName: \"kubernetes.io/projected/867ef803-7e55-4f6e-83ff-94f3534387a9-kube-api-access-c8kpq\") pod \"swift-proxy-559c86bbbc-4zd54\" (UID: \"867ef803-7e55-4f6e-83ff-94f3534387a9\") " pod="openstack/swift-proxy-559c86bbbc-4zd54" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.565945 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/867ef803-7e55-4f6e-83ff-94f3534387a9-config-data\") pod \"swift-proxy-559c86bbbc-4zd54\" (UID: \"867ef803-7e55-4f6e-83ff-94f3534387a9\") " pod="openstack/swift-proxy-559c86bbbc-4zd54" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.566233 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/867ef803-7e55-4f6e-83ff-94f3534387a9-log-httpd\") pod \"swift-proxy-559c86bbbc-4zd54\" (UID: \"867ef803-7e55-4f6e-83ff-94f3534387a9\") " pod="openstack/swift-proxy-559c86bbbc-4zd54" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.566285 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/867ef803-7e55-4f6e-83ff-94f3534387a9-run-httpd\") pod \"swift-proxy-559c86bbbc-4zd54\" (UID: \"867ef803-7e55-4f6e-83ff-94f3534387a9\") " pod="openstack/swift-proxy-559c86bbbc-4zd54" Mar 19 17:01:24 crc 
kubenswrapper[4918]: I0319 17:01:24.567051 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/867ef803-7e55-4f6e-83ff-94f3534387a9-run-httpd\") pod \"swift-proxy-559c86bbbc-4zd54\" (UID: \"867ef803-7e55-4f6e-83ff-94f3534387a9\") " pod="openstack/swift-proxy-559c86bbbc-4zd54" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.571704 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/867ef803-7e55-4f6e-83ff-94f3534387a9-log-httpd\") pod \"swift-proxy-559c86bbbc-4zd54\" (UID: \"867ef803-7e55-4f6e-83ff-94f3534387a9\") " pod="openstack/swift-proxy-559c86bbbc-4zd54" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.577570 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/867ef803-7e55-4f6e-83ff-94f3534387a9-internal-tls-certs\") pod \"swift-proxy-559c86bbbc-4zd54\" (UID: \"867ef803-7e55-4f6e-83ff-94f3534387a9\") " pod="openstack/swift-proxy-559c86bbbc-4zd54" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.581876 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/867ef803-7e55-4f6e-83ff-94f3534387a9-etc-swift\") pod \"swift-proxy-559c86bbbc-4zd54\" (UID: \"867ef803-7e55-4f6e-83ff-94f3534387a9\") " pod="openstack/swift-proxy-559c86bbbc-4zd54" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.582694 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/867ef803-7e55-4f6e-83ff-94f3534387a9-combined-ca-bundle\") pod \"swift-proxy-559c86bbbc-4zd54\" (UID: \"867ef803-7e55-4f6e-83ff-94f3534387a9\") " pod="openstack/swift-proxy-559c86bbbc-4zd54" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.592410 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/867ef803-7e55-4f6e-83ff-94f3534387a9-public-tls-certs\") pod \"swift-proxy-559c86bbbc-4zd54\" (UID: \"867ef803-7e55-4f6e-83ff-94f3534387a9\") " pod="openstack/swift-proxy-559c86bbbc-4zd54" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.609566 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/867ef803-7e55-4f6e-83ff-94f3534387a9-config-data\") pod \"swift-proxy-559c86bbbc-4zd54\" (UID: \"867ef803-7e55-4f6e-83ff-94f3534387a9\") " pod="openstack/swift-proxy-559c86bbbc-4zd54" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.610480 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8kpq\" (UniqueName: \"kubernetes.io/projected/867ef803-7e55-4f6e-83ff-94f3534387a9-kube-api-access-c8kpq\") pod \"swift-proxy-559c86bbbc-4zd54\" (UID: \"867ef803-7e55-4f6e-83ff-94f3534387a9\") " pod="openstack/swift-proxy-559c86bbbc-4zd54" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.688088 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 19 17:01:24 crc kubenswrapper[4918]: I0319 17:01:24.746508 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-559c86bbbc-4zd54" Mar 19 17:01:25 crc kubenswrapper[4918]: I0319 17:01:25.375796 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-559c86bbbc-4zd54"] Mar 19 17:01:26 crc kubenswrapper[4918]: I0319 17:01:25.996758 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-559c86bbbc-4zd54" event={"ID":"867ef803-7e55-4f6e-83ff-94f3534387a9","Type":"ContainerStarted","Data":"eca497d85bf769f76507ed70a2e4df12b6e45f2ab5cc9f65d16c8c833a934615"} Mar 19 17:01:26 crc kubenswrapper[4918]: I0319 17:01:25.997778 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-559c86bbbc-4zd54" event={"ID":"867ef803-7e55-4f6e-83ff-94f3534387a9","Type":"ContainerStarted","Data":"760777ca764ab5f5d2ab5042270a82b0fec02b7bad67d8e8b82114e047968667"} Mar 19 17:01:26 crc kubenswrapper[4918]: I0319 17:01:25.997811 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-559c86bbbc-4zd54" event={"ID":"867ef803-7e55-4f6e-83ff-94f3534387a9","Type":"ContainerStarted","Data":"0c37bf0c7b9aa1e637dd1ed9ea1e43f83931ef888e7924304fed46cecf4a5a4e"} Mar 19 17:01:26 crc kubenswrapper[4918]: I0319 17:01:25.998747 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-559c86bbbc-4zd54" Mar 19 17:01:26 crc kubenswrapper[4918]: I0319 17:01:25.998900 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-559c86bbbc-4zd54" Mar 19 17:01:26 crc kubenswrapper[4918]: I0319 17:01:26.020504 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-559c86bbbc-4zd54" podStartSLOduration=2.020487989 podStartE2EDuration="2.020487989s" podCreationTimestamp="2026-03-19 17:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:01:26.020105288 +0000 UTC 
m=+1298.142304526" watchObservedRunningTime="2026-03-19 17:01:26.020487989 +0000 UTC m=+1298.142687237" Mar 19 17:01:27 crc kubenswrapper[4918]: I0319 17:01:27.426822 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-54486b455c-k7jwz" Mar 19 17:01:27 crc kubenswrapper[4918]: I0319 17:01:27.496759 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5bb7fd774d-vnxdq"] Mar 19 17:01:27 crc kubenswrapper[4918]: I0319 17:01:27.496996 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5bb7fd774d-vnxdq" podUID="0825706f-8acf-485c-82dd-d5c672b187e8" containerName="neutron-api" containerID="cri-o://5a1e47eb79aaeb3b5c7b2d0aa6f667ef89dded02f15708716aab3f5c0d6bbb07" gracePeriod=30 Mar 19 17:01:27 crc kubenswrapper[4918]: I0319 17:01:27.497507 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5bb7fd774d-vnxdq" podUID="0825706f-8acf-485c-82dd-d5c672b187e8" containerName="neutron-httpd" containerID="cri-o://048285cd1e307a5250686ca0ca9b49aea5077e3a65801edc8ea037c862608c52" gracePeriod=30 Mar 19 17:01:27 crc kubenswrapper[4918]: I0319 17:01:27.931943 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:01:27 crc kubenswrapper[4918]: I0319 17:01:27.932253 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="78d493df-532a-4203-9264-86e66bf964f0" containerName="ceilometer-central-agent" containerID="cri-o://d5c22a25ed50a8f736348edc2e40564c0f01803e56ab5de8d3a634c6669ce834" gracePeriod=30 Mar 19 17:01:27 crc kubenswrapper[4918]: I0319 17:01:27.932315 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="78d493df-532a-4203-9264-86e66bf964f0" containerName="sg-core" containerID="cri-o://09cf7a30a722680444026776007f6b7f843254948f2473f31360c0f51c8d7dc4" gracePeriod=30 
Mar 19 17:01:27 crc kubenswrapper[4918]: I0319 17:01:27.932346 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="78d493df-532a-4203-9264-86e66bf964f0" containerName="ceilometer-notification-agent" containerID="cri-o://ec9fe3e21f4df9bdfaa42364eed81825c8d6dc7a894fa20c56160687357261c2" gracePeriod=30 Mar 19 17:01:27 crc kubenswrapper[4918]: I0319 17:01:27.932354 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="78d493df-532a-4203-9264-86e66bf964f0" containerName="proxy-httpd" containerID="cri-o://738ad7ddff566c1bcf4fae1c9adc62034094f6f32e93bf3022b7ccefa9f282b0" gracePeriod=30 Mar 19 17:01:27 crc kubenswrapper[4918]: I0319 17:01:27.937249 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 19 17:01:28 crc kubenswrapper[4918]: I0319 17:01:28.023338 4918 generic.go:334] "Generic (PLEG): container finished" podID="0825706f-8acf-485c-82dd-d5c672b187e8" containerID="048285cd1e307a5250686ca0ca9b49aea5077e3a65801edc8ea037c862608c52" exitCode=0 Mar 19 17:01:28 crc kubenswrapper[4918]: I0319 17:01:28.023386 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bb7fd774d-vnxdq" event={"ID":"0825706f-8acf-485c-82dd-d5c672b187e8","Type":"ContainerDied","Data":"048285cd1e307a5250686ca0ca9b49aea5077e3a65801edc8ea037c862608c52"} Mar 19 17:01:29 crc kubenswrapper[4918]: I0319 17:01:29.036977 4918 generic.go:334] "Generic (PLEG): container finished" podID="78d493df-532a-4203-9264-86e66bf964f0" containerID="738ad7ddff566c1bcf4fae1c9adc62034094f6f32e93bf3022b7ccefa9f282b0" exitCode=0 Mar 19 17:01:29 crc kubenswrapper[4918]: I0319 17:01:29.037298 4918 generic.go:334] "Generic (PLEG): container finished" podID="78d493df-532a-4203-9264-86e66bf964f0" containerID="09cf7a30a722680444026776007f6b7f843254948f2473f31360c0f51c8d7dc4" exitCode=2 Mar 19 17:01:29 crc kubenswrapper[4918]: I0319 
17:01:29.037305 4918 generic.go:334] "Generic (PLEG): container finished" podID="78d493df-532a-4203-9264-86e66bf964f0" containerID="ec9fe3e21f4df9bdfaa42364eed81825c8d6dc7a894fa20c56160687357261c2" exitCode=0 Mar 19 17:01:29 crc kubenswrapper[4918]: I0319 17:01:29.037313 4918 generic.go:334] "Generic (PLEG): container finished" podID="78d493df-532a-4203-9264-86e66bf964f0" containerID="d5c22a25ed50a8f736348edc2e40564c0f01803e56ab5de8d3a634c6669ce834" exitCode=0 Mar 19 17:01:29 crc kubenswrapper[4918]: I0319 17:01:29.037088 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78d493df-532a-4203-9264-86e66bf964f0","Type":"ContainerDied","Data":"738ad7ddff566c1bcf4fae1c9adc62034094f6f32e93bf3022b7ccefa9f282b0"} Mar 19 17:01:29 crc kubenswrapper[4918]: I0319 17:01:29.037371 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78d493df-532a-4203-9264-86e66bf964f0","Type":"ContainerDied","Data":"09cf7a30a722680444026776007f6b7f843254948f2473f31360c0f51c8d7dc4"} Mar 19 17:01:29 crc kubenswrapper[4918]: I0319 17:01:29.037383 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78d493df-532a-4203-9264-86e66bf964f0","Type":"ContainerDied","Data":"ec9fe3e21f4df9bdfaa42364eed81825c8d6dc7a894fa20c56160687357261c2"} Mar 19 17:01:29 crc kubenswrapper[4918]: I0319 17:01:29.037393 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78d493df-532a-4203-9264-86e66bf964f0","Type":"ContainerDied","Data":"d5c22a25ed50a8f736348edc2e40564c0f01803e56ab5de8d3a634c6669ce834"} Mar 19 17:01:29 crc kubenswrapper[4918]: I0319 17:01:29.039309 4918 generic.go:334] "Generic (PLEG): container finished" podID="178a9ae2-1774-4025-8951-93167e95f5d7" containerID="d3cb418db6fdfe2f8e5fcd5661364f8f04b064e798a1a7006adf3e06922b7e9e" exitCode=137 Mar 19 17:01:29 crc kubenswrapper[4918]: I0319 17:01:29.039347 4918 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"178a9ae2-1774-4025-8951-93167e95f5d7","Type":"ContainerDied","Data":"d3cb418db6fdfe2f8e5fcd5661364f8f04b064e798a1a7006adf3e06922b7e9e"} Mar 19 17:01:29 crc kubenswrapper[4918]: I0319 17:01:29.867986 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 17:01:29 crc kubenswrapper[4918]: I0319 17:01:29.868409 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="eee99f54-a76f-416d-a14f-cebf9d11548b" containerName="glance-httpd" containerID="cri-o://dc38a6c1fb8ac5da62997f1e892f5e7db87834e40181a874ebadb17fb75aa042" gracePeriod=30 Mar 19 17:01:29 crc kubenswrapper[4918]: I0319 17:01:29.868279 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="eee99f54-a76f-416d-a14f-cebf9d11548b" containerName="glance-log" containerID="cri-o://0c9edc6b8da1c8f4395f86c0760b11a1d0e50157fc2e91f2b5e58852148640c8" gracePeriod=30 Mar 19 17:01:29 crc kubenswrapper[4918]: I0319 17:01:29.984811 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="78d493df-532a-4203-9264-86e66bf964f0" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.189:3000/\": dial tcp 10.217.0.189:3000: connect: connection refused" Mar 19 17:01:30 crc kubenswrapper[4918]: I0319 17:01:30.050079 4918 generic.go:334] "Generic (PLEG): container finished" podID="eee99f54-a76f-416d-a14f-cebf9d11548b" containerID="0c9edc6b8da1c8f4395f86c0760b11a1d0e50157fc2e91f2b5e58852148640c8" exitCode=143 Mar 19 17:01:30 crc kubenswrapper[4918]: I0319 17:01:30.050133 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"eee99f54-a76f-416d-a14f-cebf9d11548b","Type":"ContainerDied","Data":"0c9edc6b8da1c8f4395f86c0760b11a1d0e50157fc2e91f2b5e58852148640c8"} Mar 19 17:01:30 crc kubenswrapper[4918]: I0319 17:01:30.175819 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="178a9ae2-1774-4025-8951-93167e95f5d7" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.186:8776/healthcheck\": dial tcp 10.217.0.186:8776: connect: connection refused" Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.561687 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-s8jqb"] Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.563349 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-s8jqb" Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.578234 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-s8jqb"] Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.664458 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7pjs\" (UniqueName: \"kubernetes.io/projected/7c57e4da-677f-401e-b718-b25a5678a352-kube-api-access-l7pjs\") pod \"nova-api-db-create-s8jqb\" (UID: \"7c57e4da-677f-401e-b718-b25a5678a352\") " pod="openstack/nova-api-db-create-s8jqb" Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.665229 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c57e4da-677f-401e-b718-b25a5678a352-operator-scripts\") pod \"nova-api-db-create-s8jqb\" (UID: \"7c57e4da-677f-401e-b718-b25a5678a352\") " pod="openstack/nova-api-db-create-s8jqb" Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.767457 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7c57e4da-677f-401e-b718-b25a5678a352-operator-scripts\") pod \"nova-api-db-create-s8jqb\" (UID: \"7c57e4da-677f-401e-b718-b25a5678a352\") " pod="openstack/nova-api-db-create-s8jqb" Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.767687 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7pjs\" (UniqueName: \"kubernetes.io/projected/7c57e4da-677f-401e-b718-b25a5678a352-kube-api-access-l7pjs\") pod \"nova-api-db-create-s8jqb\" (UID: \"7c57e4da-677f-401e-b718-b25a5678a352\") " pod="openstack/nova-api-db-create-s8jqb" Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.768096 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-6z7jz"] Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.768880 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c57e4da-677f-401e-b718-b25a5678a352-operator-scripts\") pod \"nova-api-db-create-s8jqb\" (UID: \"7c57e4da-677f-401e-b718-b25a5678a352\") " pod="openstack/nova-api-db-create-s8jqb" Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.769641 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-6z7jz" Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.788385 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-1eef-account-create-update-mz92v"] Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.790383 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-1eef-account-create-update-mz92v" Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.796906 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.806985 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-6z7jz"] Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.808594 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7pjs\" (UniqueName: \"kubernetes.io/projected/7c57e4da-677f-401e-b718-b25a5678a352-kube-api-access-l7pjs\") pod \"nova-api-db-create-s8jqb\" (UID: \"7c57e4da-677f-401e-b718-b25a5678a352\") " pod="openstack/nova-api-db-create-s8jqb" Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.817804 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1eef-account-create-update-mz92v"] Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.869317 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d53b74e8-6505-42fc-bbb6-9f9f2b96f747-operator-scripts\") pod \"nova-api-1eef-account-create-update-mz92v\" (UID: \"d53b74e8-6505-42fc-bbb6-9f9f2b96f747\") " pod="openstack/nova-api-1eef-account-create-update-mz92v" Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.869560 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b2236d3-b352-4383-b099-0a1b39bdf222-operator-scripts\") pod \"nova-cell0-db-create-6z7jz\" (UID: \"4b2236d3-b352-4383-b099-0a1b39bdf222\") " pod="openstack/nova-cell0-db-create-6z7jz" Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.869742 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8xqmg\" (UniqueName: \"kubernetes.io/projected/4b2236d3-b352-4383-b099-0a1b39bdf222-kube-api-access-8xqmg\") pod \"nova-cell0-db-create-6z7jz\" (UID: \"4b2236d3-b352-4383-b099-0a1b39bdf222\") " pod="openstack/nova-cell0-db-create-6z7jz"
Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.869802 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmcvm\" (UniqueName: \"kubernetes.io/projected/d53b74e8-6505-42fc-bbb6-9f9f2b96f747-kube-api-access-kmcvm\") pod \"nova-api-1eef-account-create-update-mz92v\" (UID: \"d53b74e8-6505-42fc-bbb6-9f9f2b96f747\") " pod="openstack/nova-api-1eef-account-create-update-mz92v"
Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.881617 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-s8jqb"
Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.920075 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-w4vzm"]
Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.921444 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-w4vzm"
Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.935361 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-w4vzm"]
Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.974602 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d53b74e8-6505-42fc-bbb6-9f9f2b96f747-operator-scripts\") pod \"nova-api-1eef-account-create-update-mz92v\" (UID: \"d53b74e8-6505-42fc-bbb6-9f9f2b96f747\") " pod="openstack/nova-api-1eef-account-create-update-mz92v"
Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.974765 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b2236d3-b352-4383-b099-0a1b39bdf222-operator-scripts\") pod \"nova-cell0-db-create-6z7jz\" (UID: \"4b2236d3-b352-4383-b099-0a1b39bdf222\") " pod="openstack/nova-cell0-db-create-6z7jz"
Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.974865 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xqmg\" (UniqueName: \"kubernetes.io/projected/4b2236d3-b352-4383-b099-0a1b39bdf222-kube-api-access-8xqmg\") pod \"nova-cell0-db-create-6z7jz\" (UID: \"4b2236d3-b352-4383-b099-0a1b39bdf222\") " pod="openstack/nova-cell0-db-create-6z7jz"
Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.974920 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmcvm\" (UniqueName: \"kubernetes.io/projected/d53b74e8-6505-42fc-bbb6-9f9f2b96f747-kube-api-access-kmcvm\") pod \"nova-api-1eef-account-create-update-mz92v\" (UID: \"d53b74e8-6505-42fc-bbb6-9f9f2b96f747\") " pod="openstack/nova-api-1eef-account-create-update-mz92v"
Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.975410 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d53b74e8-6505-42fc-bbb6-9f9f2b96f747-operator-scripts\") pod \"nova-api-1eef-account-create-update-mz92v\" (UID: \"d53b74e8-6505-42fc-bbb6-9f9f2b96f747\") " pod="openstack/nova-api-1eef-account-create-update-mz92v"
Mar 19 17:01:32 crc kubenswrapper[4918]: I0319 17:01:32.976021 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b2236d3-b352-4383-b099-0a1b39bdf222-operator-scripts\") pod \"nova-cell0-db-create-6z7jz\" (UID: \"4b2236d3-b352-4383-b099-0a1b39bdf222\") " pod="openstack/nova-cell0-db-create-6z7jz"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.002352 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-fbfd-account-create-update-cvxhj"]
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.003931 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fbfd-account-create-update-cvxhj"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.007306 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.014500 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmcvm\" (UniqueName: \"kubernetes.io/projected/d53b74e8-6505-42fc-bbb6-9f9f2b96f747-kube-api-access-kmcvm\") pod \"nova-api-1eef-account-create-update-mz92v\" (UID: \"d53b74e8-6505-42fc-bbb6-9f9f2b96f747\") " pod="openstack/nova-api-1eef-account-create-update-mz92v"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.015202 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-fbfd-account-create-update-cvxhj"]
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.023729 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xqmg\" (UniqueName: \"kubernetes.io/projected/4b2236d3-b352-4383-b099-0a1b39bdf222-kube-api-access-8xqmg\") pod \"nova-cell0-db-create-6z7jz\" (UID: \"4b2236d3-b352-4383-b099-0a1b39bdf222\") " pod="openstack/nova-cell0-db-create-6z7jz"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.092121 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ceabd21-35af-489d-abc9-b4e8b629efd9-operator-scripts\") pod \"nova-cell1-db-create-w4vzm\" (UID: \"8ceabd21-35af-489d-abc9-b4e8b629efd9\") " pod="openstack/nova-cell1-db-create-w4vzm"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.092175 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmj6d\" (UniqueName: \"kubernetes.io/projected/00939ae6-93f7-437d-904c-53eaf4c4fc52-kube-api-access-zmj6d\") pod \"nova-cell0-fbfd-account-create-update-cvxhj\" (UID: \"00939ae6-93f7-437d-904c-53eaf4c4fc52\") " pod="openstack/nova-cell0-fbfd-account-create-update-cvxhj"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.092259 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00939ae6-93f7-437d-904c-53eaf4c4fc52-operator-scripts\") pod \"nova-cell0-fbfd-account-create-update-cvxhj\" (UID: \"00939ae6-93f7-437d-904c-53eaf4c4fc52\") " pod="openstack/nova-cell0-fbfd-account-create-update-cvxhj"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.092342 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm4wf\" (UniqueName: \"kubernetes.io/projected/8ceabd21-35af-489d-abc9-b4e8b629efd9-kube-api-access-lm4wf\") pod \"nova-cell1-db-create-w4vzm\" (UID: \"8ceabd21-35af-489d-abc9-b4e8b629efd9\") " pod="openstack/nova-cell1-db-create-w4vzm"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.108796 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-6z7jz"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.110614 4918 generic.go:334] "Generic (PLEG): container finished" podID="0825706f-8acf-485c-82dd-d5c672b187e8" containerID="5a1e47eb79aaeb3b5c7b2d0aa6f667ef89dded02f15708716aab3f5c0d6bbb07" exitCode=0
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.110665 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bb7fd774d-vnxdq" event={"ID":"0825706f-8acf-485c-82dd-d5c672b187e8","Type":"ContainerDied","Data":"5a1e47eb79aaeb3b5c7b2d0aa6f667ef89dded02f15708716aab3f5c0d6bbb07"}
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.181324 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1eef-account-create-update-mz92v"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.186798 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-10c1-account-create-update-qpxpb"]
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.191497 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-10c1-account-create-update-qpxpb"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.196136 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.198242 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ceabd21-35af-489d-abc9-b4e8b629efd9-operator-scripts\") pod \"nova-cell1-db-create-w4vzm\" (UID: \"8ceabd21-35af-489d-abc9-b4e8b629efd9\") " pod="openstack/nova-cell1-db-create-w4vzm"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.198305 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmj6d\" (UniqueName: \"kubernetes.io/projected/00939ae6-93f7-437d-904c-53eaf4c4fc52-kube-api-access-zmj6d\") pod \"nova-cell0-fbfd-account-create-update-cvxhj\" (UID: \"00939ae6-93f7-437d-904c-53eaf4c4fc52\") " pod="openstack/nova-cell0-fbfd-account-create-update-cvxhj"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.198390 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00939ae6-93f7-437d-904c-53eaf4c4fc52-operator-scripts\") pod \"nova-cell0-fbfd-account-create-update-cvxhj\" (UID: \"00939ae6-93f7-437d-904c-53eaf4c4fc52\") " pod="openstack/nova-cell0-fbfd-account-create-update-cvxhj"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.198480 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm4wf\" (UniqueName: \"kubernetes.io/projected/8ceabd21-35af-489d-abc9-b4e8b629efd9-kube-api-access-lm4wf\") pod \"nova-cell1-db-create-w4vzm\" (UID: \"8ceabd21-35af-489d-abc9-b4e8b629efd9\") " pod="openstack/nova-cell1-db-create-w4vzm"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.199710 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ceabd21-35af-489d-abc9-b4e8b629efd9-operator-scripts\") pod \"nova-cell1-db-create-w4vzm\" (UID: \"8ceabd21-35af-489d-abc9-b4e8b629efd9\") " pod="openstack/nova-cell1-db-create-w4vzm"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.200416 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00939ae6-93f7-437d-904c-53eaf4c4fc52-operator-scripts\") pod \"nova-cell0-fbfd-account-create-update-cvxhj\" (UID: \"00939ae6-93f7-437d-904c-53eaf4c4fc52\") " pod="openstack/nova-cell0-fbfd-account-create-update-cvxhj"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.223849 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-10c1-account-create-update-qpxpb"]
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.226999 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmj6d\" (UniqueName: \"kubernetes.io/projected/00939ae6-93f7-437d-904c-53eaf4c4fc52-kube-api-access-zmj6d\") pod \"nova-cell0-fbfd-account-create-update-cvxhj\" (UID: \"00939ae6-93f7-437d-904c-53eaf4c4fc52\") " pod="openstack/nova-cell0-fbfd-account-create-update-cvxhj"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.229446 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm4wf\" (UniqueName: \"kubernetes.io/projected/8ceabd21-35af-489d-abc9-b4e8b629efd9-kube-api-access-lm4wf\") pod \"nova-cell1-db-create-w4vzm\" (UID: \"8ceabd21-35af-489d-abc9-b4e8b629efd9\") " pod="openstack/nova-cell1-db-create-w4vzm"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.257904 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-w4vzm"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.301029 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/928c9b0d-6f27-4359-8b86-794d73ea9cd5-operator-scripts\") pod \"nova-cell1-10c1-account-create-update-qpxpb\" (UID: \"928c9b0d-6f27-4359-8b86-794d73ea9cd5\") " pod="openstack/nova-cell1-10c1-account-create-update-qpxpb"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.301117 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnmlv\" (UniqueName: \"kubernetes.io/projected/928c9b0d-6f27-4359-8b86-794d73ea9cd5-kube-api-access-nnmlv\") pod \"nova-cell1-10c1-account-create-update-qpxpb\" (UID: \"928c9b0d-6f27-4359-8b86-794d73ea9cd5\") " pod="openstack/nova-cell1-10c1-account-create-update-qpxpb"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.403224 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/928c9b0d-6f27-4359-8b86-794d73ea9cd5-operator-scripts\") pod \"nova-cell1-10c1-account-create-update-qpxpb\" (UID: \"928c9b0d-6f27-4359-8b86-794d73ea9cd5\") " pod="openstack/nova-cell1-10c1-account-create-update-qpxpb"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.403611 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnmlv\" (UniqueName: \"kubernetes.io/projected/928c9b0d-6f27-4359-8b86-794d73ea9cd5-kube-api-access-nnmlv\") pod \"nova-cell1-10c1-account-create-update-qpxpb\" (UID: \"928c9b0d-6f27-4359-8b86-794d73ea9cd5\") " pod="openstack/nova-cell1-10c1-account-create-update-qpxpb"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.403961 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/928c9b0d-6f27-4359-8b86-794d73ea9cd5-operator-scripts\") pod \"nova-cell1-10c1-account-create-update-qpxpb\" (UID: \"928c9b0d-6f27-4359-8b86-794d73ea9cd5\") " pod="openstack/nova-cell1-10c1-account-create-update-qpxpb"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.429285 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnmlv\" (UniqueName: \"kubernetes.io/projected/928c9b0d-6f27-4359-8b86-794d73ea9cd5-kube-api-access-nnmlv\") pod \"nova-cell1-10c1-account-create-update-qpxpb\" (UID: \"928c9b0d-6f27-4359-8b86-794d73ea9cd5\") " pod="openstack/nova-cell1-10c1-account-create-update-qpxpb"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.480656 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fbfd-account-create-update-cvxhj"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.514651 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-10c1-account-create-update-qpxpb"
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.828769 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.829171 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d659099f-7a05-4f1c-a097-67ecce42275d" containerName="glance-log" containerID="cri-o://5965fde4c7a481302c772ba776ff2a361688359ded9c8328331730d57024f30d" gracePeriod=30
Mar 19 17:01:33 crc kubenswrapper[4918]: I0319 17:01:33.829744 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d659099f-7a05-4f1c-a097-67ecce42275d" containerName="glance-httpd" containerID="cri-o://148a6aa0e62a3e04666a50eb5f97678441eb7e98de6cd8cd34b41fcf9e2708d4" gracePeriod=30
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.133413 4918 generic.go:334] "Generic (PLEG): container finished" podID="d659099f-7a05-4f1c-a097-67ecce42275d" containerID="5965fde4c7a481302c772ba776ff2a361688359ded9c8328331730d57024f30d" exitCode=143
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.133582 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d659099f-7a05-4f1c-a097-67ecce42275d","Type":"ContainerDied","Data":"5965fde4c7a481302c772ba776ff2a361688359ded9c8328331730d57024f30d"}
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.147232 4918 generic.go:334] "Generic (PLEG): container finished" podID="eee99f54-a76f-416d-a14f-cebf9d11548b" containerID="dc38a6c1fb8ac5da62997f1e892f5e7db87834e40181a874ebadb17fb75aa042" exitCode=0
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.147277 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eee99f54-a76f-416d-a14f-cebf9d11548b","Type":"ContainerDied","Data":"dc38a6c1fb8ac5da62997f1e892f5e7db87834e40181a874ebadb17fb75aa042"}
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.533797 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.627509 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78d493df-532a-4203-9264-86e66bf964f0-config-data\") pod \"78d493df-532a-4203-9264-86e66bf964f0\" (UID: \"78d493df-532a-4203-9264-86e66bf964f0\") "
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.628596 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78d493df-532a-4203-9264-86e66bf964f0-log-httpd\") pod \"78d493df-532a-4203-9264-86e66bf964f0\" (UID: \"78d493df-532a-4203-9264-86e66bf964f0\") "
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.628701 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zqjc\" (UniqueName: \"kubernetes.io/projected/78d493df-532a-4203-9264-86e66bf964f0-kube-api-access-8zqjc\") pod \"78d493df-532a-4203-9264-86e66bf964f0\" (UID: \"78d493df-532a-4203-9264-86e66bf964f0\") "
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.629967 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78d493df-532a-4203-9264-86e66bf964f0-scripts\") pod \"78d493df-532a-4203-9264-86e66bf964f0\" (UID: \"78d493df-532a-4203-9264-86e66bf964f0\") "
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.630002 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d493df-532a-4203-9264-86e66bf964f0-combined-ca-bundle\") pod \"78d493df-532a-4203-9264-86e66bf964f0\" (UID: \"78d493df-532a-4203-9264-86e66bf964f0\") "
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.630066 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78d493df-532a-4203-9264-86e66bf964f0-sg-core-conf-yaml\") pod \"78d493df-532a-4203-9264-86e66bf964f0\" (UID: \"78d493df-532a-4203-9264-86e66bf964f0\") "
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.630176 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78d493df-532a-4203-9264-86e66bf964f0-run-httpd\") pod \"78d493df-532a-4203-9264-86e66bf964f0\" (UID: \"78d493df-532a-4203-9264-86e66bf964f0\") "
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.631688 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78d493df-532a-4203-9264-86e66bf964f0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "78d493df-532a-4203-9264-86e66bf964f0" (UID: "78d493df-532a-4203-9264-86e66bf964f0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.632595 4918 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78d493df-532a-4203-9264-86e66bf964f0-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.633028 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78d493df-532a-4203-9264-86e66bf964f0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "78d493df-532a-4203-9264-86e66bf964f0" (UID: "78d493df-532a-4203-9264-86e66bf964f0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.649837 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78d493df-532a-4203-9264-86e66bf964f0-kube-api-access-8zqjc" (OuterVolumeSpecName: "kube-api-access-8zqjc") pod "78d493df-532a-4203-9264-86e66bf964f0" (UID: "78d493df-532a-4203-9264-86e66bf964f0"). InnerVolumeSpecName "kube-api-access-8zqjc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.655915 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78d493df-532a-4203-9264-86e66bf964f0-scripts" (OuterVolumeSpecName: "scripts") pod "78d493df-532a-4203-9264-86e66bf964f0" (UID: "78d493df-532a-4203-9264-86e66bf964f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.669074 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.669265 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="1818f96e-6152-49a9-b6fc-726d7677112c" containerName="kube-state-metrics" containerID="cri-o://383cee5d641523407392efe99ff20a4a5996ffdb2fa9e7b2a15c5f4fa7771f84" gracePeriod=30
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.684141 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78d493df-532a-4203-9264-86e66bf964f0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "78d493df-532a-4203-9264-86e66bf964f0" (UID: "78d493df-532a-4203-9264-86e66bf964f0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.735388 4918 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78d493df-532a-4203-9264-86e66bf964f0-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.735660 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zqjc\" (UniqueName: \"kubernetes.io/projected/78d493df-532a-4203-9264-86e66bf964f0-kube-api-access-8zqjc\") on node \"crc\" DevicePath \"\""
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.735734 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78d493df-532a-4203-9264-86e66bf964f0-scripts\") on node \"crc\" DevicePath \"\""
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.735792 4918 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78d493df-532a-4203-9264-86e66bf964f0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.752935 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-559c86bbbc-4zd54"
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.759588 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-559c86bbbc-4zd54"
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.793137 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78d493df-532a-4203-9264-86e66bf964f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78d493df-532a-4203-9264-86e66bf964f0" (UID: "78d493df-532a-4203-9264-86e66bf964f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.799076 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78d493df-532a-4203-9264-86e66bf964f0-config-data" (OuterVolumeSpecName: "config-data") pod "78d493df-532a-4203-9264-86e66bf964f0" (UID: "78d493df-532a-4203-9264-86e66bf964f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.840758 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78d493df-532a-4203-9264-86e66bf964f0-config-data\") on node \"crc\" DevicePath \"\""
Mar 19 17:01:34 crc kubenswrapper[4918]: I0319 17:01:34.840785 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d493df-532a-4203-9264-86e66bf964f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.047762 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.049365 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.056830 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5bb7fd774d-vnxdq"
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.161243 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0825706f-8acf-485c-82dd-d5c672b187e8-config\") pod \"0825706f-8acf-485c-82dd-d5c672b187e8\" (UID: \"0825706f-8acf-485c-82dd-d5c672b187e8\") "
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.161311 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp6hd\" (UniqueName: \"kubernetes.io/projected/eee99f54-a76f-416d-a14f-cebf9d11548b-kube-api-access-bp6hd\") pod \"eee99f54-a76f-416d-a14f-cebf9d11548b\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") "
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.161339 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eee99f54-a76f-416d-a14f-cebf9d11548b-config-data\") pod \"eee99f54-a76f-416d-a14f-cebf9d11548b\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") "
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.161366 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/178a9ae2-1774-4025-8951-93167e95f5d7-logs\") pod \"178a9ae2-1774-4025-8951-93167e95f5d7\" (UID: \"178a9ae2-1774-4025-8951-93167e95f5d7\") "
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.161404 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0825706f-8acf-485c-82dd-d5c672b187e8-ovndb-tls-certs\") pod \"0825706f-8acf-485c-82dd-d5c672b187e8\" (UID: \"0825706f-8acf-485c-82dd-d5c672b187e8\") "
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.161437 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0825706f-8acf-485c-82dd-d5c672b187e8-combined-ca-bundle\") pod \"0825706f-8acf-485c-82dd-d5c672b187e8\" (UID: \"0825706f-8acf-485c-82dd-d5c672b187e8\") "
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.161462 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eee99f54-a76f-416d-a14f-cebf9d11548b-httpd-run\") pod \"eee99f54-a76f-416d-a14f-cebf9d11548b\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") "
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.161488 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/178a9ae2-1774-4025-8951-93167e95f5d7-config-data-custom\") pod \"178a9ae2-1774-4025-8951-93167e95f5d7\" (UID: \"178a9ae2-1774-4025-8951-93167e95f5d7\") "
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.161540 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee99f54-a76f-416d-a14f-cebf9d11548b-combined-ca-bundle\") pod \"eee99f54-a76f-416d-a14f-cebf9d11548b\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") "
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.161562 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eee99f54-a76f-416d-a14f-cebf9d11548b-scripts\") pod \"eee99f54-a76f-416d-a14f-cebf9d11548b\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") "
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.161599 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0825706f-8acf-485c-82dd-d5c672b187e8-httpd-config\") pod \"0825706f-8acf-485c-82dd-d5c672b187e8\" (UID: \"0825706f-8acf-485c-82dd-d5c672b187e8\") "
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.161623 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/178a9ae2-1774-4025-8951-93167e95f5d7-config-data\") pod \"178a9ae2-1774-4025-8951-93167e95f5d7\" (UID: \"178a9ae2-1774-4025-8951-93167e95f5d7\") "
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.161680 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eee99f54-a76f-416d-a14f-cebf9d11548b-logs\") pod \"eee99f54-a76f-416d-a14f-cebf9d11548b\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") "
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.161875 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\") pod \"eee99f54-a76f-416d-a14f-cebf9d11548b\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") "
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.161926 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/178a9ae2-1774-4025-8951-93167e95f5d7-scripts\") pod \"178a9ae2-1774-4025-8951-93167e95f5d7\" (UID: \"178a9ae2-1774-4025-8951-93167e95f5d7\") "
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.161989 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw4nt\" (UniqueName: \"kubernetes.io/projected/178a9ae2-1774-4025-8951-93167e95f5d7-kube-api-access-nw4nt\") pod \"178a9ae2-1774-4025-8951-93167e95f5d7\" (UID: \"178a9ae2-1774-4025-8951-93167e95f5d7\") "
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.162028 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/178a9ae2-1774-4025-8951-93167e95f5d7-etc-machine-id\") pod \"178a9ae2-1774-4025-8951-93167e95f5d7\" (UID: \"178a9ae2-1774-4025-8951-93167e95f5d7\") "
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.162059 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98vlc\" (UniqueName: \"kubernetes.io/projected/0825706f-8acf-485c-82dd-d5c672b187e8-kube-api-access-98vlc\") pod \"0825706f-8acf-485c-82dd-d5c672b187e8\" (UID: \"0825706f-8acf-485c-82dd-d5c672b187e8\") "
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.162110 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee99f54-a76f-416d-a14f-cebf9d11548b-public-tls-certs\") pod \"eee99f54-a76f-416d-a14f-cebf9d11548b\" (UID: \"eee99f54-a76f-416d-a14f-cebf9d11548b\") "
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.162149 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/178a9ae2-1774-4025-8951-93167e95f5d7-combined-ca-bundle\") pod \"178a9ae2-1774-4025-8951-93167e95f5d7\" (UID: \"178a9ae2-1774-4025-8951-93167e95f5d7\") "
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.169336 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/178a9ae2-1774-4025-8951-93167e95f5d7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "178a9ae2-1774-4025-8951-93167e95f5d7" (UID: "178a9ae2-1774-4025-8951-93167e95f5d7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.172142 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/178a9ae2-1774-4025-8951-93167e95f5d7-kube-api-access-nw4nt" (OuterVolumeSpecName: "kube-api-access-nw4nt") pod "178a9ae2-1774-4025-8951-93167e95f5d7" (UID: "178a9ae2-1774-4025-8951-93167e95f5d7"). InnerVolumeSpecName "kube-api-access-nw4nt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.172823 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/178a9ae2-1774-4025-8951-93167e95f5d7-logs" (OuterVolumeSpecName: "logs") pod "178a9ae2-1774-4025-8951-93167e95f5d7" (UID: "178a9ae2-1774-4025-8951-93167e95f5d7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.173512 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eee99f54-a76f-416d-a14f-cebf9d11548b-logs" (OuterVolumeSpecName: "logs") pod "eee99f54-a76f-416d-a14f-cebf9d11548b" (UID: "eee99f54-a76f-416d-a14f-cebf9d11548b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.176872 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eee99f54-a76f-416d-a14f-cebf9d11548b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "eee99f54-a76f-416d-a14f-cebf9d11548b" (UID: "eee99f54-a76f-416d-a14f-cebf9d11548b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.178262 4918 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eee99f54-a76f-416d-a14f-cebf9d11548b-logs\") on node \"crc\" DevicePath \"\""
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.178291 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw4nt\" (UniqueName: \"kubernetes.io/projected/178a9ae2-1774-4025-8951-93167e95f5d7-kube-api-access-nw4nt\") on node \"crc\" DevicePath \"\""
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.178304 4918 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/178a9ae2-1774-4025-8951-93167e95f5d7-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.178315 4918 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/178a9ae2-1774-4025-8951-93167e95f5d7-logs\") on node \"crc\" DevicePath \"\""
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.178327 4918 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eee99f54-a76f-416d-a14f-cebf9d11548b-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.178627 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/178a9ae2-1774-4025-8951-93167e95f5d7-scripts" (OuterVolumeSpecName: "scripts") pod "178a9ae2-1774-4025-8951-93167e95f5d7" (UID: "178a9ae2-1774-4025-8951-93167e95f5d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.188543 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eee99f54-a76f-416d-a14f-cebf9d11548b-scripts" (OuterVolumeSpecName: "scripts") pod "eee99f54-a76f-416d-a14f-cebf9d11548b" (UID: "eee99f54-a76f-416d-a14f-cebf9d11548b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.198128 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0825706f-8acf-485c-82dd-d5c672b187e8-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0825706f-8acf-485c-82dd-d5c672b187e8" (UID: "0825706f-8acf-485c-82dd-d5c672b187e8"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.198184 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0825706f-8acf-485c-82dd-d5c672b187e8-kube-api-access-98vlc" (OuterVolumeSpecName: "kube-api-access-98vlc") pod "0825706f-8acf-485c-82dd-d5c672b187e8" (UID: "0825706f-8acf-485c-82dd-d5c672b187e8"). InnerVolumeSpecName "kube-api-access-98vlc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.201066 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eee99f54-a76f-416d-a14f-cebf9d11548b-kube-api-access-bp6hd" (OuterVolumeSpecName: "kube-api-access-bp6hd") pod "eee99f54-a76f-416d-a14f-cebf9d11548b" (UID: "eee99f54-a76f-416d-a14f-cebf9d11548b"). InnerVolumeSpecName "kube-api-access-bp6hd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.209672 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bb7fd774d-vnxdq" event={"ID":"0825706f-8acf-485c-82dd-d5c672b187e8","Type":"ContainerDied","Data":"344f07cdb1d9b0655b058619fc27a8847d3dc8e9ba7b6fcbb436d5e7cda0547a"} Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.209726 4918 scope.go:117] "RemoveContainer" containerID="048285cd1e307a5250686ca0ca9b49aea5077e3a65801edc8ea037c862608c52" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.209860 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5bb7fd774d-vnxdq" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.226778 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/178a9ae2-1774-4025-8951-93167e95f5d7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "178a9ae2-1774-4025-8951-93167e95f5d7" (UID: "178a9ae2-1774-4025-8951-93167e95f5d7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.244466 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d51f6764-e413-494d-9e7a-c5583dec6a22" (OuterVolumeSpecName: "glance") pod "eee99f54-a76f-416d-a14f-cebf9d11548b" (UID: "eee99f54-a76f-416d-a14f-cebf9d11548b"). InnerVolumeSpecName "pvc-d51f6764-e413-494d-9e7a-c5583dec6a22". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.245733 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eee99f54-a76f-416d-a14f-cebf9d11548b","Type":"ContainerDied","Data":"22255a7532df72df20a069895d64f76c18741bf8be18c6ae2872320386792391"} Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.245848 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.258263 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1eef-account-create-update-mz92v"] Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.280651 4918 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/178a9ae2-1774-4025-8951-93167e95f5d7-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.280683 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eee99f54-a76f-416d-a14f-cebf9d11548b-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.280694 4918 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0825706f-8acf-485c-82dd-d5c672b187e8-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.280717 4918 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\") on node \"crc\" " Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.280730 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/178a9ae2-1774-4025-8951-93167e95f5d7-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.280740 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98vlc\" (UniqueName: \"kubernetes.io/projected/0825706f-8acf-485c-82dd-d5c672b187e8-kube-api-access-98vlc\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.280750 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp6hd\" (UniqueName: \"kubernetes.io/projected/eee99f54-a76f-416d-a14f-cebf9d11548b-kube-api-access-bp6hd\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.294054 4918 scope.go:117] "RemoveContainer" containerID="5a1e47eb79aaeb3b5c7b2d0aa6f667ef89dded02f15708716aab3f5c0d6bbb07" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.294290 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"789c957f-feb3-4f8c-83fa-3524740a2c8d","Type":"ContainerStarted","Data":"d2495a0131bcd90266a6fede5111464876ee80a94568137812123addb5679445"} Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.313275 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"178a9ae2-1774-4025-8951-93167e95f5d7","Type":"ContainerDied","Data":"8d4cd2bf2da9f24b9204065572dc70094665bd2d473d0926a3f0a3754b766cc6"} Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.313378 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.313715 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.420397086 podStartE2EDuration="18.313704243s" podCreationTimestamp="2026-03-19 17:01:17 +0000 UTC" firstStartedPulling="2026-03-19 17:01:18.280041689 +0000 UTC m=+1290.402240937" lastFinishedPulling="2026-03-19 17:01:34.173348856 +0000 UTC m=+1306.295548094" observedRunningTime="2026-03-19 17:01:35.3121234 +0000 UTC m=+1307.434322648" watchObservedRunningTime="2026-03-19 17:01:35.313704243 +0000 UTC m=+1307.435903491" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.320110 4918 generic.go:334] "Generic (PLEG): container finished" podID="1818f96e-6152-49a9-b6fc-726d7677112c" containerID="383cee5d641523407392efe99ff20a4a5996ffdb2fa9e7b2a15c5f4fa7771f84" exitCode=2 Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.320177 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1818f96e-6152-49a9-b6fc-726d7677112c","Type":"ContainerDied","Data":"383cee5d641523407392efe99ff20a4a5996ffdb2fa9e7b2a15c5f4fa7771f84"} Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.327777 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78d493df-532a-4203-9264-86e66bf964f0","Type":"ContainerDied","Data":"c70effcf5aac39751cda7b09a0403f7cee0a40998c7f6608e9eae8045c4df02b"} Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.328680 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.328674 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/178a9ae2-1774-4025-8951-93167e95f5d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "178a9ae2-1774-4025-8951-93167e95f5d7" (UID: "178a9ae2-1774-4025-8951-93167e95f5d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.355798 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/178a9ae2-1774-4025-8951-93167e95f5d7-config-data" (OuterVolumeSpecName: "config-data") pod "178a9ae2-1774-4025-8951-93167e95f5d7" (UID: "178a9ae2-1774-4025-8951-93167e95f5d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.359210 4918 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.359367 4918 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d51f6764-e413-494d-9e7a-c5583dec6a22" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d51f6764-e413-494d-9e7a-c5583dec6a22") on node "crc" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.378144 4918 scope.go:117] "RemoveContainer" containerID="dc38a6c1fb8ac5da62997f1e892f5e7db87834e40181a874ebadb17fb75aa042" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.386099 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/178a9ae2-1774-4025-8951-93167e95f5d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.386129 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/178a9ae2-1774-4025-8951-93167e95f5d7-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.386139 4918 reconciler_common.go:293] "Volume detached for volume \"pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.395328 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.407813 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eee99f54-a76f-416d-a14f-cebf9d11548b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eee99f54-a76f-416d-a14f-cebf9d11548b" (UID: "eee99f54-a76f-416d-a14f-cebf9d11548b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.427970 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.429514 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0825706f-8acf-485c-82dd-d5c672b187e8-config" (OuterVolumeSpecName: "config") pod "0825706f-8acf-485c-82dd-d5c672b187e8" (UID: "0825706f-8acf-485c-82dd-d5c672b187e8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.435682 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0825706f-8acf-485c-82dd-d5c672b187e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0825706f-8acf-485c-82dd-d5c672b187e8" (UID: "0825706f-8acf-485c-82dd-d5c672b187e8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.445593 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:01:35 crc kubenswrapper[4918]: E0319 17:01:35.446132 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78d493df-532a-4203-9264-86e66bf964f0" containerName="ceilometer-central-agent" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.446147 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="78d493df-532a-4203-9264-86e66bf964f0" containerName="ceilometer-central-agent" Mar 19 17:01:35 crc kubenswrapper[4918]: E0319 17:01:35.446161 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="178a9ae2-1774-4025-8951-93167e95f5d7" containerName="cinder-api-log" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.446168 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="178a9ae2-1774-4025-8951-93167e95f5d7" containerName="cinder-api-log" Mar 19 17:01:35 crc kubenswrapper[4918]: E0319 17:01:35.446180 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0825706f-8acf-485c-82dd-d5c672b187e8" containerName="neutron-api" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.446187 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="0825706f-8acf-485c-82dd-d5c672b187e8" containerName="neutron-api" Mar 19 17:01:35 crc kubenswrapper[4918]: E0319 17:01:35.446200 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee99f54-a76f-416d-a14f-cebf9d11548b" containerName="glance-httpd" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.446206 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee99f54-a76f-416d-a14f-cebf9d11548b" containerName="glance-httpd" Mar 19 17:01:35 crc kubenswrapper[4918]: E0319 17:01:35.446218 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0825706f-8acf-485c-82dd-d5c672b187e8" containerName="neutron-httpd" Mar 19 
17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.446224 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="0825706f-8acf-485c-82dd-d5c672b187e8" containerName="neutron-httpd" Mar 19 17:01:35 crc kubenswrapper[4918]: E0319 17:01:35.446238 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78d493df-532a-4203-9264-86e66bf964f0" containerName="proxy-httpd" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.446244 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="78d493df-532a-4203-9264-86e66bf964f0" containerName="proxy-httpd" Mar 19 17:01:35 crc kubenswrapper[4918]: E0319 17:01:35.446252 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="178a9ae2-1774-4025-8951-93167e95f5d7" containerName="cinder-api" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.446257 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="178a9ae2-1774-4025-8951-93167e95f5d7" containerName="cinder-api" Mar 19 17:01:35 crc kubenswrapper[4918]: E0319 17:01:35.446276 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee99f54-a76f-416d-a14f-cebf9d11548b" containerName="glance-log" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.446282 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee99f54-a76f-416d-a14f-cebf9d11548b" containerName="glance-log" Mar 19 17:01:35 crc kubenswrapper[4918]: E0319 17:01:35.446295 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78d493df-532a-4203-9264-86e66bf964f0" containerName="ceilometer-notification-agent" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.446301 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="78d493df-532a-4203-9264-86e66bf964f0" containerName="ceilometer-notification-agent" Mar 19 17:01:35 crc kubenswrapper[4918]: E0319 17:01:35.446314 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78d493df-532a-4203-9264-86e66bf964f0" containerName="sg-core" Mar 19 17:01:35 crc 
kubenswrapper[4918]: I0319 17:01:35.446330 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="78d493df-532a-4203-9264-86e66bf964f0" containerName="sg-core" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.446508 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="78d493df-532a-4203-9264-86e66bf964f0" containerName="proxy-httpd" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.446540 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="78d493df-532a-4203-9264-86e66bf964f0" containerName="ceilometer-notification-agent" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.446555 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="eee99f54-a76f-416d-a14f-cebf9d11548b" containerName="glance-httpd" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.446574 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="178a9ae2-1774-4025-8951-93167e95f5d7" containerName="cinder-api-log" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.446587 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="78d493df-532a-4203-9264-86e66bf964f0" containerName="ceilometer-central-agent" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.446601 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="0825706f-8acf-485c-82dd-d5c672b187e8" containerName="neutron-httpd" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.446614 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="eee99f54-a76f-416d-a14f-cebf9d11548b" containerName="glance-log" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.446626 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="78d493df-532a-4203-9264-86e66bf964f0" containerName="sg-core" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.446637 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="0825706f-8acf-485c-82dd-d5c672b187e8" containerName="neutron-api" Mar 19 
17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.446650 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="178a9ae2-1774-4025-8951-93167e95f5d7" containerName="cinder-api" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.451951 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.454308 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.457542 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.457966 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.462427 4918 scope.go:117] "RemoveContainer" containerID="0c9edc6b8da1c8f4395f86c0760b11a1d0e50157fc2e91f2b5e58852148640c8" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.495869 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0825706f-8acf-485c-82dd-d5c672b187e8-config\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.495901 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0825706f-8acf-485c-82dd-d5c672b187e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.495914 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee99f54-a76f-416d-a14f-cebf9d11548b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.502592 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/eee99f54-a76f-416d-a14f-cebf9d11548b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "eee99f54-a76f-416d-a14f-cebf9d11548b" (UID: "eee99f54-a76f-416d-a14f-cebf9d11548b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.509031 4918 scope.go:117] "RemoveContainer" containerID="d3cb418db6fdfe2f8e5fcd5661364f8f04b064e798a1a7006adf3e06922b7e9e" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.510629 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0825706f-8acf-485c-82dd-d5c672b187e8-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0825706f-8acf-485c-82dd-d5c672b187e8" (UID: "0825706f-8acf-485c-82dd-d5c672b187e8"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.543286 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eee99f54-a76f-416d-a14f-cebf9d11548b-config-data" (OuterVolumeSpecName: "config-data") pod "eee99f54-a76f-416d-a14f-cebf9d11548b" (UID: "eee99f54-a76f-416d-a14f-cebf9d11548b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.545335 4918 scope.go:117] "RemoveContainer" containerID="abfc046338e9ac7084231b22015e5bd9c6862373c4e995d47fd5e77fb3d0cf70" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.571241 4918 scope.go:117] "RemoveContainer" containerID="738ad7ddff566c1bcf4fae1c9adc62034094f6f32e93bf3022b7ccefa9f282b0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.592265 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.597994 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7472445-a896-40fb-a6d8-0d893db4fa45-run-httpd\") pod \"ceilometer-0\" (UID: \"b7472445-a896-40fb-a6d8-0d893db4fa45\") " pod="openstack/ceilometer-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.598113 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7472445-a896-40fb-a6d8-0d893db4fa45-log-httpd\") pod \"ceilometer-0\" (UID: \"b7472445-a896-40fb-a6d8-0d893db4fa45\") " pod="openstack/ceilometer-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.598132 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7472445-a896-40fb-a6d8-0d893db4fa45-scripts\") pod \"ceilometer-0\" (UID: \"b7472445-a896-40fb-a6d8-0d893db4fa45\") " pod="openstack/ceilometer-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.598150 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7472445-a896-40fb-a6d8-0d893db4fa45-config-data\") pod \"ceilometer-0\" (UID: 
\"b7472445-a896-40fb-a6d8-0d893db4fa45\") " pod="openstack/ceilometer-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.598207 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7472445-a896-40fb-a6d8-0d893db4fa45-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b7472445-a896-40fb-a6d8-0d893db4fa45\") " pod="openstack/ceilometer-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.598243 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7472445-a896-40fb-a6d8-0d893db4fa45-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b7472445-a896-40fb-a6d8-0d893db4fa45\") " pod="openstack/ceilometer-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.598266 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm5w5\" (UniqueName: \"kubernetes.io/projected/b7472445-a896-40fb-a6d8-0d893db4fa45-kube-api-access-wm5w5\") pod \"ceilometer-0\" (UID: \"b7472445-a896-40fb-a6d8-0d893db4fa45\") " pod="openstack/ceilometer-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.598332 4918 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee99f54-a76f-416d-a14f-cebf9d11548b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.598343 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eee99f54-a76f-416d-a14f-cebf9d11548b-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.598352 4918 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0825706f-8acf-485c-82dd-d5c672b187e8-ovndb-tls-certs\") on node 
\"crc\" DevicePath \"\"" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.612559 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.643651 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.652889 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.660918 4918 scope.go:117] "RemoveContainer" containerID="09cf7a30a722680444026776007f6b7f843254948f2473f31360c0f51c8d7dc4" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.661281 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.661497 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.662052 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.665172 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.682053 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-10c1-account-create-update-qpxpb"] Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.696558 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-6z7jz"] Mar 19 17:01:35 crc kubenswrapper[4918]: W0319 17:01:35.698803 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b2236d3_b352_4383_b099_0a1b39bdf222.slice/crio-3dda72ae2c0c3b8c316af3c49fbea4b0a45c0e2d4059a59b0a0cdcbf7742ee75 WatchSource:0}: Error finding container 3dda72ae2c0c3b8c316af3c49fbea4b0a45c0e2d4059a59b0a0cdcbf7742ee75: Status 404 returned error can't find the container with id 3dda72ae2c0c3b8c316af3c49fbea4b0a45c0e2d4059a59b0a0cdcbf7742ee75 Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.700044 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7472445-a896-40fb-a6d8-0d893db4fa45-log-httpd\") pod \"ceilometer-0\" (UID: \"b7472445-a896-40fb-a6d8-0d893db4fa45\") " pod="openstack/ceilometer-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.700087 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7472445-a896-40fb-a6d8-0d893db4fa45-scripts\") pod \"ceilometer-0\" (UID: \"b7472445-a896-40fb-a6d8-0d893db4fa45\") " pod="openstack/ceilometer-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.700110 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7472445-a896-40fb-a6d8-0d893db4fa45-config-data\") pod \"ceilometer-0\" (UID: \"b7472445-a896-40fb-a6d8-0d893db4fa45\") " pod="openstack/ceilometer-0" Mar 19 17:01:35 crc 
kubenswrapper[4918]: I0319 17:01:35.700185 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7472445-a896-40fb-a6d8-0d893db4fa45-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b7472445-a896-40fb-a6d8-0d893db4fa45\") " pod="openstack/ceilometer-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.700236 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7472445-a896-40fb-a6d8-0d893db4fa45-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b7472445-a896-40fb-a6d8-0d893db4fa45\") " pod="openstack/ceilometer-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.700274 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm5w5\" (UniqueName: \"kubernetes.io/projected/b7472445-a896-40fb-a6d8-0d893db4fa45-kube-api-access-wm5w5\") pod \"ceilometer-0\" (UID: \"b7472445-a896-40fb-a6d8-0d893db4fa45\") " pod="openstack/ceilometer-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.700328 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7472445-a896-40fb-a6d8-0d893db4fa45-run-httpd\") pod \"ceilometer-0\" (UID: \"b7472445-a896-40fb-a6d8-0d893db4fa45\") " pod="openstack/ceilometer-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.701308 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7472445-a896-40fb-a6d8-0d893db4fa45-run-httpd\") pod \"ceilometer-0\" (UID: \"b7472445-a896-40fb-a6d8-0d893db4fa45\") " pod="openstack/ceilometer-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.710403 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7472445-a896-40fb-a6d8-0d893db4fa45-log-httpd\") 
pod \"ceilometer-0\" (UID: \"b7472445-a896-40fb-a6d8-0d893db4fa45\") " pod="openstack/ceilometer-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.722810 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7472445-a896-40fb-a6d8-0d893db4fa45-scripts\") pod \"ceilometer-0\" (UID: \"b7472445-a896-40fb-a6d8-0d893db4fa45\") " pod="openstack/ceilometer-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.723676 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7472445-a896-40fb-a6d8-0d893db4fa45-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b7472445-a896-40fb-a6d8-0d893db4fa45\") " pod="openstack/ceilometer-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.724279 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7472445-a896-40fb-a6d8-0d893db4fa45-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b7472445-a896-40fb-a6d8-0d893db4fa45\") " pod="openstack/ceilometer-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.725095 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7472445-a896-40fb-a6d8-0d893db4fa45-config-data\") pod \"ceilometer-0\" (UID: \"b7472445-a896-40fb-a6d8-0d893db4fa45\") " pod="openstack/ceilometer-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.741620 4918 scope.go:117] "RemoveContainer" containerID="ec9fe3e21f4df9bdfaa42364eed81825c8d6dc7a894fa20c56160687357261c2" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.743161 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm5w5\" (UniqueName: \"kubernetes.io/projected/b7472445-a896-40fb-a6d8-0d893db4fa45-kube-api-access-wm5w5\") pod \"ceilometer-0\" (UID: \"b7472445-a896-40fb-a6d8-0d893db4fa45\") " 
pod="openstack/ceilometer-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.757739 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.775773 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.804465 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.806544 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sndc\" (UniqueName: \"kubernetes.io/projected/1818f96e-6152-49a9-b6fc-726d7677112c-kube-api-access-7sndc\") pod \"1818f96e-6152-49a9-b6fc-726d7677112c\" (UID: \"1818f96e-6152-49a9-b6fc-726d7677112c\") " Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.806826 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd884183-4c9b-4bc9-94f9-7fc63ddfd344-logs\") pod \"glance-default-external-api-0\" (UID: \"fd884183-4c9b-4bc9-94f9-7fc63ddfd344\") " pod="openstack/glance-default-external-api-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.806888 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd884183-4c9b-4bc9-94f9-7fc63ddfd344-config-data\") pod \"glance-default-external-api-0\" (UID: \"fd884183-4c9b-4bc9-94f9-7fc63ddfd344\") " pod="openstack/glance-default-external-api-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.806907 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd884183-4c9b-4bc9-94f9-7fc63ddfd344-scripts\") pod \"glance-default-external-api-0\" (UID: \"fd884183-4c9b-4bc9-94f9-7fc63ddfd344\") " 
pod="openstack/glance-default-external-api-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.806933 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fd884183-4c9b-4bc9-94f9-7fc63ddfd344-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fd884183-4c9b-4bc9-94f9-7fc63ddfd344\") " pod="openstack/glance-default-external-api-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.806955 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd884183-4c9b-4bc9-94f9-7fc63ddfd344-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fd884183-4c9b-4bc9-94f9-7fc63ddfd344\") " pod="openstack/glance-default-external-api-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.806984 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd884183-4c9b-4bc9-94f9-7fc63ddfd344-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fd884183-4c9b-4bc9-94f9-7fc63ddfd344\") " pod="openstack/glance-default-external-api-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.807002 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvc7z\" (UniqueName: \"kubernetes.io/projected/fd884183-4c9b-4bc9-94f9-7fc63ddfd344-kube-api-access-jvc7z\") pod \"glance-default-external-api-0\" (UID: \"fd884183-4c9b-4bc9-94f9-7fc63ddfd344\") " pod="openstack/glance-default-external-api-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.807053 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\") pod \"glance-default-external-api-0\" (UID: \"fd884183-4c9b-4bc9-94f9-7fc63ddfd344\") " pod="openstack/glance-default-external-api-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.848740 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1818f96e-6152-49a9-b6fc-726d7677112c-kube-api-access-7sndc" (OuterVolumeSpecName: "kube-api-access-7sndc") pod "1818f96e-6152-49a9-b6fc-726d7677112c" (UID: "1818f96e-6152-49a9-b6fc-726d7677112c"). InnerVolumeSpecName "kube-api-access-7sndc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.870378 4918 scope.go:117] "RemoveContainer" containerID="d5c22a25ed50a8f736348edc2e40564c0f01803e56ab5de8d3a634c6669ce834" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.901663 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 19 17:01:35 crc kubenswrapper[4918]: E0319 17:01:35.902300 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1818f96e-6152-49a9-b6fc-726d7677112c" containerName="kube-state-metrics" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.902370 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="1818f96e-6152-49a9-b6fc-726d7677112c" containerName="kube-state-metrics" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.902656 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="1818f96e-6152-49a9-b6fc-726d7677112c" containerName="kube-state-metrics" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.903753 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.910793 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd884183-4c9b-4bc9-94f9-7fc63ddfd344-logs\") pod \"glance-default-external-api-0\" (UID: \"fd884183-4c9b-4bc9-94f9-7fc63ddfd344\") " pod="openstack/glance-default-external-api-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.913187 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd884183-4c9b-4bc9-94f9-7fc63ddfd344-config-data\") pod \"glance-default-external-api-0\" (UID: \"fd884183-4c9b-4bc9-94f9-7fc63ddfd344\") " pod="openstack/glance-default-external-api-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.913290 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd884183-4c9b-4bc9-94f9-7fc63ddfd344-scripts\") pod \"glance-default-external-api-0\" (UID: \"fd884183-4c9b-4bc9-94f9-7fc63ddfd344\") " pod="openstack/glance-default-external-api-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.913396 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fd884183-4c9b-4bc9-94f9-7fc63ddfd344-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fd884183-4c9b-4bc9-94f9-7fc63ddfd344\") " pod="openstack/glance-default-external-api-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.913497 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd884183-4c9b-4bc9-94f9-7fc63ddfd344-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fd884183-4c9b-4bc9-94f9-7fc63ddfd344\") " pod="openstack/glance-default-external-api-0" Mar 19 17:01:35 crc kubenswrapper[4918]: 
I0319 17:01:35.913629 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd884183-4c9b-4bc9-94f9-7fc63ddfd344-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fd884183-4c9b-4bc9-94f9-7fc63ddfd344\") " pod="openstack/glance-default-external-api-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.913722 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvc7z\" (UniqueName: \"kubernetes.io/projected/fd884183-4c9b-4bc9-94f9-7fc63ddfd344-kube-api-access-jvc7z\") pod \"glance-default-external-api-0\" (UID: \"fd884183-4c9b-4bc9-94f9-7fc63ddfd344\") " pod="openstack/glance-default-external-api-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.913891 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\") pod \"glance-default-external-api-0\" (UID: \"fd884183-4c9b-4bc9-94f9-7fc63ddfd344\") " pod="openstack/glance-default-external-api-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.914884 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sndc\" (UniqueName: \"kubernetes.io/projected/1818f96e-6152-49a9-b6fc-726d7677112c-kube-api-access-7sndc\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.911132 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd884183-4c9b-4bc9-94f9-7fc63ddfd344-logs\") pod \"glance-default-external-api-0\" (UID: \"fd884183-4c9b-4bc9-94f9-7fc63ddfd344\") " pod="openstack/glance-default-external-api-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.914391 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/fd884183-4c9b-4bc9-94f9-7fc63ddfd344-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fd884183-4c9b-4bc9-94f9-7fc63ddfd344\") " pod="openstack/glance-default-external-api-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.926547 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.926853 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.927634 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.943468 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd884183-4c9b-4bc9-94f9-7fc63ddfd344-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fd884183-4c9b-4bc9-94f9-7fc63ddfd344\") " pod="openstack/glance-default-external-api-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.944391 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd884183-4c9b-4bc9-94f9-7fc63ddfd344-config-data\") pod \"glance-default-external-api-0\" (UID: \"fd884183-4c9b-4bc9-94f9-7fc63ddfd344\") " pod="openstack/glance-default-external-api-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.948315 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd884183-4c9b-4bc9-94f9-7fc63ddfd344-scripts\") pod \"glance-default-external-api-0\" (UID: \"fd884183-4c9b-4bc9-94f9-7fc63ddfd344\") " pod="openstack/glance-default-external-api-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.953903 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd884183-4c9b-4bc9-94f9-7fc63ddfd344-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fd884183-4c9b-4bc9-94f9-7fc63ddfd344\") " pod="openstack/glance-default-external-api-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.971221 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvc7z\" (UniqueName: \"kubernetes.io/projected/fd884183-4c9b-4bc9-94f9-7fc63ddfd344-kube-api-access-jvc7z\") pod \"glance-default-external-api-0\" (UID: \"fd884183-4c9b-4bc9-94f9-7fc63ddfd344\") " pod="openstack/glance-default-external-api-0" Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.986828 4918 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 17:01:35 crc kubenswrapper[4918]: I0319 17:01:35.986865 4918 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\") pod \"glance-default-external-api-0\" (UID: \"fd884183-4c9b-4bc9-94f9-7fc63ddfd344\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/342db21bb7c2e49b22a24134653a3d87a173d64abb89a0070323b0a8e0ff9956/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.028817 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.029551 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de4932f1-1f41-4512-9dd6-408a095de14a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"de4932f1-1f41-4512-9dd6-408a095de14a\") " pod="openstack/cinder-api-0" Mar 19 17:01:36 crc 
kubenswrapper[4918]: I0319 17:01:36.029632 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de4932f1-1f41-4512-9dd6-408a095de14a-config-data\") pod \"cinder-api-0\" (UID: \"de4932f1-1f41-4512-9dd6-408a095de14a\") " pod="openstack/cinder-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.029702 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd48z\" (UniqueName: \"kubernetes.io/projected/de4932f1-1f41-4512-9dd6-408a095de14a-kube-api-access-nd48z\") pod \"cinder-api-0\" (UID: \"de4932f1-1f41-4512-9dd6-408a095de14a\") " pod="openstack/cinder-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.029793 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de4932f1-1f41-4512-9dd6-408a095de14a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"de4932f1-1f41-4512-9dd6-408a095de14a\") " pod="openstack/cinder-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.029875 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de4932f1-1f41-4512-9dd6-408a095de14a-scripts\") pod \"cinder-api-0\" (UID: \"de4932f1-1f41-4512-9dd6-408a095de14a\") " pod="openstack/cinder-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.029958 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de4932f1-1f41-4512-9dd6-408a095de14a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"de4932f1-1f41-4512-9dd6-408a095de14a\") " pod="openstack/cinder-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.030004 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de4932f1-1f41-4512-9dd6-408a095de14a-logs\") pod \"cinder-api-0\" (UID: \"de4932f1-1f41-4512-9dd6-408a095de14a\") " pod="openstack/cinder-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.030034 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de4932f1-1f41-4512-9dd6-408a095de14a-config-data-custom\") pod \"cinder-api-0\" (UID: \"de4932f1-1f41-4512-9dd6-408a095de14a\") " pod="openstack/cinder-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.030125 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de4932f1-1f41-4512-9dd6-408a095de14a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"de4932f1-1f41-4512-9dd6-408a095de14a\") " pod="openstack/cinder-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.102282 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-fbfd-account-create-update-cvxhj"] Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.129985 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-w4vzm"] Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.142810 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de4932f1-1f41-4512-9dd6-408a095de14a-config-data-custom\") pod \"cinder-api-0\" (UID: \"de4932f1-1f41-4512-9dd6-408a095de14a\") " pod="openstack/cinder-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.143083 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de4932f1-1f41-4512-9dd6-408a095de14a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"de4932f1-1f41-4512-9dd6-408a095de14a\") " 
pod="openstack/cinder-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.143206 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de4932f1-1f41-4512-9dd6-408a095de14a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"de4932f1-1f41-4512-9dd6-408a095de14a\") " pod="openstack/cinder-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.143240 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de4932f1-1f41-4512-9dd6-408a095de14a-config-data\") pod \"cinder-api-0\" (UID: \"de4932f1-1f41-4512-9dd6-408a095de14a\") " pod="openstack/cinder-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.143283 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd48z\" (UniqueName: \"kubernetes.io/projected/de4932f1-1f41-4512-9dd6-408a095de14a-kube-api-access-nd48z\") pod \"cinder-api-0\" (UID: \"de4932f1-1f41-4512-9dd6-408a095de14a\") " pod="openstack/cinder-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.143400 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de4932f1-1f41-4512-9dd6-408a095de14a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"de4932f1-1f41-4512-9dd6-408a095de14a\") " pod="openstack/cinder-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.143451 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de4932f1-1f41-4512-9dd6-408a095de14a-scripts\") pod \"cinder-api-0\" (UID: \"de4932f1-1f41-4512-9dd6-408a095de14a\") " pod="openstack/cinder-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.143468 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/de4932f1-1f41-4512-9dd6-408a095de14a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"de4932f1-1f41-4512-9dd6-408a095de14a\") " pod="openstack/cinder-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.143600 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de4932f1-1f41-4512-9dd6-408a095de14a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"de4932f1-1f41-4512-9dd6-408a095de14a\") " pod="openstack/cinder-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.143676 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de4932f1-1f41-4512-9dd6-408a095de14a-logs\") pod \"cinder-api-0\" (UID: \"de4932f1-1f41-4512-9dd6-408a095de14a\") " pod="openstack/cinder-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.144175 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de4932f1-1f41-4512-9dd6-408a095de14a-logs\") pod \"cinder-api-0\" (UID: \"de4932f1-1f41-4512-9dd6-408a095de14a\") " pod="openstack/cinder-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.150130 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de4932f1-1f41-4512-9dd6-408a095de14a-config-data-custom\") pod \"cinder-api-0\" (UID: \"de4932f1-1f41-4512-9dd6-408a095de14a\") " pod="openstack/cinder-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.152199 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de4932f1-1f41-4512-9dd6-408a095de14a-scripts\") pod \"cinder-api-0\" (UID: \"de4932f1-1f41-4512-9dd6-408a095de14a\") " pod="openstack/cinder-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.155707 4918 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-api-db-create-s8jqb"] Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.157354 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de4932f1-1f41-4512-9dd6-408a095de14a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"de4932f1-1f41-4512-9dd6-408a095de14a\") " pod="openstack/cinder-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.157362 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de4932f1-1f41-4512-9dd6-408a095de14a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"de4932f1-1f41-4512-9dd6-408a095de14a\") " pod="openstack/cinder-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.158148 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de4932f1-1f41-4512-9dd6-408a095de14a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"de4932f1-1f41-4512-9dd6-408a095de14a\") " pod="openstack/cinder-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.160662 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de4932f1-1f41-4512-9dd6-408a095de14a-config-data\") pod \"cinder-api-0\" (UID: \"de4932f1-1f41-4512-9dd6-408a095de14a\") " pod="openstack/cinder-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.163784 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd48z\" (UniqueName: \"kubernetes.io/projected/de4932f1-1f41-4512-9dd6-408a095de14a-kube-api-access-nd48z\") pod \"cinder-api-0\" (UID: \"de4932f1-1f41-4512-9dd6-408a095de14a\") " pod="openstack/cinder-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.200541 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d51f6764-e413-494d-9e7a-c5583dec6a22\") pod \"glance-default-external-api-0\" (UID: \"fd884183-4c9b-4bc9-94f9-7fc63ddfd344\") " pod="openstack/glance-default-external-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.214404 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.237625 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5bb7fd774d-vnxdq"] Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.262768 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5bb7fd774d-vnxdq"] Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.324355 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.380932 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-s8jqb" event={"ID":"7c57e4da-677f-401e-b718-b25a5678a352","Type":"ContainerStarted","Data":"9a82cb9abd4fa9e92605645cd1069ad2b856b07b9ca0fc243fc5d298a2900a8d"} Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.385361 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-10c1-account-create-update-qpxpb" event={"ID":"928c9b0d-6f27-4359-8b86-794d73ea9cd5","Type":"ContainerStarted","Data":"801ff444204497c12ffc78f2cff6e95c81375619bfcdf7dc26556893b1c6e9d7"} Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.385403 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-10c1-account-create-update-qpxpb" event={"ID":"928c9b0d-6f27-4359-8b86-794d73ea9cd5","Type":"ContainerStarted","Data":"73ea54e1bd56e00aa95b455f95c39936adb3e565945232ff81e301c52085ab49"} Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.388785 4918 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-cell1-db-create-w4vzm" event={"ID":"8ceabd21-35af-489d-abc9-b4e8b629efd9","Type":"ContainerStarted","Data":"46f04d846235194b8896a243f98ad24bc189eb517d533755555d14c6ab505920"} Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.390780 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6z7jz" event={"ID":"4b2236d3-b352-4383-b099-0a1b39bdf222","Type":"ContainerStarted","Data":"3dda72ae2c0c3b8c316af3c49fbea4b0a45c0e2d4059a59b0a0cdcbf7742ee75"} Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.393965 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1eef-account-create-update-mz92v" event={"ID":"d53b74e8-6505-42fc-bbb6-9f9f2b96f747","Type":"ContainerStarted","Data":"51c2aee85b6d862aaae79e36e07c279709e0c6b653105aa85e70feda2a48f452"} Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.393998 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1eef-account-create-update-mz92v" event={"ID":"d53b74e8-6505-42fc-bbb6-9f9f2b96f747","Type":"ContainerStarted","Data":"f1d9694828c08fb860a80a9c9ec173ecca48897ac70a8741c5e1039230231c1a"} Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.395237 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fbfd-account-create-update-cvxhj" event={"ID":"00939ae6-93f7-437d-904c-53eaf4c4fc52","Type":"ContainerStarted","Data":"9685cfc9499874d004d0f9449e960f8c308a8e5f774999fa93f9324b99c700ab"} Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.404641 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-10c1-account-create-update-qpxpb" podStartSLOduration=3.404623766 podStartE2EDuration="3.404623766s" podCreationTimestamp="2026-03-19 17:01:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:01:36.398350545 +0000 UTC 
m=+1308.520549793" watchObservedRunningTime="2026-03-19 17:01:36.404623766 +0000 UTC m=+1308.526823014" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.406033 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1818f96e-6152-49a9-b6fc-726d7677112c","Type":"ContainerDied","Data":"deb37330da6ea86c05534aee9ad5c8a356cebcef93fed0788365e7a6f2d30a5a"} Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.406091 4918 scope.go:117] "RemoveContainer" containerID="383cee5d641523407392efe99ff20a4a5996ffdb2fa9e7b2a15c5f4fa7771f84" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.405891 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.427911 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-1eef-account-create-update-mz92v" podStartSLOduration=4.427894414 podStartE2EDuration="4.427894414s" podCreationTimestamp="2026-03-19 17:01:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:01:36.411290569 +0000 UTC m=+1308.533489817" watchObservedRunningTime="2026-03-19 17:01:36.427894414 +0000 UTC m=+1308.550093662" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.437241 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-6z7jz" podStartSLOduration=4.4372245 podStartE2EDuration="4.4372245s" podCreationTimestamp="2026-03-19 17:01:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:01:36.435538144 +0000 UTC m=+1308.557737412" watchObservedRunningTime="2026-03-19 17:01:36.4372245 +0000 UTC m=+1308.559423748" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.521077 4918 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.647254 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0825706f-8acf-485c-82dd-d5c672b187e8" path="/var/lib/kubelet/pods/0825706f-8acf-485c-82dd-d5c672b187e8/volumes" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.651237 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="178a9ae2-1774-4025-8951-93167e95f5d7" path="/var/lib/kubelet/pods/178a9ae2-1774-4025-8951-93167e95f5d7/volumes" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.652498 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78d493df-532a-4203-9264-86e66bf964f0" path="/var/lib/kubelet/pods/78d493df-532a-4203-9264-86e66bf964f0/volumes" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.654717 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eee99f54-a76f-416d-a14f-cebf9d11548b" path="/var/lib/kubelet/pods/eee99f54-a76f-416d-a14f-cebf9d11548b/volumes" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.658326 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.658829 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.665099 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.676569 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.676617 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.676741 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.690048 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.717782 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7wnz\" (UniqueName: \"kubernetes.io/projected/e77f7369-22cd-4f5b-afb8-132004eb811f-kube-api-access-h7wnz\") pod \"kube-state-metrics-0\" (UID: \"e77f7369-22cd-4f5b-afb8-132004eb811f\") " pod="openstack/kube-state-metrics-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.717912 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e77f7369-22cd-4f5b-afb8-132004eb811f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e77f7369-22cd-4f5b-afb8-132004eb811f\") " pod="openstack/kube-state-metrics-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.718029 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e77f7369-22cd-4f5b-afb8-132004eb811f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e77f7369-22cd-4f5b-afb8-132004eb811f\") " pod="openstack/kube-state-metrics-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.718083 
4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e77f7369-22cd-4f5b-afb8-132004eb811f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e77f7369-22cd-4f5b-afb8-132004eb811f\") " pod="openstack/kube-state-metrics-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.820307 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7wnz\" (UniqueName: \"kubernetes.io/projected/e77f7369-22cd-4f5b-afb8-132004eb811f-kube-api-access-h7wnz\") pod \"kube-state-metrics-0\" (UID: \"e77f7369-22cd-4f5b-afb8-132004eb811f\") " pod="openstack/kube-state-metrics-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.820419 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e77f7369-22cd-4f5b-afb8-132004eb811f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e77f7369-22cd-4f5b-afb8-132004eb811f\") " pod="openstack/kube-state-metrics-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.820541 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e77f7369-22cd-4f5b-afb8-132004eb811f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e77f7369-22cd-4f5b-afb8-132004eb811f\") " pod="openstack/kube-state-metrics-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.820590 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e77f7369-22cd-4f5b-afb8-132004eb811f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e77f7369-22cd-4f5b-afb8-132004eb811f\") " pod="openstack/kube-state-metrics-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.831279 4918 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e77f7369-22cd-4f5b-afb8-132004eb811f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e77f7369-22cd-4f5b-afb8-132004eb811f\") " pod="openstack/kube-state-metrics-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.831698 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e77f7369-22cd-4f5b-afb8-132004eb811f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e77f7369-22cd-4f5b-afb8-132004eb811f\") " pod="openstack/kube-state-metrics-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.832748 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e77f7369-22cd-4f5b-afb8-132004eb811f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e77f7369-22cd-4f5b-afb8-132004eb811f\") " pod="openstack/kube-state-metrics-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.845133 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7wnz\" (UniqueName: \"kubernetes.io/projected/e77f7369-22cd-4f5b-afb8-132004eb811f-kube-api-access-h7wnz\") pod \"kube-state-metrics-0\" (UID: \"e77f7369-22cd-4f5b-afb8-132004eb811f\") " pod="openstack/kube-state-metrics-0" Mar 19 17:01:36 crc kubenswrapper[4918]: I0319 17:01:36.956763 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 19 17:01:37 crc kubenswrapper[4918]: I0319 17:01:36.999367 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 17:01:37 crc kubenswrapper[4918]: I0319 17:01:37.015132 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="d659099f-7a05-4f1c-a097-67ecce42275d" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.172:9292/healthcheck\": read tcp 10.217.0.2:50498->10.217.0.172:9292: read: connection reset by peer" Mar 19 17:01:37 crc kubenswrapper[4918]: I0319 17:01:37.015459 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="d659099f-7a05-4f1c-a097-67ecce42275d" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.172:9292/healthcheck\": read tcp 10.217.0.2:50496->10.217.0.172:9292: read: connection reset by peer" Mar 19 17:01:37 crc kubenswrapper[4918]: I0319 17:01:37.306117 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 17:01:37 crc kubenswrapper[4918]: I0319 17:01:37.447127 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6z7jz" event={"ID":"4b2236d3-b352-4383-b099-0a1b39bdf222","Type":"ContainerDied","Data":"e4771b68ab4d8353e4b84cea1cc82cbe3b9586758f22f67fe88f0bdd1cf0ec3a"} Mar 19 17:01:37 crc kubenswrapper[4918]: I0319 17:01:37.447259 4918 generic.go:334] "Generic (PLEG): container finished" podID="4b2236d3-b352-4383-b099-0a1b39bdf222" containerID="e4771b68ab4d8353e4b84cea1cc82cbe3b9586758f22f67fe88f0bdd1cf0ec3a" exitCode=0 Mar 19 17:01:37 crc kubenswrapper[4918]: I0319 17:01:37.456430 4918 generic.go:334] "Generic (PLEG): container finished" podID="d53b74e8-6505-42fc-bbb6-9f9f2b96f747" containerID="51c2aee85b6d862aaae79e36e07c279709e0c6b653105aa85e70feda2a48f452" exitCode=0 Mar 19 17:01:37 crc kubenswrapper[4918]: I0319 17:01:37.456485 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-1eef-account-create-update-mz92v" event={"ID":"d53b74e8-6505-42fc-bbb6-9f9f2b96f747","Type":"ContainerDied","Data":"51c2aee85b6d862aaae79e36e07c279709e0c6b653105aa85e70feda2a48f452"} Mar 19 17:01:37 crc kubenswrapper[4918]: I0319 17:01:37.459277 4918 generic.go:334] "Generic (PLEG): container finished" podID="7c57e4da-677f-401e-b718-b25a5678a352" containerID="a7521016cee0928b0ce0ac4ce84e5e555a57beb6493320858df45f9ff3dad306" exitCode=0 Mar 19 17:01:37 crc kubenswrapper[4918]: I0319 17:01:37.459314 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-s8jqb" event={"ID":"7c57e4da-677f-401e-b718-b25a5678a352","Type":"ContainerDied","Data":"a7521016cee0928b0ce0ac4ce84e5e555a57beb6493320858df45f9ff3dad306"} Mar 19 17:01:37 crc kubenswrapper[4918]: I0319 17:01:37.464499 4918 generic.go:334] "Generic (PLEG): container finished" podID="928c9b0d-6f27-4359-8b86-794d73ea9cd5" containerID="801ff444204497c12ffc78f2cff6e95c81375619bfcdf7dc26556893b1c6e9d7" exitCode=0 Mar 19 17:01:37 crc kubenswrapper[4918]: I0319 17:01:37.464567 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-10c1-account-create-update-qpxpb" event={"ID":"928c9b0d-6f27-4359-8b86-794d73ea9cd5","Type":"ContainerDied","Data":"801ff444204497c12ffc78f2cff6e95c81375619bfcdf7dc26556893b1c6e9d7"} Mar 19 17:01:37 crc kubenswrapper[4918]: I0319 17:01:37.466269 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fd884183-4c9b-4bc9-94f9-7fc63ddfd344","Type":"ContainerStarted","Data":"240d0246c8110a97437774dfcecedec95eda3ba854ed515bfb69266f0c223e6d"} Mar 19 17:01:37 crc kubenswrapper[4918]: I0319 17:01:37.467993 4918 generic.go:334] "Generic (PLEG): container finished" podID="d659099f-7a05-4f1c-a097-67ecce42275d" containerID="148a6aa0e62a3e04666a50eb5f97678441eb7e98de6cd8cd34b41fcf9e2708d4" exitCode=0 Mar 19 17:01:37 crc kubenswrapper[4918]: I0319 17:01:37.468036 4918 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d659099f-7a05-4f1c-a097-67ecce42275d","Type":"ContainerDied","Data":"148a6aa0e62a3e04666a50eb5f97678441eb7e98de6cd8cd34b41fcf9e2708d4"} Mar 19 17:01:37 crc kubenswrapper[4918]: I0319 17:01:37.470611 4918 generic.go:334] "Generic (PLEG): container finished" podID="00939ae6-93f7-437d-904c-53eaf4c4fc52" containerID="1c72f339924db22c3d72d056f549194307988b495fdcb9c1a06ef81b0110fb1a" exitCode=0 Mar 19 17:01:37 crc kubenswrapper[4918]: I0319 17:01:37.470652 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fbfd-account-create-update-cvxhj" event={"ID":"00939ae6-93f7-437d-904c-53eaf4c4fc52","Type":"ContainerDied","Data":"1c72f339924db22c3d72d056f549194307988b495fdcb9c1a06ef81b0110fb1a"} Mar 19 17:01:37 crc kubenswrapper[4918]: I0319 17:01:37.475585 4918 generic.go:334] "Generic (PLEG): container finished" podID="8ceabd21-35af-489d-abc9-b4e8b629efd9" containerID="7fb0fce364b7c9a07ce7ecdf71bf4b623d657462eed1ef765c0cbbd82ce919aa" exitCode=0 Mar 19 17:01:37 crc kubenswrapper[4918]: I0319 17:01:37.475623 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-w4vzm" event={"ID":"8ceabd21-35af-489d-abc9-b4e8b629efd9","Type":"ContainerDied","Data":"7fb0fce364b7c9a07ce7ecdf71bf4b623d657462eed1ef765c0cbbd82ce919aa"} Mar 19 17:01:37 crc kubenswrapper[4918]: I0319 17:01:37.486537 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"de4932f1-1f41-4512-9dd6-408a095de14a","Type":"ContainerStarted","Data":"8ccb4c27b16ac083d856f5f9b56544e1babb6186b75b4511c258e320a292e268"} Mar 19 17:01:37 crc kubenswrapper[4918]: I0319 17:01:37.491995 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7472445-a896-40fb-a6d8-0d893db4fa45","Type":"ContainerStarted","Data":"65d7b2e63aac08e542f5c7ade9142bc95a93792f6e0d59407d8c7b7c487a2a8b"} Mar 19 
17:01:37 crc kubenswrapper[4918]: I0319 17:01:37.622329 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 17:01:37 crc kubenswrapper[4918]: W0319 17:01:37.631547 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode77f7369_22cd_4f5b_afb8_132004eb811f.slice/crio-aab0d52623cb65b0539a0557ea4f820471f38fe9827837b6bc395347f8d227e0 WatchSource:0}: Error finding container aab0d52623cb65b0539a0557ea4f820471f38fe9827837b6bc395347f8d227e0: Status 404 returned error can't find the container with id aab0d52623cb65b0539a0557ea4f820471f38fe9827837b6bc395347f8d227e0 Mar 19 17:01:38 crc kubenswrapper[4918]: I0319 17:01:38.506258 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e77f7369-22cd-4f5b-afb8-132004eb811f","Type":"ContainerStarted","Data":"aab0d52623cb65b0539a0557ea4f820471f38fe9827837b6bc395347f8d227e0"} Mar 19 17:01:38 crc kubenswrapper[4918]: I0319 17:01:38.508771 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7472445-a896-40fb-a6d8-0d893db4fa45","Type":"ContainerStarted","Data":"33c485d46e50ab0bce61ef730095f2ec08b2372c31a845190fd32711819d7d36"} Mar 19 17:01:38 crc kubenswrapper[4918]: I0319 17:01:38.508817 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7472445-a896-40fb-a6d8-0d893db4fa45","Type":"ContainerStarted","Data":"a10773354733d273395fc50b877f5033db7d1836ca7e94d73f60b043be2330ce"} Mar 19 17:01:38 crc kubenswrapper[4918]: I0319 17:01:38.510664 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fd884183-4c9b-4bc9-94f9-7fc63ddfd344","Type":"ContainerStarted","Data":"5c13b9dd2a4970775e8e38919cc98a2c6a664ac02003d1d4ad45ff1922c40805"} Mar 19 17:01:38 crc kubenswrapper[4918]: I0319 17:01:38.529085 4918 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"de4932f1-1f41-4512-9dd6-408a095de14a","Type":"ContainerStarted","Data":"c1c296485547e78d3435ed79a0af2bd2e46e32304592cd3728a4307cdae16f8b"} Mar 19 17:01:38 crc kubenswrapper[4918]: I0319 17:01:38.622672 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1818f96e-6152-49a9-b6fc-726d7677112c" path="/var/lib/kubelet/pods/1818f96e-6152-49a9-b6fc-726d7677112c/volumes" Mar 19 17:01:38 crc kubenswrapper[4918]: I0319 17:01:38.647998 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 17:01:38 crc kubenswrapper[4918]: I0319 17:01:38.780295 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d659099f-7a05-4f1c-a097-67ecce42275d-combined-ca-bundle\") pod \"d659099f-7a05-4f1c-a097-67ecce42275d\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " Mar 19 17:01:38 crc kubenswrapper[4918]: I0319 17:01:38.780375 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d659099f-7a05-4f1c-a097-67ecce42275d-config-data\") pod \"d659099f-7a05-4f1c-a097-67ecce42275d\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " Mar 19 17:01:38 crc kubenswrapper[4918]: I0319 17:01:38.780411 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4m9n\" (UniqueName: \"kubernetes.io/projected/d659099f-7a05-4f1c-a097-67ecce42275d-kube-api-access-r4m9n\") pod \"d659099f-7a05-4f1c-a097-67ecce42275d\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " Mar 19 17:01:38 crc kubenswrapper[4918]: I0319 17:01:38.780500 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d659099f-7a05-4f1c-a097-67ecce42275d-internal-tls-certs\") pod 
\"d659099f-7a05-4f1c-a097-67ecce42275d\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " Mar 19 17:01:38 crc kubenswrapper[4918]: I0319 17:01:38.780534 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d659099f-7a05-4f1c-a097-67ecce42275d-scripts\") pod \"d659099f-7a05-4f1c-a097-67ecce42275d\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " Mar 19 17:01:38 crc kubenswrapper[4918]: I0319 17:01:38.780613 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d659099f-7a05-4f1c-a097-67ecce42275d-httpd-run\") pod \"d659099f-7a05-4f1c-a097-67ecce42275d\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " Mar 19 17:01:38 crc kubenswrapper[4918]: I0319 17:01:38.780652 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d659099f-7a05-4f1c-a097-67ecce42275d-logs\") pod \"d659099f-7a05-4f1c-a097-67ecce42275d\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " Mar 19 17:01:38 crc kubenswrapper[4918]: I0319 17:01:38.781011 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\") pod \"d659099f-7a05-4f1c-a097-67ecce42275d\" (UID: \"d659099f-7a05-4f1c-a097-67ecce42275d\") " Mar 19 17:01:38 crc kubenswrapper[4918]: I0319 17:01:38.788919 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d659099f-7a05-4f1c-a097-67ecce42275d-logs" (OuterVolumeSpecName: "logs") pod "d659099f-7a05-4f1c-a097-67ecce42275d" (UID: "d659099f-7a05-4f1c-a097-67ecce42275d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:01:38 crc kubenswrapper[4918]: I0319 17:01:38.789801 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d659099f-7a05-4f1c-a097-67ecce42275d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d659099f-7a05-4f1c-a097-67ecce42275d" (UID: "d659099f-7a05-4f1c-a097-67ecce42275d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:01:38 crc kubenswrapper[4918]: I0319 17:01:38.791860 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d659099f-7a05-4f1c-a097-67ecce42275d-kube-api-access-r4m9n" (OuterVolumeSpecName: "kube-api-access-r4m9n") pod "d659099f-7a05-4f1c-a097-67ecce42275d" (UID: "d659099f-7a05-4f1c-a097-67ecce42275d"). InnerVolumeSpecName "kube-api-access-r4m9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:01:38 crc kubenswrapper[4918]: I0319 17:01:38.793445 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d659099f-7a05-4f1c-a097-67ecce42275d-scripts" (OuterVolumeSpecName: "scripts") pod "d659099f-7a05-4f1c-a097-67ecce42275d" (UID: "d659099f-7a05-4f1c-a097-67ecce42275d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:38 crc kubenswrapper[4918]: I0319 17:01:38.887123 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4m9n\" (UniqueName: \"kubernetes.io/projected/d659099f-7a05-4f1c-a097-67ecce42275d-kube-api-access-r4m9n\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:38 crc kubenswrapper[4918]: I0319 17:01:38.887157 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d659099f-7a05-4f1c-a097-67ecce42275d-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:38 crc kubenswrapper[4918]: I0319 17:01:38.887195 4918 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d659099f-7a05-4f1c-a097-67ecce42275d-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:38 crc kubenswrapper[4918]: I0319 17:01:38.887207 4918 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d659099f-7a05-4f1c-a097-67ecce42275d-logs\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:38 crc kubenswrapper[4918]: I0319 17:01:38.900678 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37" (OuterVolumeSpecName: "glance") pod "d659099f-7a05-4f1c-a097-67ecce42275d" (UID: "d659099f-7a05-4f1c-a097-67ecce42275d"). InnerVolumeSpecName "pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 17:01:38 crc kubenswrapper[4918]: I0319 17:01:38.920622 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:01:38 crc kubenswrapper[4918]: I0319 17:01:38.990392 4918 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\") on node \"crc\" " Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.121770 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d659099f-7a05-4f1c-a097-67ecce42275d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d659099f-7a05-4f1c-a097-67ecce42275d" (UID: "d659099f-7a05-4f1c-a097-67ecce42275d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.124617 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d659099f-7a05-4f1c-a097-67ecce42275d-config-data" (OuterVolumeSpecName: "config-data") pod "d659099f-7a05-4f1c-a097-67ecce42275d" (UID: "d659099f-7a05-4f1c-a097-67ecce42275d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.150120 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d659099f-7a05-4f1c-a097-67ecce42275d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d659099f-7a05-4f1c-a097-67ecce42275d" (UID: "d659099f-7a05-4f1c-a097-67ecce42275d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.154891 4918 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.155044 4918 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37") on node "crc" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.199324 4918 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d659099f-7a05-4f1c-a097-67ecce42275d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.199372 4918 reconciler_common.go:293] "Volume detached for volume \"pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.199387 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d659099f-7a05-4f1c-a097-67ecce42275d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.199400 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d659099f-7a05-4f1c-a097-67ecce42275d-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.410124 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-fbfd-account-create-update-cvxhj" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.503694 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00939ae6-93f7-437d-904c-53eaf4c4fc52-operator-scripts\") pod \"00939ae6-93f7-437d-904c-53eaf4c4fc52\" (UID: \"00939ae6-93f7-437d-904c-53eaf4c4fc52\") " Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.504621 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmj6d\" (UniqueName: \"kubernetes.io/projected/00939ae6-93f7-437d-904c-53eaf4c4fc52-kube-api-access-zmj6d\") pod \"00939ae6-93f7-437d-904c-53eaf4c4fc52\" (UID: \"00939ae6-93f7-437d-904c-53eaf4c4fc52\") " Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.504672 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00939ae6-93f7-437d-904c-53eaf4c4fc52-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "00939ae6-93f7-437d-904c-53eaf4c4fc52" (UID: "00939ae6-93f7-437d-904c-53eaf4c4fc52"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.505914 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-w4vzm" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.510284 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00939ae6-93f7-437d-904c-53eaf4c4fc52-kube-api-access-zmj6d" (OuterVolumeSpecName: "kube-api-access-zmj6d") pod "00939ae6-93f7-437d-904c-53eaf4c4fc52" (UID: "00939ae6-93f7-437d-904c-53eaf4c4fc52"). InnerVolumeSpecName "kube-api-access-zmj6d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.560476 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-6z7jz" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.562823 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-w4vzm" event={"ID":"8ceabd21-35af-489d-abc9-b4e8b629efd9","Type":"ContainerDied","Data":"46f04d846235194b8896a243f98ad24bc189eb517d533755555d14c6ab505920"} Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.562865 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46f04d846235194b8896a243f98ad24bc189eb517d533755555d14c6ab505920" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.562919 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-w4vzm" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.568859 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d659099f-7a05-4f1c-a097-67ecce42275d","Type":"ContainerDied","Data":"712e807e0e7f36a6a3eaa4df041bd324bcdf655090e173964e51bd72ddcbee0d"} Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.568902 4918 scope.go:117] "RemoveContainer" containerID="148a6aa0e62a3e04666a50eb5f97678441eb7e98de6cd8cd34b41fcf9e2708d4" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.569098 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.583396 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fbfd-account-create-update-cvxhj" event={"ID":"00939ae6-93f7-437d-904c-53eaf4c4fc52","Type":"ContainerDied","Data":"9685cfc9499874d004d0f9449e960f8c308a8e5f774999fa93f9324b99c700ab"} Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.583504 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9685cfc9499874d004d0f9449e960f8c308a8e5f774999fa93f9324b99c700ab" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.583641 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fbfd-account-create-update-cvxhj" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.587489 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1eef-account-create-update-mz92v" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.603623 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e77f7369-22cd-4f5b-afb8-132004eb811f","Type":"ContainerStarted","Data":"5f5b17e9a9575e6669eb95bc3bcff7441961466cbbe5c09c0814b87aa8f0d1cc"} Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.607316 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.607915 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lm4wf\" (UniqueName: \"kubernetes.io/projected/8ceabd21-35af-489d-abc9-b4e8b629efd9-kube-api-access-lm4wf\") pod \"8ceabd21-35af-489d-abc9-b4e8b629efd9\" (UID: \"8ceabd21-35af-489d-abc9-b4e8b629efd9\") " Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.608017 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b2236d3-b352-4383-b099-0a1b39bdf222-operator-scripts\") pod \"4b2236d3-b352-4383-b099-0a1b39bdf222\" (UID: \"4b2236d3-b352-4383-b099-0a1b39bdf222\") " Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.608200 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ceabd21-35af-489d-abc9-b4e8b629efd9-operator-scripts\") pod \"8ceabd21-35af-489d-abc9-b4e8b629efd9\" (UID: \"8ceabd21-35af-489d-abc9-b4e8b629efd9\") " Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.609046 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xqmg\" (UniqueName: \"kubernetes.io/projected/4b2236d3-b352-4383-b099-0a1b39bdf222-kube-api-access-8xqmg\") pod \"4b2236d3-b352-4383-b099-0a1b39bdf222\" (UID: \"4b2236d3-b352-4383-b099-0a1b39bdf222\") " Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.610609 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b2236d3-b352-4383-b099-0a1b39bdf222-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4b2236d3-b352-4383-b099-0a1b39bdf222" (UID: "4b2236d3-b352-4383-b099-0a1b39bdf222"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.611036 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-10c1-account-create-update-qpxpb" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.611578 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ceabd21-35af-489d-abc9-b4e8b629efd9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8ceabd21-35af-489d-abc9-b4e8b629efd9" (UID: "8ceabd21-35af-489d-abc9-b4e8b629efd9"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.612855 4918 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b2236d3-b352-4383-b099-0a1b39bdf222-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.612902 4918 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00939ae6-93f7-437d-904c-53eaf4c4fc52-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.612926 4918 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ceabd21-35af-489d-abc9-b4e8b629efd9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.612940 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmj6d\" (UniqueName: \"kubernetes.io/projected/00939ae6-93f7-437d-904c-53eaf4c4fc52-kube-api-access-zmj6d\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.617801 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ceabd21-35af-489d-abc9-b4e8b629efd9-kube-api-access-lm4wf" (OuterVolumeSpecName: "kube-api-access-lm4wf") pod "8ceabd21-35af-489d-abc9-b4e8b629efd9" (UID: "8ceabd21-35af-489d-abc9-b4e8b629efd9"). InnerVolumeSpecName "kube-api-access-lm4wf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.625870 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b2236d3-b352-4383-b099-0a1b39bdf222-kube-api-access-8xqmg" (OuterVolumeSpecName: "kube-api-access-8xqmg") pod "4b2236d3-b352-4383-b099-0a1b39bdf222" (UID: "4b2236d3-b352-4383-b099-0a1b39bdf222"). InnerVolumeSpecName "kube-api-access-8xqmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.647736 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-s8jqb" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.673863 4918 scope.go:117] "RemoveContainer" containerID="5965fde4c7a481302c772ba776ff2a361688359ded9c8328331730d57024f30d" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.696622 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.89302507 podStartE2EDuration="3.696603702s" podCreationTimestamp="2026-03-19 17:01:36 +0000 UTC" firstStartedPulling="2026-03-19 17:01:37.633490248 +0000 UTC m=+1309.755689496" lastFinishedPulling="2026-03-19 17:01:38.43706888 +0000 UTC m=+1310.559268128" observedRunningTime="2026-03-19 17:01:39.62426101 +0000 UTC m=+1311.746460278" watchObservedRunningTime="2026-03-19 17:01:39.696603702 +0000 UTC m=+1311.818802940" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.714687 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d53b74e8-6505-42fc-bbb6-9f9f2b96f747-operator-scripts\") pod \"d53b74e8-6505-42fc-bbb6-9f9f2b96f747\" (UID: \"d53b74e8-6505-42fc-bbb6-9f9f2b96f747\") " Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.714783 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-l7pjs\" (UniqueName: \"kubernetes.io/projected/7c57e4da-677f-401e-b718-b25a5678a352-kube-api-access-l7pjs\") pod \"7c57e4da-677f-401e-b718-b25a5678a352\" (UID: \"7c57e4da-677f-401e-b718-b25a5678a352\") " Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.714817 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c57e4da-677f-401e-b718-b25a5678a352-operator-scripts\") pod \"7c57e4da-677f-401e-b718-b25a5678a352\" (UID: \"7c57e4da-677f-401e-b718-b25a5678a352\") " Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.714901 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmcvm\" (UniqueName: \"kubernetes.io/projected/d53b74e8-6505-42fc-bbb6-9f9f2b96f747-kube-api-access-kmcvm\") pod \"d53b74e8-6505-42fc-bbb6-9f9f2b96f747\" (UID: \"d53b74e8-6505-42fc-bbb6-9f9f2b96f747\") " Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.714930 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/928c9b0d-6f27-4359-8b86-794d73ea9cd5-operator-scripts\") pod \"928c9b0d-6f27-4359-8b86-794d73ea9cd5\" (UID: \"928c9b0d-6f27-4359-8b86-794d73ea9cd5\") " Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.715073 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnmlv\" (UniqueName: \"kubernetes.io/projected/928c9b0d-6f27-4359-8b86-794d73ea9cd5-kube-api-access-nnmlv\") pod \"928c9b0d-6f27-4359-8b86-794d73ea9cd5\" (UID: \"928c9b0d-6f27-4359-8b86-794d73ea9cd5\") " Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.715763 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xqmg\" (UniqueName: \"kubernetes.io/projected/4b2236d3-b352-4383-b099-0a1b39bdf222-kube-api-access-8xqmg\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:39 crc 
kubenswrapper[4918]: I0319 17:01:39.715787 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lm4wf\" (UniqueName: \"kubernetes.io/projected/8ceabd21-35af-489d-abc9-b4e8b629efd9-kube-api-access-lm4wf\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.716754 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d53b74e8-6505-42fc-bbb6-9f9f2b96f747-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d53b74e8-6505-42fc-bbb6-9f9f2b96f747" (UID: "d53b74e8-6505-42fc-bbb6-9f9f2b96f747"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.717946 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c57e4da-677f-401e-b718-b25a5678a352-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7c57e4da-677f-401e-b718-b25a5678a352" (UID: "7c57e4da-677f-401e-b718-b25a5678a352"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.719439 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/928c9b0d-6f27-4359-8b86-794d73ea9cd5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "928c9b0d-6f27-4359-8b86-794d73ea9cd5" (UID: "928c9b0d-6f27-4359-8b86-794d73ea9cd5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.721876 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c57e4da-677f-401e-b718-b25a5678a352-kube-api-access-l7pjs" (OuterVolumeSpecName: "kube-api-access-l7pjs") pod "7c57e4da-677f-401e-b718-b25a5678a352" (UID: "7c57e4da-677f-401e-b718-b25a5678a352"). 
InnerVolumeSpecName "kube-api-access-l7pjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.724537 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d53b74e8-6505-42fc-bbb6-9f9f2b96f747-kube-api-access-kmcvm" (OuterVolumeSpecName: "kube-api-access-kmcvm") pod "d53b74e8-6505-42fc-bbb6-9f9f2b96f747" (UID: "d53b74e8-6505-42fc-bbb6-9f9f2b96f747"). InnerVolumeSpecName "kube-api-access-kmcvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.724614 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.733798 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.741718 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/928c9b0d-6f27-4359-8b86-794d73ea9cd5-kube-api-access-nnmlv" (OuterVolumeSpecName: "kube-api-access-nnmlv") pod "928c9b0d-6f27-4359-8b86-794d73ea9cd5" (UID: "928c9b0d-6f27-4359-8b86-794d73ea9cd5"). InnerVolumeSpecName "kube-api-access-nnmlv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.743064 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 17:01:39 crc kubenswrapper[4918]: E0319 17:01:39.743629 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c57e4da-677f-401e-b718-b25a5678a352" containerName="mariadb-database-create" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.743648 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c57e4da-677f-401e-b718-b25a5678a352" containerName="mariadb-database-create" Mar 19 17:01:39 crc kubenswrapper[4918]: E0319 17:01:39.743685 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ceabd21-35af-489d-abc9-b4e8b629efd9" containerName="mariadb-database-create" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.743692 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ceabd21-35af-489d-abc9-b4e8b629efd9" containerName="mariadb-database-create" Mar 19 17:01:39 crc kubenswrapper[4918]: E0319 17:01:39.743705 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d659099f-7a05-4f1c-a097-67ecce42275d" containerName="glance-log" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.743711 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="d659099f-7a05-4f1c-a097-67ecce42275d" containerName="glance-log" Mar 19 17:01:39 crc kubenswrapper[4918]: E0319 17:01:39.743723 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d659099f-7a05-4f1c-a097-67ecce42275d" containerName="glance-httpd" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.743729 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="d659099f-7a05-4f1c-a097-67ecce42275d" containerName="glance-httpd" Mar 19 17:01:39 crc kubenswrapper[4918]: E0319 17:01:39.743750 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00939ae6-93f7-437d-904c-53eaf4c4fc52" 
containerName="mariadb-account-create-update" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.743756 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="00939ae6-93f7-437d-904c-53eaf4c4fc52" containerName="mariadb-account-create-update" Mar 19 17:01:39 crc kubenswrapper[4918]: E0319 17:01:39.743770 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b2236d3-b352-4383-b099-0a1b39bdf222" containerName="mariadb-database-create" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.743776 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b2236d3-b352-4383-b099-0a1b39bdf222" containerName="mariadb-database-create" Mar 19 17:01:39 crc kubenswrapper[4918]: E0319 17:01:39.743785 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="928c9b0d-6f27-4359-8b86-794d73ea9cd5" containerName="mariadb-account-create-update" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.743791 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="928c9b0d-6f27-4359-8b86-794d73ea9cd5" containerName="mariadb-account-create-update" Mar 19 17:01:39 crc kubenswrapper[4918]: E0319 17:01:39.743802 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d53b74e8-6505-42fc-bbb6-9f9f2b96f747" containerName="mariadb-account-create-update" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.743807 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="d53b74e8-6505-42fc-bbb6-9f9f2b96f747" containerName="mariadb-account-create-update" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.743993 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="d659099f-7a05-4f1c-a097-67ecce42275d" containerName="glance-log" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.744007 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="d659099f-7a05-4f1c-a097-67ecce42275d" containerName="glance-httpd" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.744020 4918 
memory_manager.go:354] "RemoveStaleState removing state" podUID="928c9b0d-6f27-4359-8b86-794d73ea9cd5" containerName="mariadb-account-create-update" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.744030 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b2236d3-b352-4383-b099-0a1b39bdf222" containerName="mariadb-database-create" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.744039 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c57e4da-677f-401e-b718-b25a5678a352" containerName="mariadb-database-create" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.744051 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ceabd21-35af-489d-abc9-b4e8b629efd9" containerName="mariadb-database-create" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.744062 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="00939ae6-93f7-437d-904c-53eaf4c4fc52" containerName="mariadb-account-create-update" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.744071 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="d53b74e8-6505-42fc-bbb6-9f9f2b96f747" containerName="mariadb-account-create-update" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.745252 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.747263 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.747514 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.763707 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.817858 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69dd59cf-aece-4caa-a2d1-85dfcc8c3306-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"69dd59cf-aece-4caa-a2d1-85dfcc8c3306\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.818108 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69dd59cf-aece-4caa-a2d1-85dfcc8c3306-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"69dd59cf-aece-4caa-a2d1-85dfcc8c3306\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.818170 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/69dd59cf-aece-4caa-a2d1-85dfcc8c3306-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"69dd59cf-aece-4caa-a2d1-85dfcc8c3306\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.818203 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/69dd59cf-aece-4caa-a2d1-85dfcc8c3306-config-data\") pod \"glance-default-internal-api-0\" (UID: \"69dd59cf-aece-4caa-a2d1-85dfcc8c3306\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.818337 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69dd59cf-aece-4caa-a2d1-85dfcc8c3306-logs\") pod \"glance-default-internal-api-0\" (UID: \"69dd59cf-aece-4caa-a2d1-85dfcc8c3306\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.818396 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb7n7\" (UniqueName: \"kubernetes.io/projected/69dd59cf-aece-4caa-a2d1-85dfcc8c3306-kube-api-access-jb7n7\") pod \"glance-default-internal-api-0\" (UID: \"69dd59cf-aece-4caa-a2d1-85dfcc8c3306\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.818431 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69dd59cf-aece-4caa-a2d1-85dfcc8c3306-scripts\") pod \"glance-default-internal-api-0\" (UID: \"69dd59cf-aece-4caa-a2d1-85dfcc8c3306\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.818593 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\") pod \"glance-default-internal-api-0\" (UID: \"69dd59cf-aece-4caa-a2d1-85dfcc8c3306\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.818714 4918 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-nnmlv\" (UniqueName: \"kubernetes.io/projected/928c9b0d-6f27-4359-8b86-794d73ea9cd5-kube-api-access-nnmlv\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.818734 4918 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d53b74e8-6505-42fc-bbb6-9f9f2b96f747-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.818745 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7pjs\" (UniqueName: \"kubernetes.io/projected/7c57e4da-677f-401e-b718-b25a5678a352-kube-api-access-l7pjs\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.818755 4918 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c57e4da-677f-401e-b718-b25a5678a352-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.818763 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmcvm\" (UniqueName: \"kubernetes.io/projected/d53b74e8-6505-42fc-bbb6-9f9f2b96f747-kube-api-access-kmcvm\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.818772 4918 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/928c9b0d-6f27-4359-8b86-794d73ea9cd5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.920051 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69dd59cf-aece-4caa-a2d1-85dfcc8c3306-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"69dd59cf-aece-4caa-a2d1-85dfcc8c3306\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:01:39 crc kubenswrapper[4918]: 
I0319 17:01:39.920146 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69dd59cf-aece-4caa-a2d1-85dfcc8c3306-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"69dd59cf-aece-4caa-a2d1-85dfcc8c3306\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.920181 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/69dd59cf-aece-4caa-a2d1-85dfcc8c3306-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"69dd59cf-aece-4caa-a2d1-85dfcc8c3306\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.920210 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69dd59cf-aece-4caa-a2d1-85dfcc8c3306-config-data\") pod \"glance-default-internal-api-0\" (UID: \"69dd59cf-aece-4caa-a2d1-85dfcc8c3306\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.920254 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69dd59cf-aece-4caa-a2d1-85dfcc8c3306-logs\") pod \"glance-default-internal-api-0\" (UID: \"69dd59cf-aece-4caa-a2d1-85dfcc8c3306\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.920297 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb7n7\" (UniqueName: \"kubernetes.io/projected/69dd59cf-aece-4caa-a2d1-85dfcc8c3306-kube-api-access-jb7n7\") pod \"glance-default-internal-api-0\" (UID: \"69dd59cf-aece-4caa-a2d1-85dfcc8c3306\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.920334 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69dd59cf-aece-4caa-a2d1-85dfcc8c3306-scripts\") pod \"glance-default-internal-api-0\" (UID: \"69dd59cf-aece-4caa-a2d1-85dfcc8c3306\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.920414 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\") pod \"glance-default-internal-api-0\" (UID: \"69dd59cf-aece-4caa-a2d1-85dfcc8c3306\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.921351 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69dd59cf-aece-4caa-a2d1-85dfcc8c3306-logs\") pod \"glance-default-internal-api-0\" (UID: \"69dd59cf-aece-4caa-a2d1-85dfcc8c3306\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.921430 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69dd59cf-aece-4caa-a2d1-85dfcc8c3306-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"69dd59cf-aece-4caa-a2d1-85dfcc8c3306\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.925730 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69dd59cf-aece-4caa-a2d1-85dfcc8c3306-config-data\") pod \"glance-default-internal-api-0\" (UID: \"69dd59cf-aece-4caa-a2d1-85dfcc8c3306\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.929794 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/69dd59cf-aece-4caa-a2d1-85dfcc8c3306-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"69dd59cf-aece-4caa-a2d1-85dfcc8c3306\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.931121 4918 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.931150 4918 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\") pod \"glance-default-internal-api-0\" (UID: \"69dd59cf-aece-4caa-a2d1-85dfcc8c3306\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ea238a48943c00d3b8fe8315d317d6aa508a60b77f6685b492c061941b28c63f/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.938009 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb7n7\" (UniqueName: \"kubernetes.io/projected/69dd59cf-aece-4caa-a2d1-85dfcc8c3306-kube-api-access-jb7n7\") pod \"glance-default-internal-api-0\" (UID: \"69dd59cf-aece-4caa-a2d1-85dfcc8c3306\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.939826 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69dd59cf-aece-4caa-a2d1-85dfcc8c3306-scripts\") pod \"glance-default-internal-api-0\" (UID: \"69dd59cf-aece-4caa-a2d1-85dfcc8c3306\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.944800 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/69dd59cf-aece-4caa-a2d1-85dfcc8c3306-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"69dd59cf-aece-4caa-a2d1-85dfcc8c3306\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:01:39 crc kubenswrapper[4918]: I0319 17:01:39.991877 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ead6e092-0eed-4491-8a96-ecfdc1ad9a37\") pod \"glance-default-internal-api-0\" (UID: \"69dd59cf-aece-4caa-a2d1-85dfcc8c3306\") " pod="openstack/glance-default-internal-api-0" Mar 19 17:01:40 crc kubenswrapper[4918]: I0319 17:01:40.072615 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 17:01:40 crc kubenswrapper[4918]: I0319 17:01:40.626226 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d659099f-7a05-4f1c-a097-67ecce42275d" path="/var/lib/kubelet/pods/d659099f-7a05-4f1c-a097-67ecce42275d/volumes" Mar 19 17:01:40 crc kubenswrapper[4918]: I0319 17:01:40.662867 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7472445-a896-40fb-a6d8-0d893db4fa45","Type":"ContainerStarted","Data":"4ee8218935f2144e465e82b29ec12c0224b2b3eeab6dee074dba41c0ac1416fc"} Mar 19 17:01:40 crc kubenswrapper[4918]: I0319 17:01:40.670434 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1eef-account-create-update-mz92v" event={"ID":"d53b74e8-6505-42fc-bbb6-9f9f2b96f747","Type":"ContainerDied","Data":"f1d9694828c08fb860a80a9c9ec173ecca48897ac70a8741c5e1039230231c1a"} Mar 19 17:01:40 crc kubenswrapper[4918]: I0319 17:01:40.670485 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1d9694828c08fb860a80a9c9ec173ecca48897ac70a8741c5e1039230231c1a" Mar 19 17:01:40 crc kubenswrapper[4918]: I0319 17:01:40.673795 4918 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-s8jqb" event={"ID":"7c57e4da-677f-401e-b718-b25a5678a352","Type":"ContainerDied","Data":"9a82cb9abd4fa9e92605645cd1069ad2b856b07b9ca0fc243fc5d298a2900a8d"} Mar 19 17:01:40 crc kubenswrapper[4918]: I0319 17:01:40.673825 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a82cb9abd4fa9e92605645cd1069ad2b856b07b9ca0fc243fc5d298a2900a8d" Mar 19 17:01:40 crc kubenswrapper[4918]: I0319 17:01:40.673907 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-s8jqb" Mar 19 17:01:40 crc kubenswrapper[4918]: I0319 17:01:40.670977 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1eef-account-create-update-mz92v" Mar 19 17:01:40 crc kubenswrapper[4918]: I0319 17:01:40.679171 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-10c1-account-create-update-qpxpb" event={"ID":"928c9b0d-6f27-4359-8b86-794d73ea9cd5","Type":"ContainerDied","Data":"73ea54e1bd56e00aa95b455f95c39936adb3e565945232ff81e301c52085ab49"} Mar 19 17:01:40 crc kubenswrapper[4918]: I0319 17:01:40.679225 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73ea54e1bd56e00aa95b455f95c39936adb3e565945232ff81e301c52085ab49" Mar 19 17:01:40 crc kubenswrapper[4918]: I0319 17:01:40.679315 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-10c1-account-create-update-qpxpb" Mar 19 17:01:40 crc kubenswrapper[4918]: I0319 17:01:40.690043 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fd884183-4c9b-4bc9-94f9-7fc63ddfd344","Type":"ContainerStarted","Data":"b90208d46a2303bcbfc41a59df066d1bdfa12b65a9ac373cc753b523259ecf8b"} Mar 19 17:01:40 crc kubenswrapper[4918]: I0319 17:01:40.696240 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"de4932f1-1f41-4512-9dd6-408a095de14a","Type":"ContainerStarted","Data":"9f449b92d2d766df490fe3bc6a21cb677a3c98457b186c58e6fea5fd9680835d"} Mar 19 17:01:40 crc kubenswrapper[4918]: I0319 17:01:40.697558 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 19 17:01:40 crc kubenswrapper[4918]: I0319 17:01:40.704756 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-6z7jz" Mar 19 17:01:40 crc kubenswrapper[4918]: I0319 17:01:40.704793 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6z7jz" event={"ID":"4b2236d3-b352-4383-b099-0a1b39bdf222","Type":"ContainerDied","Data":"3dda72ae2c0c3b8c316af3c49fbea4b0a45c0e2d4059a59b0a0cdcbf7742ee75"} Mar 19 17:01:40 crc kubenswrapper[4918]: I0319 17:01:40.704822 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3dda72ae2c0c3b8c316af3c49fbea4b0a45c0e2d4059a59b0a0cdcbf7742ee75" Mar 19 17:01:40 crc kubenswrapper[4918]: W0319 17:01:40.706646 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69dd59cf_aece_4caa_a2d1_85dfcc8c3306.slice/crio-030df2e9cb3ae9427cc0d273e8ce28b545d43d2c885c176654089a906e59a53b WatchSource:0}: Error finding container 030df2e9cb3ae9427cc0d273e8ce28b545d43d2c885c176654089a906e59a53b: 
Status 404 returned error can't find the container with id 030df2e9cb3ae9427cc0d273e8ce28b545d43d2c885c176654089a906e59a53b Mar 19 17:01:40 crc kubenswrapper[4918]: I0319 17:01:40.739179 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 17:01:40 crc kubenswrapper[4918]: I0319 17:01:40.753494 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.753470352 podStartE2EDuration="5.753470352s" podCreationTimestamp="2026-03-19 17:01:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:01:40.722936596 +0000 UTC m=+1312.845135844" watchObservedRunningTime="2026-03-19 17:01:40.753470352 +0000 UTC m=+1312.875669600" Mar 19 17:01:40 crc kubenswrapper[4918]: I0319 17:01:40.776823 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.7768070510000005 podStartE2EDuration="5.776807051s" podCreationTimestamp="2026-03-19 17:01:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:01:40.746693897 +0000 UTC m=+1312.868893145" watchObservedRunningTime="2026-03-19 17:01:40.776807051 +0000 UTC m=+1312.899006299" Mar 19 17:01:41 crc kubenswrapper[4918]: I0319 17:01:41.726607 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"69dd59cf-aece-4caa-a2d1-85dfcc8c3306","Type":"ContainerStarted","Data":"a2fe84067d2d9f89429cfbaccd24b140935b90e1bae510723753e02b4d3710af"} Mar 19 17:01:41 crc kubenswrapper[4918]: I0319 17:01:41.726994 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"69dd59cf-aece-4caa-a2d1-85dfcc8c3306","Type":"ContainerStarted","Data":"030df2e9cb3ae9427cc0d273e8ce28b545d43d2c885c176654089a906e59a53b"} Mar 19 17:01:42 crc kubenswrapper[4918]: I0319 17:01:42.738842 4918 generic.go:334] "Generic (PLEG): container finished" podID="b7472445-a896-40fb-a6d8-0d893db4fa45" containerID="e906fb98ca9407a5968fe8dc8c8985c046afe8b86844f1744c2e2ed28051c005" exitCode=1 Mar 19 17:01:42 crc kubenswrapper[4918]: I0319 17:01:42.738914 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7472445-a896-40fb-a6d8-0d893db4fa45","Type":"ContainerDied","Data":"e906fb98ca9407a5968fe8dc8c8985c046afe8b86844f1744c2e2ed28051c005"} Mar 19 17:01:42 crc kubenswrapper[4918]: I0319 17:01:42.738966 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b7472445-a896-40fb-a6d8-0d893db4fa45" containerName="sg-core" containerID="cri-o://4ee8218935f2144e465e82b29ec12c0224b2b3eeab6dee074dba41c0ac1416fc" gracePeriod=30 Mar 19 17:01:42 crc kubenswrapper[4918]: I0319 17:01:42.738961 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b7472445-a896-40fb-a6d8-0d893db4fa45" containerName="ceilometer-central-agent" containerID="cri-o://a10773354733d273395fc50b877f5033db7d1836ca7e94d73f60b043be2330ce" gracePeriod=30 Mar 19 17:01:42 crc kubenswrapper[4918]: I0319 17:01:42.738994 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b7472445-a896-40fb-a6d8-0d893db4fa45" containerName="ceilometer-notification-agent" containerID="cri-o://33c485d46e50ab0bce61ef730095f2ec08b2372c31a845190fd32711819d7d36" gracePeriod=30 Mar 19 17:01:42 crc kubenswrapper[4918]: I0319 17:01:42.741884 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"69dd59cf-aece-4caa-a2d1-85dfcc8c3306","Type":"ContainerStarted","Data":"5b6647e634a8cec26c5a75266769ac4eb99d101f28f314d1fa794286d20db42a"} Mar 19 17:01:42 crc kubenswrapper[4918]: I0319 17:01:42.788692 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.788667281 podStartE2EDuration="3.788667281s" podCreationTimestamp="2026-03-19 17:01:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:01:42.779355056 +0000 UTC m=+1314.901554304" watchObservedRunningTime="2026-03-19 17:01:42.788667281 +0000 UTC m=+1314.910866529" Mar 19 17:01:43 crc kubenswrapper[4918]: I0319 17:01:43.491444 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6x2b9"] Mar 19 17:01:43 crc kubenswrapper[4918]: I0319 17:01:43.493227 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6x2b9" Mar 19 17:01:43 crc kubenswrapper[4918]: I0319 17:01:43.495198 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-f5rj7" Mar 19 17:01:43 crc kubenswrapper[4918]: I0319 17:01:43.563089 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 19 17:01:43 crc kubenswrapper[4918]: I0319 17:01:43.563775 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 19 17:01:43 crc kubenswrapper[4918]: I0319 17:01:43.578449 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6x2b9"] Mar 19 17:01:43 crc kubenswrapper[4918]: I0319 17:01:43.606719 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64832abb-ee37-4a90-9fae-8eff52ff08e2-config-data\") pod \"nova-cell0-conductor-db-sync-6x2b9\" (UID: \"64832abb-ee37-4a90-9fae-8eff52ff08e2\") " pod="openstack/nova-cell0-conductor-db-sync-6x2b9" Mar 19 17:01:43 crc kubenswrapper[4918]: I0319 17:01:43.606819 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9k5w\" (UniqueName: \"kubernetes.io/projected/64832abb-ee37-4a90-9fae-8eff52ff08e2-kube-api-access-j9k5w\") pod \"nova-cell0-conductor-db-sync-6x2b9\" (UID: \"64832abb-ee37-4a90-9fae-8eff52ff08e2\") " pod="openstack/nova-cell0-conductor-db-sync-6x2b9" Mar 19 17:01:43 crc kubenswrapper[4918]: I0319 17:01:43.606898 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64832abb-ee37-4a90-9fae-8eff52ff08e2-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6x2b9\" (UID: \"64832abb-ee37-4a90-9fae-8eff52ff08e2\") " 
pod="openstack/nova-cell0-conductor-db-sync-6x2b9" Mar 19 17:01:43 crc kubenswrapper[4918]: I0319 17:01:43.607011 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64832abb-ee37-4a90-9fae-8eff52ff08e2-scripts\") pod \"nova-cell0-conductor-db-sync-6x2b9\" (UID: \"64832abb-ee37-4a90-9fae-8eff52ff08e2\") " pod="openstack/nova-cell0-conductor-db-sync-6x2b9" Mar 19 17:01:43 crc kubenswrapper[4918]: I0319 17:01:43.709196 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64832abb-ee37-4a90-9fae-8eff52ff08e2-scripts\") pod \"nova-cell0-conductor-db-sync-6x2b9\" (UID: \"64832abb-ee37-4a90-9fae-8eff52ff08e2\") " pod="openstack/nova-cell0-conductor-db-sync-6x2b9" Mar 19 17:01:43 crc kubenswrapper[4918]: I0319 17:01:43.709640 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64832abb-ee37-4a90-9fae-8eff52ff08e2-config-data\") pod \"nova-cell0-conductor-db-sync-6x2b9\" (UID: \"64832abb-ee37-4a90-9fae-8eff52ff08e2\") " pod="openstack/nova-cell0-conductor-db-sync-6x2b9" Mar 19 17:01:43 crc kubenswrapper[4918]: I0319 17:01:43.709690 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9k5w\" (UniqueName: \"kubernetes.io/projected/64832abb-ee37-4a90-9fae-8eff52ff08e2-kube-api-access-j9k5w\") pod \"nova-cell0-conductor-db-sync-6x2b9\" (UID: \"64832abb-ee37-4a90-9fae-8eff52ff08e2\") " pod="openstack/nova-cell0-conductor-db-sync-6x2b9" Mar 19 17:01:43 crc kubenswrapper[4918]: I0319 17:01:43.709799 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64832abb-ee37-4a90-9fae-8eff52ff08e2-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6x2b9\" (UID: \"64832abb-ee37-4a90-9fae-8eff52ff08e2\") " 
pod="openstack/nova-cell0-conductor-db-sync-6x2b9" Mar 19 17:01:43 crc kubenswrapper[4918]: I0319 17:01:43.718388 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64832abb-ee37-4a90-9fae-8eff52ff08e2-scripts\") pod \"nova-cell0-conductor-db-sync-6x2b9\" (UID: \"64832abb-ee37-4a90-9fae-8eff52ff08e2\") " pod="openstack/nova-cell0-conductor-db-sync-6x2b9" Mar 19 17:01:43 crc kubenswrapper[4918]: I0319 17:01:43.719099 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64832abb-ee37-4a90-9fae-8eff52ff08e2-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6x2b9\" (UID: \"64832abb-ee37-4a90-9fae-8eff52ff08e2\") " pod="openstack/nova-cell0-conductor-db-sync-6x2b9" Mar 19 17:01:43 crc kubenswrapper[4918]: I0319 17:01:43.719282 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64832abb-ee37-4a90-9fae-8eff52ff08e2-config-data\") pod \"nova-cell0-conductor-db-sync-6x2b9\" (UID: \"64832abb-ee37-4a90-9fae-8eff52ff08e2\") " pod="openstack/nova-cell0-conductor-db-sync-6x2b9" Mar 19 17:01:43 crc kubenswrapper[4918]: I0319 17:01:43.728137 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9k5w\" (UniqueName: \"kubernetes.io/projected/64832abb-ee37-4a90-9fae-8eff52ff08e2-kube-api-access-j9k5w\") pod \"nova-cell0-conductor-db-sync-6x2b9\" (UID: \"64832abb-ee37-4a90-9fae-8eff52ff08e2\") " pod="openstack/nova-cell0-conductor-db-sync-6x2b9" Mar 19 17:01:43 crc kubenswrapper[4918]: I0319 17:01:43.754120 4918 generic.go:334] "Generic (PLEG): container finished" podID="b7472445-a896-40fb-a6d8-0d893db4fa45" containerID="4ee8218935f2144e465e82b29ec12c0224b2b3eeab6dee074dba41c0ac1416fc" exitCode=2 Mar 19 17:01:43 crc kubenswrapper[4918]: I0319 17:01:43.755083 4918 generic.go:334] "Generic (PLEG): container finished" 
podID="b7472445-a896-40fb-a6d8-0d893db4fa45" containerID="33c485d46e50ab0bce61ef730095f2ec08b2372c31a845190fd32711819d7d36" exitCode=0 Mar 19 17:01:43 crc kubenswrapper[4918]: I0319 17:01:43.756120 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7472445-a896-40fb-a6d8-0d893db4fa45","Type":"ContainerDied","Data":"4ee8218935f2144e465e82b29ec12c0224b2b3eeab6dee074dba41c0ac1416fc"} Mar 19 17:01:43 crc kubenswrapper[4918]: I0319 17:01:43.756244 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7472445-a896-40fb-a6d8-0d893db4fa45","Type":"ContainerDied","Data":"33c485d46e50ab0bce61ef730095f2ec08b2372c31a845190fd32711819d7d36"} Mar 19 17:01:43 crc kubenswrapper[4918]: I0319 17:01:43.883695 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6x2b9" Mar 19 17:01:44 crc kubenswrapper[4918]: I0319 17:01:44.439358 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6x2b9"] Mar 19 17:01:44 crc kubenswrapper[4918]: I0319 17:01:44.769832 4918 generic.go:334] "Generic (PLEG): container finished" podID="b7472445-a896-40fb-a6d8-0d893db4fa45" containerID="a10773354733d273395fc50b877f5033db7d1836ca7e94d73f60b043be2330ce" exitCode=0 Mar 19 17:01:44 crc kubenswrapper[4918]: I0319 17:01:44.769928 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7472445-a896-40fb-a6d8-0d893db4fa45","Type":"ContainerDied","Data":"a10773354733d273395fc50b877f5033db7d1836ca7e94d73f60b043be2330ce"} Mar 19 17:01:44 crc kubenswrapper[4918]: I0319 17:01:44.771583 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6x2b9" event={"ID":"64832abb-ee37-4a90-9fae-8eff52ff08e2","Type":"ContainerStarted","Data":"d01c9f644157834930ea246f0e710dbb61720ac1b8d3832acc33c96ba8f7b7c9"} Mar 19 17:01:45 crc 
kubenswrapper[4918]: I0319 17:01:45.491197 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.514023 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-76f5474f44-brjsr" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.518389 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-76f5474f44-brjsr" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.553121 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7472445-a896-40fb-a6d8-0d893db4fa45-log-httpd\") pod \"b7472445-a896-40fb-a6d8-0d893db4fa45\" (UID: \"b7472445-a896-40fb-a6d8-0d893db4fa45\") " Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.553558 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7472445-a896-40fb-a6d8-0d893db4fa45-sg-core-conf-yaml\") pod \"b7472445-a896-40fb-a6d8-0d893db4fa45\" (UID: \"b7472445-a896-40fb-a6d8-0d893db4fa45\") " Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.553754 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7472445-a896-40fb-a6d8-0d893db4fa45-run-httpd\") pod \"b7472445-a896-40fb-a6d8-0d893db4fa45\" (UID: \"b7472445-a896-40fb-a6d8-0d893db4fa45\") " Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.553883 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7472445-a896-40fb-a6d8-0d893db4fa45-scripts\") pod \"b7472445-a896-40fb-a6d8-0d893db4fa45\" (UID: \"b7472445-a896-40fb-a6d8-0d893db4fa45\") " Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.553935 4918 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-wm5w5\" (UniqueName: \"kubernetes.io/projected/b7472445-a896-40fb-a6d8-0d893db4fa45-kube-api-access-wm5w5\") pod \"b7472445-a896-40fb-a6d8-0d893db4fa45\" (UID: \"b7472445-a896-40fb-a6d8-0d893db4fa45\") " Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.553998 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7472445-a896-40fb-a6d8-0d893db4fa45-config-data\") pod \"b7472445-a896-40fb-a6d8-0d893db4fa45\" (UID: \"b7472445-a896-40fb-a6d8-0d893db4fa45\") " Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.554067 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7472445-a896-40fb-a6d8-0d893db4fa45-combined-ca-bundle\") pod \"b7472445-a896-40fb-a6d8-0d893db4fa45\" (UID: \"b7472445-a896-40fb-a6d8-0d893db4fa45\") " Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.559607 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7472445-a896-40fb-a6d8-0d893db4fa45-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b7472445-a896-40fb-a6d8-0d893db4fa45" (UID: "b7472445-a896-40fb-a6d8-0d893db4fa45"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.562282 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7472445-a896-40fb-a6d8-0d893db4fa45-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b7472445-a896-40fb-a6d8-0d893db4fa45" (UID: "b7472445-a896-40fb-a6d8-0d893db4fa45"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.570197 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7472445-a896-40fb-a6d8-0d893db4fa45-kube-api-access-wm5w5" (OuterVolumeSpecName: "kube-api-access-wm5w5") pod "b7472445-a896-40fb-a6d8-0d893db4fa45" (UID: "b7472445-a896-40fb-a6d8-0d893db4fa45"). InnerVolumeSpecName "kube-api-access-wm5w5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.570643 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7472445-a896-40fb-a6d8-0d893db4fa45-scripts" (OuterVolumeSpecName: "scripts") pod "b7472445-a896-40fb-a6d8-0d893db4fa45" (UID: "b7472445-a896-40fb-a6d8-0d893db4fa45"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.618709 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-76c9778f96-f8hwv"] Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.618979 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-76c9778f96-f8hwv" podUID="82332b68-a377-45a5-bf3c-d97caa5733ff" containerName="placement-log" containerID="cri-o://56cba120420d1dd05798e0a6c49de015a2196a387a7ea4e9e8f01e55c620f1be" gracePeriod=30 Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.619714 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-76c9778f96-f8hwv" podUID="82332b68-a377-45a5-bf3c-d97caa5733ff" containerName="placement-api" containerID="cri-o://6ee4fd45ec8ddd5c01c993937a6eafe96329d67e1e4ee246a13c2def520983be" gracePeriod=30 Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.643905 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/b7472445-a896-40fb-a6d8-0d893db4fa45-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b7472445-a896-40fb-a6d8-0d893db4fa45" (UID: "b7472445-a896-40fb-a6d8-0d893db4fa45"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.657247 4918 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7472445-a896-40fb-a6d8-0d893db4fa45-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.657471 4918 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7472445-a896-40fb-a6d8-0d893db4fa45-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.657579 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7472445-a896-40fb-a6d8-0d893db4fa45-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.657641 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm5w5\" (UniqueName: \"kubernetes.io/projected/b7472445-a896-40fb-a6d8-0d893db4fa45-kube-api-access-wm5w5\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.658017 4918 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7472445-a896-40fb-a6d8-0d893db4fa45-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.734288 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7472445-a896-40fb-a6d8-0d893db4fa45-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7472445-a896-40fb-a6d8-0d893db4fa45" (UID: "b7472445-a896-40fb-a6d8-0d893db4fa45"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.765258 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7472445-a896-40fb-a6d8-0d893db4fa45-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.778327 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7472445-a896-40fb-a6d8-0d893db4fa45-config-data" (OuterVolumeSpecName: "config-data") pod "b7472445-a896-40fb-a6d8-0d893db4fa45" (UID: "b7472445-a896-40fb-a6d8-0d893db4fa45"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.798205 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7472445-a896-40fb-a6d8-0d893db4fa45","Type":"ContainerDied","Data":"65d7b2e63aac08e542f5c7ade9142bc95a93792f6e0d59407d8c7b7c487a2a8b"} Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.798262 4918 scope.go:117] "RemoveContainer" containerID="e906fb98ca9407a5968fe8dc8c8985c046afe8b86844f1744c2e2ed28051c005" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.798395 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.820549 4918 generic.go:334] "Generic (PLEG): container finished" podID="82332b68-a377-45a5-bf3c-d97caa5733ff" containerID="56cba120420d1dd05798e0a6c49de015a2196a387a7ea4e9e8f01e55c620f1be" exitCode=143 Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.821462 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76c9778f96-f8hwv" event={"ID":"82332b68-a377-45a5-bf3c-d97caa5733ff","Type":"ContainerDied","Data":"56cba120420d1dd05798e0a6c49de015a2196a387a7ea4e9e8f01e55c620f1be"} Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.868918 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7472445-a896-40fb-a6d8-0d893db4fa45-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.908625 4918 scope.go:117] "RemoveContainer" containerID="4ee8218935f2144e465e82b29ec12c0224b2b3eeab6dee074dba41c0ac1416fc" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.918138 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.935717 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.950030 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:01:45 crc kubenswrapper[4918]: E0319 17:01:45.950574 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7472445-a896-40fb-a6d8-0d893db4fa45" containerName="sg-core" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.950597 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7472445-a896-40fb-a6d8-0d893db4fa45" containerName="sg-core" Mar 19 17:01:45 crc kubenswrapper[4918]: E0319 17:01:45.950627 4918 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="b7472445-a896-40fb-a6d8-0d893db4fa45" containerName="proxy-httpd" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.950635 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7472445-a896-40fb-a6d8-0d893db4fa45" containerName="proxy-httpd" Mar 19 17:01:45 crc kubenswrapper[4918]: E0319 17:01:45.950649 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7472445-a896-40fb-a6d8-0d893db4fa45" containerName="ceilometer-central-agent" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.950658 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7472445-a896-40fb-a6d8-0d893db4fa45" containerName="ceilometer-central-agent" Mar 19 17:01:45 crc kubenswrapper[4918]: E0319 17:01:45.950682 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7472445-a896-40fb-a6d8-0d893db4fa45" containerName="ceilometer-notification-agent" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.950691 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7472445-a896-40fb-a6d8-0d893db4fa45" containerName="ceilometer-notification-agent" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.950904 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7472445-a896-40fb-a6d8-0d893db4fa45" containerName="ceilometer-notification-agent" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.950925 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7472445-a896-40fb-a6d8-0d893db4fa45" containerName="proxy-httpd" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.950945 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7472445-a896-40fb-a6d8-0d893db4fa45" containerName="ceilometer-central-agent" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.950960 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7472445-a896-40fb-a6d8-0d893db4fa45" containerName="sg-core" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.955074 
4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.961083 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.961356 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.961498 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.968234 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:01:45 crc kubenswrapper[4918]: I0319 17:01:45.990661 4918 scope.go:117] "RemoveContainer" containerID="33c485d46e50ab0bce61ef730095f2ec08b2372c31a845190fd32711819d7d36" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.052243 4918 scope.go:117] "RemoveContainer" containerID="a10773354733d273395fc50b877f5033db7d1836ca7e94d73f60b043be2330ce" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.073138 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " pod="openstack/ceilometer-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.073200 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/316eab80-100b-43b8-aecf-c5fc6868b742-log-httpd\") pod \"ceilometer-0\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " pod="openstack/ceilometer-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.073251 4918 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-scripts\") pod \"ceilometer-0\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " pod="openstack/ceilometer-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.073348 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql84s\" (UniqueName: \"kubernetes.io/projected/316eab80-100b-43b8-aecf-c5fc6868b742-kube-api-access-ql84s\") pod \"ceilometer-0\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " pod="openstack/ceilometer-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.073380 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " pod="openstack/ceilometer-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.073449 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-config-data\") pod \"ceilometer-0\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " pod="openstack/ceilometer-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.073538 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/316eab80-100b-43b8-aecf-c5fc6868b742-run-httpd\") pod \"ceilometer-0\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " pod="openstack/ceilometer-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.073604 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " pod="openstack/ceilometer-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.175453 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-config-data\") pod \"ceilometer-0\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " pod="openstack/ceilometer-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.175894 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/316eab80-100b-43b8-aecf-c5fc6868b742-run-httpd\") pod \"ceilometer-0\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " pod="openstack/ceilometer-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.176417 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/316eab80-100b-43b8-aecf-c5fc6868b742-run-httpd\") pod \"ceilometer-0\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " pod="openstack/ceilometer-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.176840 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " pod="openstack/ceilometer-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.176914 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " pod="openstack/ceilometer-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 
17:01:46.176953 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/316eab80-100b-43b8-aecf-c5fc6868b742-log-httpd\") pod \"ceilometer-0\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " pod="openstack/ceilometer-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.177023 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-scripts\") pod \"ceilometer-0\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " pod="openstack/ceilometer-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.177184 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql84s\" (UniqueName: \"kubernetes.io/projected/316eab80-100b-43b8-aecf-c5fc6868b742-kube-api-access-ql84s\") pod \"ceilometer-0\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " pod="openstack/ceilometer-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.177218 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " pod="openstack/ceilometer-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.182142 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/316eab80-100b-43b8-aecf-c5fc6868b742-log-httpd\") pod \"ceilometer-0\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " pod="openstack/ceilometer-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.184425 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"316eab80-100b-43b8-aecf-c5fc6868b742\") " pod="openstack/ceilometer-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.185456 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-scripts\") pod \"ceilometer-0\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " pod="openstack/ceilometer-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.186006 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " pod="openstack/ceilometer-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.189647 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-config-data\") pod \"ceilometer-0\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " pod="openstack/ceilometer-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.201206 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql84s\" (UniqueName: \"kubernetes.io/projected/316eab80-100b-43b8-aecf-c5fc6868b742-kube-api-access-ql84s\") pod \"ceilometer-0\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " pod="openstack/ceilometer-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.233985 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " pod="openstack/ceilometer-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.301761 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.324767 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.324814 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.366696 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.376430 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.609364 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7472445-a896-40fb-a6d8-0d893db4fa45" path="/var/lib/kubelet/pods/b7472445-a896-40fb-a6d8-0d893db4fa45/volumes" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.823411 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.857371 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 19 17:01:46 crc kubenswrapper[4918]: I0319 17:01:46.857488 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 19 17:01:47 crc kubenswrapper[4918]: I0319 17:01:47.015959 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 19 17:01:47 crc kubenswrapper[4918]: I0319 17:01:47.882407 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"316eab80-100b-43b8-aecf-c5fc6868b742","Type":"ContainerStarted","Data":"14148b63633160aede4e5176ad219183348b0c523dc4597c21583f1903e3d18b"} Mar 19 17:01:48 crc kubenswrapper[4918]: I0319 17:01:48.907056 4918 generic.go:334] "Generic (PLEG): container finished" podID="82332b68-a377-45a5-bf3c-d97caa5733ff" containerID="6ee4fd45ec8ddd5c01c993937a6eafe96329d67e1e4ee246a13c2def520983be" exitCode=0 Mar 19 17:01:48 crc kubenswrapper[4918]: I0319 17:01:48.907234 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76c9778f96-f8hwv" event={"ID":"82332b68-a377-45a5-bf3c-d97caa5733ff","Type":"ContainerDied","Data":"6ee4fd45ec8ddd5c01c993937a6eafe96329d67e1e4ee246a13c2def520983be"} Mar 19 17:01:48 crc kubenswrapper[4918]: I0319 17:01:48.916487 4918 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 17:01:48 crc kubenswrapper[4918]: I0319 17:01:48.916803 4918 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 17:01:48 crc kubenswrapper[4918]: I0319 17:01:48.916620 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"316eab80-100b-43b8-aecf-c5fc6868b742","Type":"ContainerStarted","Data":"a0f2ba32ac1f2399604e11d66703f7ecf03f3e14f6668e70aefecef81caf7386"} Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.332052 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.461882 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bg7sr\" (UniqueName: \"kubernetes.io/projected/82332b68-a377-45a5-bf3c-d97caa5733ff-kube-api-access-bg7sr\") pod \"82332b68-a377-45a5-bf3c-d97caa5733ff\" (UID: \"82332b68-a377-45a5-bf3c-d97caa5733ff\") " Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.462308 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-scripts\") pod \"82332b68-a377-45a5-bf3c-d97caa5733ff\" (UID: \"82332b68-a377-45a5-bf3c-d97caa5733ff\") " Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.462388 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-combined-ca-bundle\") pod \"82332b68-a377-45a5-bf3c-d97caa5733ff\" (UID: \"82332b68-a377-45a5-bf3c-d97caa5733ff\") " Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.462463 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-config-data\") pod \"82332b68-a377-45a5-bf3c-d97caa5733ff\" (UID: \"82332b68-a377-45a5-bf3c-d97caa5733ff\") " Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.462569 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82332b68-a377-45a5-bf3c-d97caa5733ff-logs\") pod \"82332b68-a377-45a5-bf3c-d97caa5733ff\" (UID: \"82332b68-a377-45a5-bf3c-d97caa5733ff\") " Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.462683 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-public-tls-certs\") pod \"82332b68-a377-45a5-bf3c-d97caa5733ff\" (UID: \"82332b68-a377-45a5-bf3c-d97caa5733ff\") " Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.462909 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-internal-tls-certs\") pod \"82332b68-a377-45a5-bf3c-d97caa5733ff\" (UID: \"82332b68-a377-45a5-bf3c-d97caa5733ff\") " Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.464912 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82332b68-a377-45a5-bf3c-d97caa5733ff-logs" (OuterVolumeSpecName: "logs") pod "82332b68-a377-45a5-bf3c-d97caa5733ff" (UID: "82332b68-a377-45a5-bf3c-d97caa5733ff"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.472763 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82332b68-a377-45a5-bf3c-d97caa5733ff-kube-api-access-bg7sr" (OuterVolumeSpecName: "kube-api-access-bg7sr") pod "82332b68-a377-45a5-bf3c-d97caa5733ff" (UID: "82332b68-a377-45a5-bf3c-d97caa5733ff"). InnerVolumeSpecName "kube-api-access-bg7sr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.472815 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-scripts" (OuterVolumeSpecName: "scripts") pod "82332b68-a377-45a5-bf3c-d97caa5733ff" (UID: "82332b68-a377-45a5-bf3c-d97caa5733ff"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.535002 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.568303 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.568328 4918 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82332b68-a377-45a5-bf3c-d97caa5733ff-logs\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.568337 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bg7sr\" (UniqueName: \"kubernetes.io/projected/82332b68-a377-45a5-bf3c-d97caa5733ff-kube-api-access-bg7sr\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.614013 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82332b68-a377-45a5-bf3c-d97caa5733ff" (UID: "82332b68-a377-45a5-bf3c-d97caa5733ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.615621 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-config-data" (OuterVolumeSpecName: "config-data") pod "82332b68-a377-45a5-bf3c-d97caa5733ff" (UID: "82332b68-a377-45a5-bf3c-d97caa5733ff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.617500 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "82332b68-a377-45a5-bf3c-d97caa5733ff" (UID: "82332b68-a377-45a5-bf3c-d97caa5733ff"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.677328 4918 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.677361 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.677371 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.700684 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "82332b68-a377-45a5-bf3c-d97caa5733ff" (UID: "82332b68-a377-45a5-bf3c-d97caa5733ff"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.710713 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.724300 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.780051 4918 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82332b68-a377-45a5-bf3c-d97caa5733ff-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.881136 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.930154 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76c9778f96-f8hwv" event={"ID":"82332b68-a377-45a5-bf3c-d97caa5733ff","Type":"ContainerDied","Data":"87d3b97f565b8e1cacb1df2caf07570d12d00ecd367d658e7891fbb510b1eae3"} Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.930184 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-76c9778f96-f8hwv" Mar 19 17:01:49 crc kubenswrapper[4918]: I0319 17:01:49.930215 4918 scope.go:117] "RemoveContainer" containerID="6ee4fd45ec8ddd5c01c993937a6eafe96329d67e1e4ee246a13c2def520983be" Mar 19 17:01:50 crc kubenswrapper[4918]: I0319 17:01:50.010828 4918 scope.go:117] "RemoveContainer" containerID="56cba120420d1dd05798e0a6c49de015a2196a387a7ea4e9e8f01e55c620f1be" Mar 19 17:01:50 crc kubenswrapper[4918]: I0319 17:01:50.030734 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-76c9778f96-f8hwv"] Mar 19 17:01:50 crc kubenswrapper[4918]: I0319 17:01:50.055090 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-76c9778f96-f8hwv"] Mar 19 17:01:50 crc kubenswrapper[4918]: I0319 17:01:50.074680 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 19 17:01:50 crc kubenswrapper[4918]: I0319 17:01:50.074733 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 19 17:01:50 crc kubenswrapper[4918]: I0319 17:01:50.113498 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 19 17:01:50 crc kubenswrapper[4918]: I0319 17:01:50.140539 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 19 17:01:50 crc kubenswrapper[4918]: I0319 17:01:50.601346 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82332b68-a377-45a5-bf3c-d97caa5733ff" path="/var/lib/kubelet/pods/82332b68-a377-45a5-bf3c-d97caa5733ff/volumes" Mar 19 17:01:50 crc kubenswrapper[4918]: I0319 17:01:50.948073 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"316eab80-100b-43b8-aecf-c5fc6868b742","Type":"ContainerStarted","Data":"68edc6b3240769361e8b259ad91c77e3489eef18006b29dc3643a6bd7b00391d"} Mar 19 17:01:50 crc kubenswrapper[4918]: I0319 17:01:50.948436 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 19 17:01:50 crc kubenswrapper[4918]: I0319 17:01:50.948454 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 19 17:01:51 crc kubenswrapper[4918]: I0319 17:01:51.040816 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cloudkitty-api-0" podUID="b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f" containerName="cloudkitty-api" probeResult="failure" output="Get \"https://10.217.0.195:8889/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 17:01:51 crc kubenswrapper[4918]: I0319 17:01:51.041261 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-api-0" podUID="b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f" containerName="cloudkitty-api" probeResult="failure" output="Get \"https://10.217.0.195:8889/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 17:01:52 crc kubenswrapper[4918]: I0319 17:01:52.004108 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"316eab80-100b-43b8-aecf-c5fc6868b742","Type":"ContainerStarted","Data":"d142a7ec8b858685e0ac4fed609f38c6d7044fdb2b11e2a4b23dab02a1dfcece"} Mar 19 17:01:53 crc kubenswrapper[4918]: I0319 17:01:53.905399 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 19 17:01:53 crc kubenswrapper[4918]: I0319 17:01:53.905824 4918 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 17:01:54 crc kubenswrapper[4918]: I0319 17:01:54.073870 4918 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 19 17:01:54 crc kubenswrapper[4918]: I0319 17:01:54.721008 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0" Mar 19 17:01:58 crc kubenswrapper[4918]: I0319 17:01:58.211776 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:01:58 crc kubenswrapper[4918]: I0319 17:01:58.212102 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:02:00 crc kubenswrapper[4918]: I0319 17:02:00.140432 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565662-dsxdv"] Mar 19 17:02:00 crc kubenswrapper[4918]: E0319 17:02:00.141322 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82332b68-a377-45a5-bf3c-d97caa5733ff" containerName="placement-log" Mar 19 17:02:00 crc kubenswrapper[4918]: I0319 17:02:00.141341 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="82332b68-a377-45a5-bf3c-d97caa5733ff" containerName="placement-log" Mar 19 17:02:00 crc kubenswrapper[4918]: E0319 17:02:00.141374 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82332b68-a377-45a5-bf3c-d97caa5733ff" containerName="placement-api" Mar 19 17:02:00 crc kubenswrapper[4918]: I0319 17:02:00.141385 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="82332b68-a377-45a5-bf3c-d97caa5733ff" containerName="placement-api" Mar 19 17:02:00 crc 
kubenswrapper[4918]: I0319 17:02:00.141654 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="82332b68-a377-45a5-bf3c-d97caa5733ff" containerName="placement-api" Mar 19 17:02:00 crc kubenswrapper[4918]: I0319 17:02:00.141692 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="82332b68-a377-45a5-bf3c-d97caa5733ff" containerName="placement-log" Mar 19 17:02:00 crc kubenswrapper[4918]: I0319 17:02:00.142807 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565662-dsxdv" Mar 19 17:02:00 crc kubenswrapper[4918]: I0319 17:02:00.147140 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:02:00 crc kubenswrapper[4918]: I0319 17:02:00.147219 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:02:00 crc kubenswrapper[4918]: I0319 17:02:00.147279 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:02:00 crc kubenswrapper[4918]: I0319 17:02:00.152608 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565662-dsxdv"] Mar 19 17:02:00 crc kubenswrapper[4918]: I0319 17:02:00.215847 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsh6l\" (UniqueName: \"kubernetes.io/projected/5564c9bc-c959-4612-a27b-3b3272ca0bf3-kube-api-access-xsh6l\") pod \"auto-csr-approver-29565662-dsxdv\" (UID: \"5564c9bc-c959-4612-a27b-3b3272ca0bf3\") " pod="openshift-infra/auto-csr-approver-29565662-dsxdv" Mar 19 17:02:00 crc kubenswrapper[4918]: I0319 17:02:00.317796 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsh6l\" (UniqueName: \"kubernetes.io/projected/5564c9bc-c959-4612-a27b-3b3272ca0bf3-kube-api-access-xsh6l\") pod 
\"auto-csr-approver-29565662-dsxdv\" (UID: \"5564c9bc-c959-4612-a27b-3b3272ca0bf3\") " pod="openshift-infra/auto-csr-approver-29565662-dsxdv" Mar 19 17:02:00 crc kubenswrapper[4918]: I0319 17:02:00.342784 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsh6l\" (UniqueName: \"kubernetes.io/projected/5564c9bc-c959-4612-a27b-3b3272ca0bf3-kube-api-access-xsh6l\") pod \"auto-csr-approver-29565662-dsxdv\" (UID: \"5564c9bc-c959-4612-a27b-3b3272ca0bf3\") " pod="openshift-infra/auto-csr-approver-29565662-dsxdv" Mar 19 17:02:00 crc kubenswrapper[4918]: I0319 17:02:00.470167 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565662-dsxdv" Mar 19 17:02:01 crc kubenswrapper[4918]: I0319 17:02:01.091595 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565662-dsxdv"] Mar 19 17:02:01 crc kubenswrapper[4918]: W0319 17:02:01.093900 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5564c9bc_c959_4612_a27b_3b3272ca0bf3.slice/crio-b661f0ecc05f8da2e7c31369652bd5a106d2d1dcdeb813dfcf7077187c29c2ee WatchSource:0}: Error finding container b661f0ecc05f8da2e7c31369652bd5a106d2d1dcdeb813dfcf7077187c29c2ee: Status 404 returned error can't find the container with id b661f0ecc05f8da2e7c31369652bd5a106d2d1dcdeb813dfcf7077187c29c2ee Mar 19 17:02:01 crc kubenswrapper[4918]: I0319 17:02:01.111992 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6x2b9" event={"ID":"64832abb-ee37-4a90-9fae-8eff52ff08e2","Type":"ContainerStarted","Data":"c10802d2e613e8f6ad21c2d97f95480e1a21be7b013e652fc31377e981a53114"} Mar 19 17:02:01 crc kubenswrapper[4918]: I0319 17:02:01.121421 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"316eab80-100b-43b8-aecf-c5fc6868b742","Type":"ContainerStarted","Data":"cdfed4b05ba5c658085a50d4aeaec811981ad45cc06862974b631cdd5bd1ab01"} Mar 19 17:02:01 crc kubenswrapper[4918]: I0319 17:02:01.121620 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 17:02:01 crc kubenswrapper[4918]: I0319 17:02:01.121686 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="316eab80-100b-43b8-aecf-c5fc6868b742" containerName="sg-core" containerID="cri-o://d142a7ec8b858685e0ac4fed609f38c6d7044fdb2b11e2a4b23dab02a1dfcece" gracePeriod=30 Mar 19 17:02:01 crc kubenswrapper[4918]: I0319 17:02:01.121678 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="316eab80-100b-43b8-aecf-c5fc6868b742" containerName="ceilometer-central-agent" containerID="cri-o://a0f2ba32ac1f2399604e11d66703f7ecf03f3e14f6668e70aefecef81caf7386" gracePeriod=30 Mar 19 17:02:01 crc kubenswrapper[4918]: I0319 17:02:01.121721 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="316eab80-100b-43b8-aecf-c5fc6868b742" containerName="proxy-httpd" containerID="cri-o://cdfed4b05ba5c658085a50d4aeaec811981ad45cc06862974b631cdd5bd1ab01" gracePeriod=30 Mar 19 17:02:01 crc kubenswrapper[4918]: I0319 17:02:01.121726 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="316eab80-100b-43b8-aecf-c5fc6868b742" containerName="ceilometer-notification-agent" containerID="cri-o://68edc6b3240769361e8b259ad91c77e3489eef18006b29dc3643a6bd7b00391d" gracePeriod=30 Mar 19 17:02:01 crc kubenswrapper[4918]: I0319 17:02:01.137644 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-6x2b9" podStartSLOduration=2.223021005 podStartE2EDuration="18.137624975s" podCreationTimestamp="2026-03-19 
17:01:43 +0000 UTC" firstStartedPulling="2026-03-19 17:01:44.440367326 +0000 UTC m=+1316.562566574" lastFinishedPulling="2026-03-19 17:02:00.354971296 +0000 UTC m=+1332.477170544" observedRunningTime="2026-03-19 17:02:01.128589148 +0000 UTC m=+1333.250788396" watchObservedRunningTime="2026-03-19 17:02:01.137624975 +0000 UTC m=+1333.259824223" Mar 19 17:02:01 crc kubenswrapper[4918]: I0319 17:02:01.166548 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.669279644 podStartE2EDuration="16.166507437s" podCreationTimestamp="2026-03-19 17:01:45 +0000 UTC" firstStartedPulling="2026-03-19 17:01:46.857702402 +0000 UTC m=+1318.979901650" lastFinishedPulling="2026-03-19 17:02:00.354930195 +0000 UTC m=+1332.477129443" observedRunningTime="2026-03-19 17:02:01.163848994 +0000 UTC m=+1333.286048242" watchObservedRunningTime="2026-03-19 17:02:01.166507437 +0000 UTC m=+1333.288706685" Mar 19 17:02:02 crc kubenswrapper[4918]: I0319 17:02:02.132374 4918 generic.go:334] "Generic (PLEG): container finished" podID="316eab80-100b-43b8-aecf-c5fc6868b742" containerID="cdfed4b05ba5c658085a50d4aeaec811981ad45cc06862974b631cdd5bd1ab01" exitCode=0 Mar 19 17:02:02 crc kubenswrapper[4918]: I0319 17:02:02.133334 4918 generic.go:334] "Generic (PLEG): container finished" podID="316eab80-100b-43b8-aecf-c5fc6868b742" containerID="d142a7ec8b858685e0ac4fed609f38c6d7044fdb2b11e2a4b23dab02a1dfcece" exitCode=2 Mar 19 17:02:02 crc kubenswrapper[4918]: I0319 17:02:02.133404 4918 generic.go:334] "Generic (PLEG): container finished" podID="316eab80-100b-43b8-aecf-c5fc6868b742" containerID="68edc6b3240769361e8b259ad91c77e3489eef18006b29dc3643a6bd7b00391d" exitCode=0 Mar 19 17:02:02 crc kubenswrapper[4918]: I0319 17:02:02.132455 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"316eab80-100b-43b8-aecf-c5fc6868b742","Type":"ContainerDied","Data":"cdfed4b05ba5c658085a50d4aeaec811981ad45cc06862974b631cdd5bd1ab01"} Mar 19 17:02:02 crc kubenswrapper[4918]: I0319 17:02:02.133610 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"316eab80-100b-43b8-aecf-c5fc6868b742","Type":"ContainerDied","Data":"d142a7ec8b858685e0ac4fed609f38c6d7044fdb2b11e2a4b23dab02a1dfcece"} Mar 19 17:02:02 crc kubenswrapper[4918]: I0319 17:02:02.133676 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"316eab80-100b-43b8-aecf-c5fc6868b742","Type":"ContainerDied","Data":"68edc6b3240769361e8b259ad91c77e3489eef18006b29dc3643a6bd7b00391d"} Mar 19 17:02:02 crc kubenswrapper[4918]: I0319 17:02:02.134894 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565662-dsxdv" event={"ID":"5564c9bc-c959-4612-a27b-3b3272ca0bf3","Type":"ContainerStarted","Data":"b661f0ecc05f8da2e7c31369652bd5a106d2d1dcdeb813dfcf7077187c29c2ee"} Mar 19 17:02:03 crc kubenswrapper[4918]: I0319 17:02:03.146265 4918 generic.go:334] "Generic (PLEG): container finished" podID="5564c9bc-c959-4612-a27b-3b3272ca0bf3" containerID="3f06b127cd2b4cef4cae2f38e797f7424a225679cd943c480ce30ea4f6303371" exitCode=0 Mar 19 17:02:03 crc kubenswrapper[4918]: I0319 17:02:03.146317 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565662-dsxdv" event={"ID":"5564c9bc-c959-4612-a27b-3b3272ca0bf3","Type":"ContainerDied","Data":"3f06b127cd2b4cef4cae2f38e797f7424a225679cd943c480ce30ea4f6303371"} Mar 19 17:02:04 crc kubenswrapper[4918]: I0319 17:02:04.604908 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565662-dsxdv" Mar 19 17:02:04 crc kubenswrapper[4918]: I0319 17:02:04.721783 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsh6l\" (UniqueName: \"kubernetes.io/projected/5564c9bc-c959-4612-a27b-3b3272ca0bf3-kube-api-access-xsh6l\") pod \"5564c9bc-c959-4612-a27b-3b3272ca0bf3\" (UID: \"5564c9bc-c959-4612-a27b-3b3272ca0bf3\") " Mar 19 17:02:04 crc kubenswrapper[4918]: I0319 17:02:04.736330 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5564c9bc-c959-4612-a27b-3b3272ca0bf3-kube-api-access-xsh6l" (OuterVolumeSpecName: "kube-api-access-xsh6l") pod "5564c9bc-c959-4612-a27b-3b3272ca0bf3" (UID: "5564c9bc-c959-4612-a27b-3b3272ca0bf3"). InnerVolumeSpecName "kube-api-access-xsh6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:02:04 crc kubenswrapper[4918]: I0319 17:02:04.824560 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsh6l\" (UniqueName: \"kubernetes.io/projected/5564c9bc-c959-4612-a27b-3b3272ca0bf3-kube-api-access-xsh6l\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.170684 4918 generic.go:334] "Generic (PLEG): container finished" podID="316eab80-100b-43b8-aecf-c5fc6868b742" containerID="a0f2ba32ac1f2399604e11d66703f7ecf03f3e14f6668e70aefecef81caf7386" exitCode=0 Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.170774 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"316eab80-100b-43b8-aecf-c5fc6868b742","Type":"ContainerDied","Data":"a0f2ba32ac1f2399604e11d66703f7ecf03f3e14f6668e70aefecef81caf7386"} Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.173298 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565662-dsxdv" 
event={"ID":"5564c9bc-c959-4612-a27b-3b3272ca0bf3","Type":"ContainerDied","Data":"b661f0ecc05f8da2e7c31369652bd5a106d2d1dcdeb813dfcf7077187c29c2ee"} Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.173320 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b661f0ecc05f8da2e7c31369652bd5a106d2d1dcdeb813dfcf7077187c29c2ee" Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.173389 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565662-dsxdv" Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.278540 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.434842 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-sg-core-conf-yaml\") pod \"316eab80-100b-43b8-aecf-c5fc6868b742\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.434894 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/316eab80-100b-43b8-aecf-c5fc6868b742-log-httpd\") pod \"316eab80-100b-43b8-aecf-c5fc6868b742\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.435023 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql84s\" (UniqueName: \"kubernetes.io/projected/316eab80-100b-43b8-aecf-c5fc6868b742-kube-api-access-ql84s\") pod \"316eab80-100b-43b8-aecf-c5fc6868b742\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.435067 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-config-data\") pod \"316eab80-100b-43b8-aecf-c5fc6868b742\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.435134 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/316eab80-100b-43b8-aecf-c5fc6868b742-run-httpd\") pod \"316eab80-100b-43b8-aecf-c5fc6868b742\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.435159 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-ceilometer-tls-certs\") pod \"316eab80-100b-43b8-aecf-c5fc6868b742\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.435261 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-combined-ca-bundle\") pod \"316eab80-100b-43b8-aecf-c5fc6868b742\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.435309 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-scripts\") pod \"316eab80-100b-43b8-aecf-c5fc6868b742\" (UID: \"316eab80-100b-43b8-aecf-c5fc6868b742\") " Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.435866 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/316eab80-100b-43b8-aecf-c5fc6868b742-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "316eab80-100b-43b8-aecf-c5fc6868b742" (UID: "316eab80-100b-43b8-aecf-c5fc6868b742"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.436135 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/316eab80-100b-43b8-aecf-c5fc6868b742-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "316eab80-100b-43b8-aecf-c5fc6868b742" (UID: "316eab80-100b-43b8-aecf-c5fc6868b742"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.436977 4918 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/316eab80-100b-43b8-aecf-c5fc6868b742-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.437009 4918 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/316eab80-100b-43b8-aecf-c5fc6868b742-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.439843 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-scripts" (OuterVolumeSpecName: "scripts") pod "316eab80-100b-43b8-aecf-c5fc6868b742" (UID: "316eab80-100b-43b8-aecf-c5fc6868b742"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.442017 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/316eab80-100b-43b8-aecf-c5fc6868b742-kube-api-access-ql84s" (OuterVolumeSpecName: "kube-api-access-ql84s") pod "316eab80-100b-43b8-aecf-c5fc6868b742" (UID: "316eab80-100b-43b8-aecf-c5fc6868b742"). InnerVolumeSpecName "kube-api-access-ql84s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.468980 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "316eab80-100b-43b8-aecf-c5fc6868b742" (UID: "316eab80-100b-43b8-aecf-c5fc6868b742"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.502787 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "316eab80-100b-43b8-aecf-c5fc6868b742" (UID: "316eab80-100b-43b8-aecf-c5fc6868b742"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.510925 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "316eab80-100b-43b8-aecf-c5fc6868b742" (UID: "316eab80-100b-43b8-aecf-c5fc6868b742"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.539614 4918 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.539868 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.539987 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.540090 4918 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.540186 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql84s\" (UniqueName: \"kubernetes.io/projected/316eab80-100b-43b8-aecf-c5fc6868b742-kube-api-access-ql84s\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.550744 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-config-data" (OuterVolumeSpecName: "config-data") pod "316eab80-100b-43b8-aecf-c5fc6868b742" (UID: "316eab80-100b-43b8-aecf-c5fc6868b742"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.642556 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316eab80-100b-43b8-aecf-c5fc6868b742-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.682341 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565656-p46j9"] Mar 19 17:02:05 crc kubenswrapper[4918]: I0319 17:02:05.693638 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565656-p46j9"] Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.185938 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"316eab80-100b-43b8-aecf-c5fc6868b742","Type":"ContainerDied","Data":"14148b63633160aede4e5176ad219183348b0c523dc4597c21583f1903e3d18b"} Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.185999 4918 scope.go:117] "RemoveContainer" containerID="cdfed4b05ba5c658085a50d4aeaec811981ad45cc06862974b631cdd5bd1ab01" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.186170 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.215576 4918 scope.go:117] "RemoveContainer" containerID="d142a7ec8b858685e0ac4fed609f38c6d7044fdb2b11e2a4b23dab02a1dfcece" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.233931 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.258146 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.263751 4918 scope.go:117] "RemoveContainer" containerID="68edc6b3240769361e8b259ad91c77e3489eef18006b29dc3643a6bd7b00391d" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.277763 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:02:06 crc kubenswrapper[4918]: E0319 17:02:06.279139 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="316eab80-100b-43b8-aecf-c5fc6868b742" containerName="sg-core" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.279162 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="316eab80-100b-43b8-aecf-c5fc6868b742" containerName="sg-core" Mar 19 17:02:06 crc kubenswrapper[4918]: E0319 17:02:06.279193 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="316eab80-100b-43b8-aecf-c5fc6868b742" containerName="ceilometer-central-agent" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.279201 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="316eab80-100b-43b8-aecf-c5fc6868b742" containerName="ceilometer-central-agent" Mar 19 17:02:06 crc kubenswrapper[4918]: E0319 17:02:06.279216 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="316eab80-100b-43b8-aecf-c5fc6868b742" containerName="proxy-httpd" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.279223 4918 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="316eab80-100b-43b8-aecf-c5fc6868b742" containerName="proxy-httpd" Mar 19 17:02:06 crc kubenswrapper[4918]: E0319 17:02:06.279239 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="316eab80-100b-43b8-aecf-c5fc6868b742" containerName="ceilometer-notification-agent" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.279286 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="316eab80-100b-43b8-aecf-c5fc6868b742" containerName="ceilometer-notification-agent" Mar 19 17:02:06 crc kubenswrapper[4918]: E0319 17:02:06.279320 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5564c9bc-c959-4612-a27b-3b3272ca0bf3" containerName="oc" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.279328 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="5564c9bc-c959-4612-a27b-3b3272ca0bf3" containerName="oc" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.279689 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="5564c9bc-c959-4612-a27b-3b3272ca0bf3" containerName="oc" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.279713 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="316eab80-100b-43b8-aecf-c5fc6868b742" containerName="ceilometer-notification-agent" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.279732 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="316eab80-100b-43b8-aecf-c5fc6868b742" containerName="ceilometer-central-agent" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.279758 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="316eab80-100b-43b8-aecf-c5fc6868b742" containerName="proxy-httpd" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.279773 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="316eab80-100b-43b8-aecf-c5fc6868b742" containerName="sg-core" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.284490 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.288733 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.289025 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.289273 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.299550 4918 scope.go:117] "RemoveContainer" containerID="a0f2ba32ac1f2399604e11d66703f7ecf03f3e14f6668e70aefecef81caf7386" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.313058 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.459611 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-scripts\") pod \"ceilometer-0\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " pod="openstack/ceilometer-0" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.459661 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " pod="openstack/ceilometer-0" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.459690 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj549\" (UniqueName: \"kubernetes.io/projected/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-kube-api-access-sj549\") pod \"ceilometer-0\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " 
pod="openstack/ceilometer-0" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.459717 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " pod="openstack/ceilometer-0" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.459757 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-run-httpd\") pod \"ceilometer-0\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " pod="openstack/ceilometer-0" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.459815 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-config-data\") pod \"ceilometer-0\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " pod="openstack/ceilometer-0" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.459877 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-log-httpd\") pod \"ceilometer-0\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " pod="openstack/ceilometer-0" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.459894 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " pod="openstack/ceilometer-0" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.561637 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-run-httpd\") pod \"ceilometer-0\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " pod="openstack/ceilometer-0" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.561728 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-config-data\") pod \"ceilometer-0\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " pod="openstack/ceilometer-0" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.561795 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-log-httpd\") pod \"ceilometer-0\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " pod="openstack/ceilometer-0" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.561815 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " pod="openstack/ceilometer-0" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.561895 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-scripts\") pod \"ceilometer-0\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " pod="openstack/ceilometer-0" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.561915 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " 
pod="openstack/ceilometer-0" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.561943 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj549\" (UniqueName: \"kubernetes.io/projected/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-kube-api-access-sj549\") pod \"ceilometer-0\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " pod="openstack/ceilometer-0" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.561967 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " pod="openstack/ceilometer-0" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.563670 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-log-httpd\") pod \"ceilometer-0\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " pod="openstack/ceilometer-0" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.564040 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-run-httpd\") pod \"ceilometer-0\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " pod="openstack/ceilometer-0" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.566270 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-scripts\") pod \"ceilometer-0\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " pod="openstack/ceilometer-0" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.567699 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " pod="openstack/ceilometer-0" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.568158 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " pod="openstack/ceilometer-0" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.570506 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-config-data\") pod \"ceilometer-0\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " pod="openstack/ceilometer-0" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.585498 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " pod="openstack/ceilometer-0" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.590181 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj549\" (UniqueName: \"kubernetes.io/projected/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-kube-api-access-sj549\") pod \"ceilometer-0\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " pod="openstack/ceilometer-0" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.603674 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="316eab80-100b-43b8-aecf-c5fc6868b742" path="/var/lib/kubelet/pods/316eab80-100b-43b8-aecf-c5fc6868b742/volumes" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.614632 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="cb680de7-217b-493c-89c3-81f0b3e801fb" path="/var/lib/kubelet/pods/cb680de7-217b-493c-89c3-81f0b3e801fb/volumes" Mar 19 17:02:06 crc kubenswrapper[4918]: I0319 17:02:06.631843 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:02:07 crc kubenswrapper[4918]: I0319 17:02:07.103195 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:02:07 crc kubenswrapper[4918]: W0319 17:02:07.110492 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f1d4e83_b6fb_4cb6_8d59_51d5e46cb991.slice/crio-dbdbd531915031f88ec60c209d3b98ca1865589053ff24d68f853b432ce5e362 WatchSource:0}: Error finding container dbdbd531915031f88ec60c209d3b98ca1865589053ff24d68f853b432ce5e362: Status 404 returned error can't find the container with id dbdbd531915031f88ec60c209d3b98ca1865589053ff24d68f853b432ce5e362 Mar 19 17:02:07 crc kubenswrapper[4918]: I0319 17:02:07.199383 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991","Type":"ContainerStarted","Data":"dbdbd531915031f88ec60c209d3b98ca1865589053ff24d68f853b432ce5e362"} Mar 19 17:02:08 crc kubenswrapper[4918]: I0319 17:02:08.214053 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991","Type":"ContainerStarted","Data":"1a1ddd0e0dd83997c37633b85dc11f195bdae674e008b6854a1baee8da1e47d9"} Mar 19 17:02:09 crc kubenswrapper[4918]: I0319 17:02:09.230692 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991","Type":"ContainerStarted","Data":"fbeda504306c606057046f59a417b429e670fd5269a99f50b100eac3ff11f6c3"} Mar 19 17:02:10 crc kubenswrapper[4918]: I0319 17:02:10.247588 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991","Type":"ContainerStarted","Data":"e9451e14e52894f95e15ec3864eeed2e4cd8167b109cc5c5976449cdf892f1ff"} Mar 19 17:02:14 crc kubenswrapper[4918]: I0319 17:02:14.306914 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991","Type":"ContainerStarted","Data":"4ebc0a3ae80362bce5d88dc1001888df9f8eec5be560ec6ffdea7ea5f585107b"} Mar 19 17:02:14 crc kubenswrapper[4918]: I0319 17:02:14.307397 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 17:02:14 crc kubenswrapper[4918]: I0319 17:02:14.308304 4918 generic.go:334] "Generic (PLEG): container finished" podID="64832abb-ee37-4a90-9fae-8eff52ff08e2" containerID="c10802d2e613e8f6ad21c2d97f95480e1a21be7b013e652fc31377e981a53114" exitCode=0 Mar 19 17:02:14 crc kubenswrapper[4918]: I0319 17:02:14.308326 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6x2b9" event={"ID":"64832abb-ee37-4a90-9fae-8eff52ff08e2","Type":"ContainerDied","Data":"c10802d2e613e8f6ad21c2d97f95480e1a21be7b013e652fc31377e981a53114"} Mar 19 17:02:14 crc kubenswrapper[4918]: I0319 17:02:14.342452 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.139877885 podStartE2EDuration="8.342432168s" podCreationTimestamp="2026-03-19 17:02:06 +0000 UTC" firstStartedPulling="2026-03-19 17:02:07.113043147 +0000 UTC m=+1339.235242395" lastFinishedPulling="2026-03-19 17:02:13.31559742 +0000 UTC m=+1345.437796678" observedRunningTime="2026-03-19 17:02:14.330432059 +0000 UTC m=+1346.452631317" watchObservedRunningTime="2026-03-19 17:02:14.342432168 +0000 UTC m=+1346.464631416" Mar 19 17:02:15 crc kubenswrapper[4918]: I0319 17:02:15.738250 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6x2b9" Mar 19 17:02:15 crc kubenswrapper[4918]: I0319 17:02:15.853597 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64832abb-ee37-4a90-9fae-8eff52ff08e2-config-data\") pod \"64832abb-ee37-4a90-9fae-8eff52ff08e2\" (UID: \"64832abb-ee37-4a90-9fae-8eff52ff08e2\") " Mar 19 17:02:15 crc kubenswrapper[4918]: I0319 17:02:15.853674 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64832abb-ee37-4a90-9fae-8eff52ff08e2-scripts\") pod \"64832abb-ee37-4a90-9fae-8eff52ff08e2\" (UID: \"64832abb-ee37-4a90-9fae-8eff52ff08e2\") " Mar 19 17:02:15 crc kubenswrapper[4918]: I0319 17:02:15.853742 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64832abb-ee37-4a90-9fae-8eff52ff08e2-combined-ca-bundle\") pod \"64832abb-ee37-4a90-9fae-8eff52ff08e2\" (UID: \"64832abb-ee37-4a90-9fae-8eff52ff08e2\") " Mar 19 17:02:15 crc kubenswrapper[4918]: I0319 17:02:15.853882 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9k5w\" (UniqueName: \"kubernetes.io/projected/64832abb-ee37-4a90-9fae-8eff52ff08e2-kube-api-access-j9k5w\") pod \"64832abb-ee37-4a90-9fae-8eff52ff08e2\" (UID: \"64832abb-ee37-4a90-9fae-8eff52ff08e2\") " Mar 19 17:02:15 crc kubenswrapper[4918]: I0319 17:02:15.859979 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64832abb-ee37-4a90-9fae-8eff52ff08e2-kube-api-access-j9k5w" (OuterVolumeSpecName: "kube-api-access-j9k5w") pod "64832abb-ee37-4a90-9fae-8eff52ff08e2" (UID: "64832abb-ee37-4a90-9fae-8eff52ff08e2"). InnerVolumeSpecName "kube-api-access-j9k5w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:02:15 crc kubenswrapper[4918]: I0319 17:02:15.860327 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64832abb-ee37-4a90-9fae-8eff52ff08e2-scripts" (OuterVolumeSpecName: "scripts") pod "64832abb-ee37-4a90-9fae-8eff52ff08e2" (UID: "64832abb-ee37-4a90-9fae-8eff52ff08e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:02:15 crc kubenswrapper[4918]: I0319 17:02:15.895914 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64832abb-ee37-4a90-9fae-8eff52ff08e2-config-data" (OuterVolumeSpecName: "config-data") pod "64832abb-ee37-4a90-9fae-8eff52ff08e2" (UID: "64832abb-ee37-4a90-9fae-8eff52ff08e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:02:15 crc kubenswrapper[4918]: I0319 17:02:15.924737 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64832abb-ee37-4a90-9fae-8eff52ff08e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64832abb-ee37-4a90-9fae-8eff52ff08e2" (UID: "64832abb-ee37-4a90-9fae-8eff52ff08e2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:02:15 crc kubenswrapper[4918]: I0319 17:02:15.956272 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64832abb-ee37-4a90-9fae-8eff52ff08e2-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:15 crc kubenswrapper[4918]: I0319 17:02:15.956302 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64832abb-ee37-4a90-9fae-8eff52ff08e2-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:15 crc kubenswrapper[4918]: I0319 17:02:15.956314 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64832abb-ee37-4a90-9fae-8eff52ff08e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:15 crc kubenswrapper[4918]: I0319 17:02:15.956331 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9k5w\" (UniqueName: \"kubernetes.io/projected/64832abb-ee37-4a90-9fae-8eff52ff08e2-kube-api-access-j9k5w\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:16 crc kubenswrapper[4918]: I0319 17:02:16.327053 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6x2b9" event={"ID":"64832abb-ee37-4a90-9fae-8eff52ff08e2","Type":"ContainerDied","Data":"d01c9f644157834930ea246f0e710dbb61720ac1b8d3832acc33c96ba8f7b7c9"} Mar 19 17:02:16 crc kubenswrapper[4918]: I0319 17:02:16.327347 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d01c9f644157834930ea246f0e710dbb61720ac1b8d3832acc33c96ba8f7b7c9" Mar 19 17:02:16 crc kubenswrapper[4918]: I0319 17:02:16.327121 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6x2b9" Mar 19 17:02:16 crc kubenswrapper[4918]: I0319 17:02:16.501135 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 19 17:02:16 crc kubenswrapper[4918]: E0319 17:02:16.501672 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64832abb-ee37-4a90-9fae-8eff52ff08e2" containerName="nova-cell0-conductor-db-sync" Mar 19 17:02:16 crc kubenswrapper[4918]: I0319 17:02:16.501706 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="64832abb-ee37-4a90-9fae-8eff52ff08e2" containerName="nova-cell0-conductor-db-sync" Mar 19 17:02:16 crc kubenswrapper[4918]: I0319 17:02:16.501995 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="64832abb-ee37-4a90-9fae-8eff52ff08e2" containerName="nova-cell0-conductor-db-sync" Mar 19 17:02:16 crc kubenswrapper[4918]: I0319 17:02:16.502930 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 19 17:02:16 crc kubenswrapper[4918]: I0319 17:02:16.506077 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 19 17:02:16 crc kubenswrapper[4918]: I0319 17:02:16.520979 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-f5rj7" Mar 19 17:02:16 crc kubenswrapper[4918]: I0319 17:02:16.532223 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 19 17:02:16 crc kubenswrapper[4918]: I0319 17:02:16.567270 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a9c491-28ae-4d91-8366-7e449bbf8d8e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"05a9c491-28ae-4d91-8366-7e449bbf8d8e\") " pod="openstack/nova-cell0-conductor-0" Mar 19 17:02:16 crc kubenswrapper[4918]: I0319 
17:02:16.567376 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w65p7\" (UniqueName: \"kubernetes.io/projected/05a9c491-28ae-4d91-8366-7e449bbf8d8e-kube-api-access-w65p7\") pod \"nova-cell0-conductor-0\" (UID: \"05a9c491-28ae-4d91-8366-7e449bbf8d8e\") " pod="openstack/nova-cell0-conductor-0" Mar 19 17:02:16 crc kubenswrapper[4918]: I0319 17:02:16.567560 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a9c491-28ae-4d91-8366-7e449bbf8d8e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"05a9c491-28ae-4d91-8366-7e449bbf8d8e\") " pod="openstack/nova-cell0-conductor-0" Mar 19 17:02:16 crc kubenswrapper[4918]: I0319 17:02:16.669581 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a9c491-28ae-4d91-8366-7e449bbf8d8e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"05a9c491-28ae-4d91-8366-7e449bbf8d8e\") " pod="openstack/nova-cell0-conductor-0" Mar 19 17:02:16 crc kubenswrapper[4918]: I0319 17:02:16.669782 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a9c491-28ae-4d91-8366-7e449bbf8d8e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"05a9c491-28ae-4d91-8366-7e449bbf8d8e\") " pod="openstack/nova-cell0-conductor-0" Mar 19 17:02:16 crc kubenswrapper[4918]: I0319 17:02:16.669864 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w65p7\" (UniqueName: \"kubernetes.io/projected/05a9c491-28ae-4d91-8366-7e449bbf8d8e-kube-api-access-w65p7\") pod \"nova-cell0-conductor-0\" (UID: \"05a9c491-28ae-4d91-8366-7e449bbf8d8e\") " pod="openstack/nova-cell0-conductor-0" Mar 19 17:02:16 crc kubenswrapper[4918]: I0319 17:02:16.677016 4918 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a9c491-28ae-4d91-8366-7e449bbf8d8e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"05a9c491-28ae-4d91-8366-7e449bbf8d8e\") " pod="openstack/nova-cell0-conductor-0" Mar 19 17:02:16 crc kubenswrapper[4918]: I0319 17:02:16.677088 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a9c491-28ae-4d91-8366-7e449bbf8d8e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"05a9c491-28ae-4d91-8366-7e449bbf8d8e\") " pod="openstack/nova-cell0-conductor-0" Mar 19 17:02:16 crc kubenswrapper[4918]: I0319 17:02:16.708063 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w65p7\" (UniqueName: \"kubernetes.io/projected/05a9c491-28ae-4d91-8366-7e449bbf8d8e-kube-api-access-w65p7\") pod \"nova-cell0-conductor-0\" (UID: \"05a9c491-28ae-4d91-8366-7e449bbf8d8e\") " pod="openstack/nova-cell0-conductor-0" Mar 19 17:02:16 crc kubenswrapper[4918]: I0319 17:02:16.824449 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 19 17:02:17 crc kubenswrapper[4918]: I0319 17:02:17.361537 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 19 17:02:17 crc kubenswrapper[4918]: I0319 17:02:17.387125 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"05a9c491-28ae-4d91-8366-7e449bbf8d8e","Type":"ContainerStarted","Data":"91f725a6ac3e482194cb3069e2711326723c9ff7dcf7c6fe537c449006c9e485"} Mar 19 17:02:18 crc kubenswrapper[4918]: I0319 17:02:18.396558 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"05a9c491-28ae-4d91-8366-7e449bbf8d8e","Type":"ContainerStarted","Data":"c0b0d87d87a28e6b236c979adb83e6477d6928659385938171e9ccf4e34d52a4"} Mar 19 17:02:18 crc kubenswrapper[4918]: I0319 17:02:18.397056 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 19 17:02:18 crc kubenswrapper[4918]: I0319 17:02:18.422061 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.422039629 podStartE2EDuration="2.422039629s" podCreationTimestamp="2026-03-19 17:02:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:02:18.410623956 +0000 UTC m=+1350.532823204" watchObservedRunningTime="2026-03-19 17:02:18.422039629 +0000 UTC m=+1350.544238877" Mar 19 17:02:20 crc kubenswrapper[4918]: I0319 17:02:20.749807 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:02:20 crc kubenswrapper[4918]: I0319 17:02:20.750241 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" containerName="ceilometer-central-agent" 
containerID="cri-o://1a1ddd0e0dd83997c37633b85dc11f195bdae674e008b6854a1baee8da1e47d9" gracePeriod=30 Mar 19 17:02:20 crc kubenswrapper[4918]: I0319 17:02:20.750301 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" containerName="sg-core" containerID="cri-o://e9451e14e52894f95e15ec3864eeed2e4cd8167b109cc5c5976449cdf892f1ff" gracePeriod=30 Mar 19 17:02:20 crc kubenswrapper[4918]: I0319 17:02:20.750320 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" containerName="proxy-httpd" containerID="cri-o://4ebc0a3ae80362bce5d88dc1001888df9f8eec5be560ec6ffdea7ea5f585107b" gracePeriod=30 Mar 19 17:02:20 crc kubenswrapper[4918]: I0319 17:02:20.750372 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" containerName="ceilometer-notification-agent" containerID="cri-o://fbeda504306c606057046f59a417b429e670fd5269a99f50b100eac3ff11f6c3" gracePeriod=30 Mar 19 17:02:21 crc kubenswrapper[4918]: I0319 17:02:21.426238 4918 generic.go:334] "Generic (PLEG): container finished" podID="7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" containerID="4ebc0a3ae80362bce5d88dc1001888df9f8eec5be560ec6ffdea7ea5f585107b" exitCode=0 Mar 19 17:02:21 crc kubenswrapper[4918]: I0319 17:02:21.426670 4918 generic.go:334] "Generic (PLEG): container finished" podID="7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" containerID="e9451e14e52894f95e15ec3864eeed2e4cd8167b109cc5c5976449cdf892f1ff" exitCode=2 Mar 19 17:02:21 crc kubenswrapper[4918]: I0319 17:02:21.426698 4918 generic.go:334] "Generic (PLEG): container finished" podID="7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" containerID="1a1ddd0e0dd83997c37633b85dc11f195bdae674e008b6854a1baee8da1e47d9" exitCode=0 Mar 19 17:02:21 crc kubenswrapper[4918]: I0319 17:02:21.426325 4918 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991","Type":"ContainerDied","Data":"4ebc0a3ae80362bce5d88dc1001888df9f8eec5be560ec6ffdea7ea5f585107b"} Mar 19 17:02:21 crc kubenswrapper[4918]: I0319 17:02:21.426766 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991","Type":"ContainerDied","Data":"e9451e14e52894f95e15ec3864eeed2e4cd8167b109cc5c5976449cdf892f1ff"} Mar 19 17:02:21 crc kubenswrapper[4918]: I0319 17:02:21.426800 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991","Type":"ContainerDied","Data":"1a1ddd0e0dd83997c37633b85dc11f195bdae674e008b6854a1baee8da1e47d9"} Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.184270 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.269193 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-log-httpd\") pod \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.269237 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-config-data\") pod \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.269269 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj549\" (UniqueName: \"kubernetes.io/projected/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-kube-api-access-sj549\") pod \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\" (UID: 
\"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.269312 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-combined-ca-bundle\") pod \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.269387 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-ceilometer-tls-certs\") pod \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.269461 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-scripts\") pod \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.269481 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-run-httpd\") pod \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.269564 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-sg-core-conf-yaml\") pod \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\" (UID: \"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991\") " Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.269912 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" (UID: "7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.270017 4918 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.271753 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" (UID: "7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.275591 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-kube-api-access-sj549" (OuterVolumeSpecName: "kube-api-access-sj549") pod "7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" (UID: "7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991"). InnerVolumeSpecName "kube-api-access-sj549". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.277930 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-scripts" (OuterVolumeSpecName: "scripts") pod "7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" (UID: "7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.345471 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" (UID: "7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.376198 4918 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.376230 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj549\" (UniqueName: \"kubernetes.io/projected/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-kube-api-access-sj549\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.376239 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.376249 4918 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.417237 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" (UID: "7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.422123 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" (UID: "7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.441865 4918 generic.go:334] "Generic (PLEG): container finished" podID="7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" containerID="fbeda504306c606057046f59a417b429e670fd5269a99f50b100eac3ff11f6c3" exitCode=0 Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.441905 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991","Type":"ContainerDied","Data":"fbeda504306c606057046f59a417b429e670fd5269a99f50b100eac3ff11f6c3"} Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.441932 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991","Type":"ContainerDied","Data":"dbdbd531915031f88ec60c209d3b98ca1865589053ff24d68f853b432ce5e362"} Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.441947 4918 scope.go:117] "RemoveContainer" containerID="4ebc0a3ae80362bce5d88dc1001888df9f8eec5be560ec6ffdea7ea5f585107b" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.442069 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.462857 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-config-data" (OuterVolumeSpecName: "config-data") pod "7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" (UID: "7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.471387 4918 scope.go:117] "RemoveContainer" containerID="e9451e14e52894f95e15ec3864eeed2e4cd8167b109cc5c5976449cdf892f1ff" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.477807 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.477840 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.477857 4918 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.495884 4918 scope.go:117] "RemoveContainer" containerID="fbeda504306c606057046f59a417b429e670fd5269a99f50b100eac3ff11f6c3" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.515137 4918 scope.go:117] "RemoveContainer" containerID="1a1ddd0e0dd83997c37633b85dc11f195bdae674e008b6854a1baee8da1e47d9" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.539417 4918 scope.go:117] "RemoveContainer" 
containerID="4ebc0a3ae80362bce5d88dc1001888df9f8eec5be560ec6ffdea7ea5f585107b" Mar 19 17:02:22 crc kubenswrapper[4918]: E0319 17:02:22.539748 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ebc0a3ae80362bce5d88dc1001888df9f8eec5be560ec6ffdea7ea5f585107b\": container with ID starting with 4ebc0a3ae80362bce5d88dc1001888df9f8eec5be560ec6ffdea7ea5f585107b not found: ID does not exist" containerID="4ebc0a3ae80362bce5d88dc1001888df9f8eec5be560ec6ffdea7ea5f585107b" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.539775 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ebc0a3ae80362bce5d88dc1001888df9f8eec5be560ec6ffdea7ea5f585107b"} err="failed to get container status \"4ebc0a3ae80362bce5d88dc1001888df9f8eec5be560ec6ffdea7ea5f585107b\": rpc error: code = NotFound desc = could not find container \"4ebc0a3ae80362bce5d88dc1001888df9f8eec5be560ec6ffdea7ea5f585107b\": container with ID starting with 4ebc0a3ae80362bce5d88dc1001888df9f8eec5be560ec6ffdea7ea5f585107b not found: ID does not exist" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.539793 4918 scope.go:117] "RemoveContainer" containerID="e9451e14e52894f95e15ec3864eeed2e4cd8167b109cc5c5976449cdf892f1ff" Mar 19 17:02:22 crc kubenswrapper[4918]: E0319 17:02:22.539986 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9451e14e52894f95e15ec3864eeed2e4cd8167b109cc5c5976449cdf892f1ff\": container with ID starting with e9451e14e52894f95e15ec3864eeed2e4cd8167b109cc5c5976449cdf892f1ff not found: ID does not exist" containerID="e9451e14e52894f95e15ec3864eeed2e4cd8167b109cc5c5976449cdf892f1ff" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.540007 4918 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e9451e14e52894f95e15ec3864eeed2e4cd8167b109cc5c5976449cdf892f1ff"} err="failed to get container status \"e9451e14e52894f95e15ec3864eeed2e4cd8167b109cc5c5976449cdf892f1ff\": rpc error: code = NotFound desc = could not find container \"e9451e14e52894f95e15ec3864eeed2e4cd8167b109cc5c5976449cdf892f1ff\": container with ID starting with e9451e14e52894f95e15ec3864eeed2e4cd8167b109cc5c5976449cdf892f1ff not found: ID does not exist" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.540019 4918 scope.go:117] "RemoveContainer" containerID="fbeda504306c606057046f59a417b429e670fd5269a99f50b100eac3ff11f6c3" Mar 19 17:02:22 crc kubenswrapper[4918]: E0319 17:02:22.540221 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbeda504306c606057046f59a417b429e670fd5269a99f50b100eac3ff11f6c3\": container with ID starting with fbeda504306c606057046f59a417b429e670fd5269a99f50b100eac3ff11f6c3 not found: ID does not exist" containerID="fbeda504306c606057046f59a417b429e670fd5269a99f50b100eac3ff11f6c3" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.540240 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbeda504306c606057046f59a417b429e670fd5269a99f50b100eac3ff11f6c3"} err="failed to get container status \"fbeda504306c606057046f59a417b429e670fd5269a99f50b100eac3ff11f6c3\": rpc error: code = NotFound desc = could not find container \"fbeda504306c606057046f59a417b429e670fd5269a99f50b100eac3ff11f6c3\": container with ID starting with fbeda504306c606057046f59a417b429e670fd5269a99f50b100eac3ff11f6c3 not found: ID does not exist" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.540261 4918 scope.go:117] "RemoveContainer" containerID="1a1ddd0e0dd83997c37633b85dc11f195bdae674e008b6854a1baee8da1e47d9" Mar 19 17:02:22 crc kubenswrapper[4918]: E0319 17:02:22.540432 4918 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1a1ddd0e0dd83997c37633b85dc11f195bdae674e008b6854a1baee8da1e47d9\": container with ID starting with 1a1ddd0e0dd83997c37633b85dc11f195bdae674e008b6854a1baee8da1e47d9 not found: ID does not exist" containerID="1a1ddd0e0dd83997c37633b85dc11f195bdae674e008b6854a1baee8da1e47d9" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.540458 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a1ddd0e0dd83997c37633b85dc11f195bdae674e008b6854a1baee8da1e47d9"} err="failed to get container status \"1a1ddd0e0dd83997c37633b85dc11f195bdae674e008b6854a1baee8da1e47d9\": rpc error: code = NotFound desc = could not find container \"1a1ddd0e0dd83997c37633b85dc11f195bdae674e008b6854a1baee8da1e47d9\": container with ID starting with 1a1ddd0e0dd83997c37633b85dc11f195bdae674e008b6854a1baee8da1e47d9 not found: ID does not exist" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.765134 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.775651 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.791622 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:02:22 crc kubenswrapper[4918]: E0319 17:02:22.792045 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" containerName="proxy-httpd" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.792058 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" containerName="proxy-httpd" Mar 19 17:02:22 crc kubenswrapper[4918]: E0319 17:02:22.792087 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" containerName="sg-core" Mar 19 17:02:22 crc 
kubenswrapper[4918]: I0319 17:02:22.792094 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" containerName="sg-core" Mar 19 17:02:22 crc kubenswrapper[4918]: E0319 17:02:22.792113 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" containerName="ceilometer-central-agent" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.792120 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" containerName="ceilometer-central-agent" Mar 19 17:02:22 crc kubenswrapper[4918]: E0319 17:02:22.792131 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" containerName="ceilometer-notification-agent" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.792137 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" containerName="ceilometer-notification-agent" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.792322 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" containerName="proxy-httpd" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.792349 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" containerName="ceilometer-central-agent" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.792364 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" containerName="sg-core" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.792379 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" containerName="ceilometer-notification-agent" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.794544 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.796437 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.796604 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.796710 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.803704 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.887050 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " pod="openstack/ceilometer-0" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.887210 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " pod="openstack/ceilometer-0" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.887394 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " pod="openstack/ceilometer-0" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.887471 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-scripts\") pod \"ceilometer-0\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " pod="openstack/ceilometer-0" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.888976 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/355bfabe-8824-4ee1-9bf5-8c808a15fa70-log-httpd\") pod \"ceilometer-0\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " pod="openstack/ceilometer-0" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.889011 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w79d5\" (UniqueName: \"kubernetes.io/projected/355bfabe-8824-4ee1-9bf5-8c808a15fa70-kube-api-access-w79d5\") pod \"ceilometer-0\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " pod="openstack/ceilometer-0" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.889067 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/355bfabe-8824-4ee1-9bf5-8c808a15fa70-run-httpd\") pod \"ceilometer-0\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " pod="openstack/ceilometer-0" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.889276 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-config-data\") pod \"ceilometer-0\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " pod="openstack/ceilometer-0" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.991263 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/355bfabe-8824-4ee1-9bf5-8c808a15fa70-log-httpd\") pod \"ceilometer-0\" (UID: 
\"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " pod="openstack/ceilometer-0" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.991312 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w79d5\" (UniqueName: \"kubernetes.io/projected/355bfabe-8824-4ee1-9bf5-8c808a15fa70-kube-api-access-w79d5\") pod \"ceilometer-0\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " pod="openstack/ceilometer-0" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.991340 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/355bfabe-8824-4ee1-9bf5-8c808a15fa70-run-httpd\") pod \"ceilometer-0\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " pod="openstack/ceilometer-0" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.991406 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-config-data\") pod \"ceilometer-0\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " pod="openstack/ceilometer-0" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.991456 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " pod="openstack/ceilometer-0" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.991534 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " pod="openstack/ceilometer-0" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.991589 4918 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " pod="openstack/ceilometer-0" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.991611 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-scripts\") pod \"ceilometer-0\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " pod="openstack/ceilometer-0" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.992451 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/355bfabe-8824-4ee1-9bf5-8c808a15fa70-run-httpd\") pod \"ceilometer-0\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " pod="openstack/ceilometer-0" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.992585 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/355bfabe-8824-4ee1-9bf5-8c808a15fa70-log-httpd\") pod \"ceilometer-0\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " pod="openstack/ceilometer-0" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.995244 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-scripts\") pod \"ceilometer-0\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " pod="openstack/ceilometer-0" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.995251 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " pod="openstack/ceilometer-0" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.995378 4918 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " pod="openstack/ceilometer-0" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.995757 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " pod="openstack/ceilometer-0" Mar 19 17:02:22 crc kubenswrapper[4918]: I0319 17:02:22.997399 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-config-data\") pod \"ceilometer-0\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " pod="openstack/ceilometer-0" Mar 19 17:02:23 crc kubenswrapper[4918]: I0319 17:02:23.018121 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w79d5\" (UniqueName: \"kubernetes.io/projected/355bfabe-8824-4ee1-9bf5-8c808a15fa70-kube-api-access-w79d5\") pod \"ceilometer-0\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " pod="openstack/ceilometer-0" Mar 19 17:02:23 crc kubenswrapper[4918]: I0319 17:02:23.110019 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:02:23 crc kubenswrapper[4918]: I0319 17:02:23.585652 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:02:24 crc kubenswrapper[4918]: I0319 17:02:24.467018 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"355bfabe-8824-4ee1-9bf5-8c808a15fa70","Type":"ContainerStarted","Data":"2332da299a6016bc1e4e3225394e23c2e0a60415941caa5b25083eaaa19ba94a"} Mar 19 17:02:24 crc kubenswrapper[4918]: I0319 17:02:24.467623 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"355bfabe-8824-4ee1-9bf5-8c808a15fa70","Type":"ContainerStarted","Data":"9d5501e5f7e95842c8cb93da7e7c18df2c93aae8316c74f1ce4194d4b1cc7b00"} Mar 19 17:02:24 crc kubenswrapper[4918]: I0319 17:02:24.600604 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991" path="/var/lib/kubelet/pods/7f1d4e83-b6fb-4cb6-8d59-51d5e46cb991/volumes" Mar 19 17:02:25 crc kubenswrapper[4918]: I0319 17:02:25.476977 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"355bfabe-8824-4ee1-9bf5-8c808a15fa70","Type":"ContainerStarted","Data":"dc17de0ca55333c2eb30c00cdbb4fe25b567faf113a8e1b2b604547e126de418"} Mar 19 17:02:26 crc kubenswrapper[4918]: I0319 17:02:26.487064 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"355bfabe-8824-4ee1-9bf5-8c808a15fa70","Type":"ContainerStarted","Data":"dcc05602faa287a2f356b257bcdfb941489ebba134aa714375178be7494c2e40"} Mar 19 17:02:26 crc kubenswrapper[4918]: I0319 17:02:26.857287 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.490929 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-nkvm2"] Mar 19 
17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.493656 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-nkvm2" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.495286 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.495850 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.503503 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-nkvm2"] Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.603924 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/780adca3-416f-431d-915e-7a546cfeae43-scripts\") pod \"nova-cell0-cell-mapping-nkvm2\" (UID: \"780adca3-416f-431d-915e-7a546cfeae43\") " pod="openstack/nova-cell0-cell-mapping-nkvm2" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.604001 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkwfk\" (UniqueName: \"kubernetes.io/projected/780adca3-416f-431d-915e-7a546cfeae43-kube-api-access-hkwfk\") pod \"nova-cell0-cell-mapping-nkvm2\" (UID: \"780adca3-416f-431d-915e-7a546cfeae43\") " pod="openstack/nova-cell0-cell-mapping-nkvm2" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.604142 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/780adca3-416f-431d-915e-7a546cfeae43-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-nkvm2\" (UID: \"780adca3-416f-431d-915e-7a546cfeae43\") " pod="openstack/nova-cell0-cell-mapping-nkvm2" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.604201 4918 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/780adca3-416f-431d-915e-7a546cfeae43-config-data\") pod \"nova-cell0-cell-mapping-nkvm2\" (UID: \"780adca3-416f-431d-915e-7a546cfeae43\") " pod="openstack/nova-cell0-cell-mapping-nkvm2" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.701494 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.703454 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.705833 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/780adca3-416f-431d-915e-7a546cfeae43-config-data\") pod \"nova-cell0-cell-mapping-nkvm2\" (UID: \"780adca3-416f-431d-915e-7a546cfeae43\") " pod="openstack/nova-cell0-cell-mapping-nkvm2" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.706036 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/780adca3-416f-431d-915e-7a546cfeae43-scripts\") pod \"nova-cell0-cell-mapping-nkvm2\" (UID: \"780adca3-416f-431d-915e-7a546cfeae43\") " pod="openstack/nova-cell0-cell-mapping-nkvm2" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.706147 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkwfk\" (UniqueName: \"kubernetes.io/projected/780adca3-416f-431d-915e-7a546cfeae43-kube-api-access-hkwfk\") pod \"nova-cell0-cell-mapping-nkvm2\" (UID: \"780adca3-416f-431d-915e-7a546cfeae43\") " pod="openstack/nova-cell0-cell-mapping-nkvm2" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.706290 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/780adca3-416f-431d-915e-7a546cfeae43-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-nkvm2\" (UID: \"780adca3-416f-431d-915e-7a546cfeae43\") " pod="openstack/nova-cell0-cell-mapping-nkvm2" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.706722 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.721571 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/780adca3-416f-431d-915e-7a546cfeae43-config-data\") pod \"nova-cell0-cell-mapping-nkvm2\" (UID: \"780adca3-416f-431d-915e-7a546cfeae43\") " pod="openstack/nova-cell0-cell-mapping-nkvm2" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.723272 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/780adca3-416f-431d-915e-7a546cfeae43-scripts\") pod \"nova-cell0-cell-mapping-nkvm2\" (UID: \"780adca3-416f-431d-915e-7a546cfeae43\") " pod="openstack/nova-cell0-cell-mapping-nkvm2" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.742292 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/780adca3-416f-431d-915e-7a546cfeae43-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-nkvm2\" (UID: \"780adca3-416f-431d-915e-7a546cfeae43\") " pod="openstack/nova-cell0-cell-mapping-nkvm2" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.779073 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.798486 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkwfk\" (UniqueName: \"kubernetes.io/projected/780adca3-416f-431d-915e-7a546cfeae43-kube-api-access-hkwfk\") pod \"nova-cell0-cell-mapping-nkvm2\" (UID: 
\"780adca3-416f-431d-915e-7a546cfeae43\") " pod="openstack/nova-cell0-cell-mapping-nkvm2" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.811055 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eea51a4c-c5eb-414d-a46b-cc704ff34914-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"eea51a4c-c5eb-414d-a46b-cc704ff34914\") " pod="openstack/nova-api-0" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.811149 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea51a4c-c5eb-414d-a46b-cc704ff34914-config-data\") pod \"nova-api-0\" (UID: \"eea51a4c-c5eb-414d-a46b-cc704ff34914\") " pod="openstack/nova-api-0" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.811184 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eea51a4c-c5eb-414d-a46b-cc704ff34914-logs\") pod \"nova-api-0\" (UID: \"eea51a4c-c5eb-414d-a46b-cc704ff34914\") " pod="openstack/nova-api-0" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.811308 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56sls\" (UniqueName: \"kubernetes.io/projected/eea51a4c-c5eb-414d-a46b-cc704ff34914-kube-api-access-56sls\") pod \"nova-api-0\" (UID: \"eea51a4c-c5eb-414d-a46b-cc704ff34914\") " pod="openstack/nova-api-0" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.822030 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-nkvm2" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.826493 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.885736 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.892869 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.941367 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x58f7\" (UniqueName: \"kubernetes.io/projected/971dce2f-fa89-4b8e-b09e-a0b44248d66b-kube-api-access-x58f7\") pod \"nova-metadata-0\" (UID: \"971dce2f-fa89-4b8e-b09e-a0b44248d66b\") " pod="openstack/nova-metadata-0" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.941894 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eea51a4c-c5eb-414d-a46b-cc704ff34914-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"eea51a4c-c5eb-414d-a46b-cc704ff34914\") " pod="openstack/nova-api-0" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.941996 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971dce2f-fa89-4b8e-b09e-a0b44248d66b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"971dce2f-fa89-4b8e-b09e-a0b44248d66b\") " pod="openstack/nova-metadata-0" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.942125 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea51a4c-c5eb-414d-a46b-cc704ff34914-config-data\") pod \"nova-api-0\" (UID: 
\"eea51a4c-c5eb-414d-a46b-cc704ff34914\") " pod="openstack/nova-api-0" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.942199 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eea51a4c-c5eb-414d-a46b-cc704ff34914-logs\") pod \"nova-api-0\" (UID: \"eea51a4c-c5eb-414d-a46b-cc704ff34914\") " pod="openstack/nova-api-0" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.942369 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971dce2f-fa89-4b8e-b09e-a0b44248d66b-config-data\") pod \"nova-metadata-0\" (UID: \"971dce2f-fa89-4b8e-b09e-a0b44248d66b\") " pod="openstack/nova-metadata-0" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.942579 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56sls\" (UniqueName: \"kubernetes.io/projected/eea51a4c-c5eb-414d-a46b-cc704ff34914-kube-api-access-56sls\") pod \"nova-api-0\" (UID: \"eea51a4c-c5eb-414d-a46b-cc704ff34914\") " pod="openstack/nova-api-0" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.942704 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/971dce2f-fa89-4b8e-b09e-a0b44248d66b-logs\") pod \"nova-metadata-0\" (UID: \"971dce2f-fa89-4b8e-b09e-a0b44248d66b\") " pod="openstack/nova-metadata-0" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.944042 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eea51a4c-c5eb-414d-a46b-cc704ff34914-logs\") pod \"nova-api-0\" (UID: \"eea51a4c-c5eb-414d-a46b-cc704ff34914\") " pod="openstack/nova-api-0" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.961154 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 17:02:27 
crc kubenswrapper[4918]: I0319 17:02:27.967227 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eea51a4c-c5eb-414d-a46b-cc704ff34914-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"eea51a4c-c5eb-414d-a46b-cc704ff34914\") " pod="openstack/nova-api-0" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.969060 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea51a4c-c5eb-414d-a46b-cc704ff34914-config-data\") pod \"nova-api-0\" (UID: \"eea51a4c-c5eb-414d-a46b-cc704ff34914\") " pod="openstack/nova-api-0" Mar 19 17:02:27 crc kubenswrapper[4918]: I0319 17:02:27.983147 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56sls\" (UniqueName: \"kubernetes.io/projected/eea51a4c-c5eb-414d-a46b-cc704ff34914-kube-api-access-56sls\") pod \"nova-api-0\" (UID: \"eea51a4c-c5eb-414d-a46b-cc704ff34914\") " pod="openstack/nova-api-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.019119 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.020472 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.031284 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.032006 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.038227 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.045601 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x58f7\" (UniqueName: \"kubernetes.io/projected/971dce2f-fa89-4b8e-b09e-a0b44248d66b-kube-api-access-x58f7\") pod \"nova-metadata-0\" (UID: \"971dce2f-fa89-4b8e-b09e-a0b44248d66b\") " pod="openstack/nova-metadata-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.045649 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971dce2f-fa89-4b8e-b09e-a0b44248d66b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"971dce2f-fa89-4b8e-b09e-a0b44248d66b\") " pod="openstack/nova-metadata-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.045700 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971dce2f-fa89-4b8e-b09e-a0b44248d66b-config-data\") pod \"nova-metadata-0\" (UID: \"971dce2f-fa89-4b8e-b09e-a0b44248d66b\") " pod="openstack/nova-metadata-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.046712 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/971dce2f-fa89-4b8e-b09e-a0b44248d66b-logs\") pod \"nova-metadata-0\" (UID: \"971dce2f-fa89-4b8e-b09e-a0b44248d66b\") " pod="openstack/nova-metadata-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.047441 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/971dce2f-fa89-4b8e-b09e-a0b44248d66b-logs\") pod \"nova-metadata-0\" (UID: \"971dce2f-fa89-4b8e-b09e-a0b44248d66b\") " pod="openstack/nova-metadata-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.047724 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.048976 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.055595 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.063299 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971dce2f-fa89-4b8e-b09e-a0b44248d66b-config-data\") pod \"nova-metadata-0\" (UID: \"971dce2f-fa89-4b8e-b09e-a0b44248d66b\") " pod="openstack/nova-metadata-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.065942 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971dce2f-fa89-4b8e-b09e-a0b44248d66b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"971dce2f-fa89-4b8e-b09e-a0b44248d66b\") " pod="openstack/nova-metadata-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.077276 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.085139 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x58f7\" (UniqueName: \"kubernetes.io/projected/971dce2f-fa89-4b8e-b09e-a0b44248d66b-kube-api-access-x58f7\") pod \"nova-metadata-0\" (UID: \"971dce2f-fa89-4b8e-b09e-a0b44248d66b\") " pod="openstack/nova-metadata-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.097543 4918 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78cd565959-mfkf8"] Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.100353 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-mfkf8" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.113128 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-mfkf8"] Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.153423 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd4b65c-9a93-42a1-a838-fde61211037f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4bd4b65c-9a93-42a1-a838-fde61211037f\") " pod="openstack/nova-scheduler-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.153463 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtjcj\" (UniqueName: \"kubernetes.io/projected/436a6713-1e4a-474f-80e8-793f725561da-kube-api-access-gtjcj\") pod \"dnsmasq-dns-78cd565959-mfkf8\" (UID: \"436a6713-1e4a-474f-80e8-793f725561da\") " pod="openstack/dnsmasq-dns-78cd565959-mfkf8" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.153555 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd4b65c-9a93-42a1-a838-fde61211037f-config-data\") pod \"nova-scheduler-0\" (UID: \"4bd4b65c-9a93-42a1-a838-fde61211037f\") " pod="openstack/nova-scheduler-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.153573 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4719eb1-a5e8-4e0e-a321-77cea020b1e0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4719eb1-a5e8-4e0e-a321-77cea020b1e0\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.153606 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-config\") pod \"dnsmasq-dns-78cd565959-mfkf8\" (UID: \"436a6713-1e4a-474f-80e8-793f725561da\") " pod="openstack/dnsmasq-dns-78cd565959-mfkf8" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.153649 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4719eb1-a5e8-4e0e-a321-77cea020b1e0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4719eb1-a5e8-4e0e-a321-77cea020b1e0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.153716 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-mfkf8\" (UID: \"436a6713-1e4a-474f-80e8-793f725561da\") " pod="openstack/dnsmasq-dns-78cd565959-mfkf8" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.153751 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-mfkf8\" (UID: \"436a6713-1e4a-474f-80e8-793f725561da\") " pod="openstack/dnsmasq-dns-78cd565959-mfkf8" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.153793 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-dns-svc\") pod \"dnsmasq-dns-78cd565959-mfkf8\" (UID: \"436a6713-1e4a-474f-80e8-793f725561da\") " 
pod="openstack/dnsmasq-dns-78cd565959-mfkf8" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.153836 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-mfkf8\" (UID: \"436a6713-1e4a-474f-80e8-793f725561da\") " pod="openstack/dnsmasq-dns-78cd565959-mfkf8" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.153888 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc95b\" (UniqueName: \"kubernetes.io/projected/b4719eb1-a5e8-4e0e-a321-77cea020b1e0-kube-api-access-cc95b\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4719eb1-a5e8-4e0e-a321-77cea020b1e0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.153938 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gfn2\" (UniqueName: \"kubernetes.io/projected/4bd4b65c-9a93-42a1-a838-fde61211037f-kube-api-access-8gfn2\") pod \"nova-scheduler-0\" (UID: \"4bd4b65c-9a93-42a1-a838-fde61211037f\") " pod="openstack/nova-scheduler-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.211978 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.212270 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.255917 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gfn2\" (UniqueName: \"kubernetes.io/projected/4bd4b65c-9a93-42a1-a838-fde61211037f-kube-api-access-8gfn2\") pod \"nova-scheduler-0\" (UID: \"4bd4b65c-9a93-42a1-a838-fde61211037f\") " pod="openstack/nova-scheduler-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.255998 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd4b65c-9a93-42a1-a838-fde61211037f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4bd4b65c-9a93-42a1-a838-fde61211037f\") " pod="openstack/nova-scheduler-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.256022 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtjcj\" (UniqueName: \"kubernetes.io/projected/436a6713-1e4a-474f-80e8-793f725561da-kube-api-access-gtjcj\") pod \"dnsmasq-dns-78cd565959-mfkf8\" (UID: \"436a6713-1e4a-474f-80e8-793f725561da\") " pod="openstack/dnsmasq-dns-78cd565959-mfkf8" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.256045 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd4b65c-9a93-42a1-a838-fde61211037f-config-data\") pod \"nova-scheduler-0\" (UID: \"4bd4b65c-9a93-42a1-a838-fde61211037f\") " pod="openstack/nova-scheduler-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.256061 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4719eb1-a5e8-4e0e-a321-77cea020b1e0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4719eb1-a5e8-4e0e-a321-77cea020b1e0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.256079 4918 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-config\") pod \"dnsmasq-dns-78cd565959-mfkf8\" (UID: \"436a6713-1e4a-474f-80e8-793f725561da\") " pod="openstack/dnsmasq-dns-78cd565959-mfkf8" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.256114 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4719eb1-a5e8-4e0e-a321-77cea020b1e0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4719eb1-a5e8-4e0e-a321-77cea020b1e0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.256155 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-mfkf8\" (UID: \"436a6713-1e4a-474f-80e8-793f725561da\") " pod="openstack/dnsmasq-dns-78cd565959-mfkf8" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.256169 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-mfkf8\" (UID: \"436a6713-1e4a-474f-80e8-793f725561da\") " pod="openstack/dnsmasq-dns-78cd565959-mfkf8" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.256206 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-dns-svc\") pod \"dnsmasq-dns-78cd565959-mfkf8\" (UID: \"436a6713-1e4a-474f-80e8-793f725561da\") " pod="openstack/dnsmasq-dns-78cd565959-mfkf8" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.256228 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-mfkf8\" (UID: \"436a6713-1e4a-474f-80e8-793f725561da\") " pod="openstack/dnsmasq-dns-78cd565959-mfkf8" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.256278 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc95b\" (UniqueName: \"kubernetes.io/projected/b4719eb1-a5e8-4e0e-a321-77cea020b1e0-kube-api-access-cc95b\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4719eb1-a5e8-4e0e-a321-77cea020b1e0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.259118 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-mfkf8\" (UID: \"436a6713-1e4a-474f-80e8-793f725561da\") " pod="openstack/dnsmasq-dns-78cd565959-mfkf8" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.259659 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-mfkf8\" (UID: \"436a6713-1e4a-474f-80e8-793f725561da\") " pod="openstack/dnsmasq-dns-78cd565959-mfkf8" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.260163 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-dns-svc\") pod \"dnsmasq-dns-78cd565959-mfkf8\" (UID: \"436a6713-1e4a-474f-80e8-793f725561da\") " pod="openstack/dnsmasq-dns-78cd565959-mfkf8" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.260680 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-mfkf8\" (UID: \"436a6713-1e4a-474f-80e8-793f725561da\") " pod="openstack/dnsmasq-dns-78cd565959-mfkf8" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.273105 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4719eb1-a5e8-4e0e-a321-77cea020b1e0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4719eb1-a5e8-4e0e-a321-77cea020b1e0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.278310 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc95b\" (UniqueName: \"kubernetes.io/projected/b4719eb1-a5e8-4e0e-a321-77cea020b1e0-kube-api-access-cc95b\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4719eb1-a5e8-4e0e-a321-77cea020b1e0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.279092 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gfn2\" (UniqueName: \"kubernetes.io/projected/4bd4b65c-9a93-42a1-a838-fde61211037f-kube-api-access-8gfn2\") pod \"nova-scheduler-0\" (UID: \"4bd4b65c-9a93-42a1-a838-fde61211037f\") " pod="openstack/nova-scheduler-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.280163 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd4b65c-9a93-42a1-a838-fde61211037f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4bd4b65c-9a93-42a1-a838-fde61211037f\") " pod="openstack/nova-scheduler-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.280915 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-config\") pod \"dnsmasq-dns-78cd565959-mfkf8\" (UID: 
\"436a6713-1e4a-474f-80e8-793f725561da\") " pod="openstack/dnsmasq-dns-78cd565959-mfkf8" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.288752 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd4b65c-9a93-42a1-a838-fde61211037f-config-data\") pod \"nova-scheduler-0\" (UID: \"4bd4b65c-9a93-42a1-a838-fde61211037f\") " pod="openstack/nova-scheduler-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.289307 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtjcj\" (UniqueName: \"kubernetes.io/projected/436a6713-1e4a-474f-80e8-793f725561da-kube-api-access-gtjcj\") pod \"dnsmasq-dns-78cd565959-mfkf8\" (UID: \"436a6713-1e4a-474f-80e8-793f725561da\") " pod="openstack/dnsmasq-dns-78cd565959-mfkf8" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.290055 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4719eb1-a5e8-4e0e-a321-77cea020b1e0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4719eb1-a5e8-4e0e-a321-77cea020b1e0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.364815 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.412006 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.452742 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.511973 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-mfkf8" Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.629883 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-nkvm2"] Mar 19 17:02:28 crc kubenswrapper[4918]: I0319 17:02:28.854312 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 17:02:29 crc kubenswrapper[4918]: I0319 17:02:29.308098 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 17:02:29 crc kubenswrapper[4918]: W0319 17:02:29.322872 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod971dce2f_fa89_4b8e_b09e_a0b44248d66b.slice/crio-24d0bca1608e9835469de0f57c35f3f78ece6cffca579e590903de8a8c8af612 WatchSource:0}: Error finding container 24d0bca1608e9835469de0f57c35f3f78ece6cffca579e590903de8a8c8af612: Status 404 returned error can't find the container with id 24d0bca1608e9835469de0f57c35f3f78ece6cffca579e590903de8a8c8af612 Mar 19 17:02:29 crc kubenswrapper[4918]: I0319 17:02:29.500638 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xjm4z"] Mar 19 17:02:29 crc kubenswrapper[4918]: I0319 17:02:29.502383 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xjm4z" Mar 19 17:02:29 crc kubenswrapper[4918]: I0319 17:02:29.508475 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 19 17:02:29 crc kubenswrapper[4918]: I0319 17:02:29.508684 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 19 17:02:29 crc kubenswrapper[4918]: I0319 17:02:29.524815 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xjm4z"] Mar 19 17:02:29 crc kubenswrapper[4918]: I0319 17:02:29.590495 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-nkvm2" event={"ID":"780adca3-416f-431d-915e-7a546cfeae43","Type":"ContainerStarted","Data":"7b11558f97432b5091ce795b44cd33d301bd98634d90250482b412f94899f255"} Mar 19 17:02:29 crc kubenswrapper[4918]: I0319 17:02:29.592070 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-nkvm2" event={"ID":"780adca3-416f-431d-915e-7a546cfeae43","Type":"ContainerStarted","Data":"fea50192a81c68448dca8d7fb78213af262af87ef514ca6000913ee7f1575926"} Mar 19 17:02:29 crc kubenswrapper[4918]: I0319 17:02:29.611843 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eea51a4c-c5eb-414d-a46b-cc704ff34914","Type":"ContainerStarted","Data":"9bb089b2474ddbe5dc51d1fa5beeaf5891a6ff83c69f82b6d3eea38f4b722428"} Mar 19 17:02:29 crc kubenswrapper[4918]: I0319 17:02:29.614102 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x8tj\" (UniqueName: \"kubernetes.io/projected/c4db6685-8155-4a95-af9f-3292270736d8-kube-api-access-2x8tj\") pod \"nova-cell1-conductor-db-sync-xjm4z\" (UID: \"c4db6685-8155-4a95-af9f-3292270736d8\") " pod="openstack/nova-cell1-conductor-db-sync-xjm4z" Mar 19 17:02:29 crc kubenswrapper[4918]: 
I0319 17:02:29.614232 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4db6685-8155-4a95-af9f-3292270736d8-config-data\") pod \"nova-cell1-conductor-db-sync-xjm4z\" (UID: \"c4db6685-8155-4a95-af9f-3292270736d8\") " pod="openstack/nova-cell1-conductor-db-sync-xjm4z" Mar 19 17:02:29 crc kubenswrapper[4918]: I0319 17:02:29.614347 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4db6685-8155-4a95-af9f-3292270736d8-scripts\") pod \"nova-cell1-conductor-db-sync-xjm4z\" (UID: \"c4db6685-8155-4a95-af9f-3292270736d8\") " pod="openstack/nova-cell1-conductor-db-sync-xjm4z" Mar 19 17:02:29 crc kubenswrapper[4918]: I0319 17:02:29.615406 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4db6685-8155-4a95-af9f-3292270736d8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xjm4z\" (UID: \"c4db6685-8155-4a95-af9f-3292270736d8\") " pod="openstack/nova-cell1-conductor-db-sync-xjm4z" Mar 19 17:02:29 crc kubenswrapper[4918]: I0319 17:02:29.617743 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"355bfabe-8824-4ee1-9bf5-8c808a15fa70","Type":"ContainerStarted","Data":"a13c72570122bbc97a9092eaa5200338f30c37b9c161c6bd8ed94e2986bf37bd"} Mar 19 17:02:29 crc kubenswrapper[4918]: I0319 17:02:29.618339 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 17:02:29 crc kubenswrapper[4918]: I0319 17:02:29.632889 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-nkvm2" podStartSLOduration=2.632869582 podStartE2EDuration="2.632869582s" podCreationTimestamp="2026-03-19 17:02:27 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:02:29.612138296 +0000 UTC m=+1361.734337534" watchObservedRunningTime="2026-03-19 17:02:29.632869582 +0000 UTC m=+1361.755068830" Mar 19 17:02:29 crc kubenswrapper[4918]: I0319 17:02:29.636007 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"971dce2f-fa89-4b8e-b09e-a0b44248d66b","Type":"ContainerStarted","Data":"24d0bca1608e9835469de0f57c35f3f78ece6cffca579e590903de8a8c8af612"} Mar 19 17:02:29 crc kubenswrapper[4918]: I0319 17:02:29.720621 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4db6685-8155-4a95-af9f-3292270736d8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xjm4z\" (UID: \"c4db6685-8155-4a95-af9f-3292270736d8\") " pod="openstack/nova-cell1-conductor-db-sync-xjm4z" Mar 19 17:02:29 crc kubenswrapper[4918]: I0319 17:02:29.720954 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x8tj\" (UniqueName: \"kubernetes.io/projected/c4db6685-8155-4a95-af9f-3292270736d8-kube-api-access-2x8tj\") pod \"nova-cell1-conductor-db-sync-xjm4z\" (UID: \"c4db6685-8155-4a95-af9f-3292270736d8\") " pod="openstack/nova-cell1-conductor-db-sync-xjm4z" Mar 19 17:02:29 crc kubenswrapper[4918]: I0319 17:02:29.721078 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4db6685-8155-4a95-af9f-3292270736d8-config-data\") pod \"nova-cell1-conductor-db-sync-xjm4z\" (UID: \"c4db6685-8155-4a95-af9f-3292270736d8\") " pod="openstack/nova-cell1-conductor-db-sync-xjm4z" Mar 19 17:02:29 crc kubenswrapper[4918]: I0319 17:02:29.721215 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4db6685-8155-4a95-af9f-3292270736d8-scripts\") pod 
\"nova-cell1-conductor-db-sync-xjm4z\" (UID: \"c4db6685-8155-4a95-af9f-3292270736d8\") " pod="openstack/nova-cell1-conductor-db-sync-xjm4z" Mar 19 17:02:29 crc kubenswrapper[4918]: I0319 17:02:29.729542 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.486452022 podStartE2EDuration="7.729518392s" podCreationTimestamp="2026-03-19 17:02:22 +0000 UTC" firstStartedPulling="2026-03-19 17:02:23.586812785 +0000 UTC m=+1355.709012023" lastFinishedPulling="2026-03-19 17:02:28.829879145 +0000 UTC m=+1360.952078393" observedRunningTime="2026-03-19 17:02:29.662408159 +0000 UTC m=+1361.784607407" watchObservedRunningTime="2026-03-19 17:02:29.729518392 +0000 UTC m=+1361.851717640" Mar 19 17:02:29 crc kubenswrapper[4918]: I0319 17:02:29.743203 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4db6685-8155-4a95-af9f-3292270736d8-scripts\") pod \"nova-cell1-conductor-db-sync-xjm4z\" (UID: \"c4db6685-8155-4a95-af9f-3292270736d8\") " pod="openstack/nova-cell1-conductor-db-sync-xjm4z" Mar 19 17:02:29 crc kubenswrapper[4918]: I0319 17:02:29.746739 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4db6685-8155-4a95-af9f-3292270736d8-config-data\") pod \"nova-cell1-conductor-db-sync-xjm4z\" (UID: \"c4db6685-8155-4a95-af9f-3292270736d8\") " pod="openstack/nova-cell1-conductor-db-sync-xjm4z" Mar 19 17:02:29 crc kubenswrapper[4918]: I0319 17:02:29.762782 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4db6685-8155-4a95-af9f-3292270736d8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xjm4z\" (UID: \"c4db6685-8155-4a95-af9f-3292270736d8\") " pod="openstack/nova-cell1-conductor-db-sync-xjm4z" Mar 19 17:02:29 crc kubenswrapper[4918]: I0319 17:02:29.763231 4918 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x8tj\" (UniqueName: \"kubernetes.io/projected/c4db6685-8155-4a95-af9f-3292270736d8-kube-api-access-2x8tj\") pod \"nova-cell1-conductor-db-sync-xjm4z\" (UID: \"c4db6685-8155-4a95-af9f-3292270736d8\") " pod="openstack/nova-cell1-conductor-db-sync-xjm4z" Mar 19 17:02:29 crc kubenswrapper[4918]: I0319 17:02:29.794980 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 17:02:29 crc kubenswrapper[4918]: I0319 17:02:29.812675 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 17:02:29 crc kubenswrapper[4918]: I0319 17:02:29.839167 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xjm4z" Mar 19 17:02:30 crc kubenswrapper[4918]: I0319 17:02:30.051752 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-mfkf8"] Mar 19 17:02:30 crc kubenswrapper[4918]: I0319 17:02:30.492131 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xjm4z"] Mar 19 17:02:30 crc kubenswrapper[4918]: I0319 17:02:30.659483 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4bd4b65c-9a93-42a1-a838-fde61211037f","Type":"ContainerStarted","Data":"8b2fd9d4358eeffcd283c37b38eb5a32834f946af37c24a3259f4125a7cfaadc"} Mar 19 17:02:30 crc kubenswrapper[4918]: I0319 17:02:30.662004 4918 generic.go:334] "Generic (PLEG): container finished" podID="436a6713-1e4a-474f-80e8-793f725561da" containerID="9e5ff868ec777b6f5031bee3735d3ce46037cddcb23729ae2221942d8ccbba37" exitCode=0 Mar 19 17:02:30 crc kubenswrapper[4918]: I0319 17:02:30.662099 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-mfkf8" 
event={"ID":"436a6713-1e4a-474f-80e8-793f725561da","Type":"ContainerDied","Data":"9e5ff868ec777b6f5031bee3735d3ce46037cddcb23729ae2221942d8ccbba37"} Mar 19 17:02:30 crc kubenswrapper[4918]: I0319 17:02:30.662149 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-mfkf8" event={"ID":"436a6713-1e4a-474f-80e8-793f725561da","Type":"ContainerStarted","Data":"e8da49ea1e8ef48a248b25f4098e21909075bc5bf67e669a10c47fd01b3355c4"} Mar 19 17:02:30 crc kubenswrapper[4918]: I0319 17:02:30.668859 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xjm4z" event={"ID":"c4db6685-8155-4a95-af9f-3292270736d8","Type":"ContainerStarted","Data":"77fa06fbfbb5cd3023f48357c10c957c6a0b97bac709832d7e2ef9beaf275339"} Mar 19 17:02:30 crc kubenswrapper[4918]: I0319 17:02:30.674365 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b4719eb1-a5e8-4e0e-a321-77cea020b1e0","Type":"ContainerStarted","Data":"f98d109253c1ba79820e8190a2914000e0d58d279c18fd41a609b92f495e6876"} Mar 19 17:02:31 crc kubenswrapper[4918]: I0319 17:02:31.699789 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-mfkf8" event={"ID":"436a6713-1e4a-474f-80e8-793f725561da","Type":"ContainerStarted","Data":"c37c634d9828512c0eb66601822616af02d77d6cc36a2c1e1f94024918a2cd00"} Mar 19 17:02:31 crc kubenswrapper[4918]: I0319 17:02:31.700235 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78cd565959-mfkf8" Mar 19 17:02:31 crc kubenswrapper[4918]: I0319 17:02:31.712395 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xjm4z" event={"ID":"c4db6685-8155-4a95-af9f-3292270736d8","Type":"ContainerStarted","Data":"9b9d34fa71438f864c8513534831a69723212935ac49c153660d3e267d8c5b60"} Mar 19 17:02:31 crc kubenswrapper[4918]: I0319 17:02:31.724520 4918 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78cd565959-mfkf8" podStartSLOduration=4.724503137 podStartE2EDuration="4.724503137s" podCreationTimestamp="2026-03-19 17:02:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:02:31.718540443 +0000 UTC m=+1363.840739701" watchObservedRunningTime="2026-03-19 17:02:31.724503137 +0000 UTC m=+1363.846702385" Mar 19 17:02:31 crc kubenswrapper[4918]: I0319 17:02:31.755813 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-xjm4z" podStartSLOduration=2.755794081 podStartE2EDuration="2.755794081s" podCreationTimestamp="2026-03-19 17:02:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:02:31.751321809 +0000 UTC m=+1363.873521057" watchObservedRunningTime="2026-03-19 17:02:31.755794081 +0000 UTC m=+1363.877993329" Mar 19 17:02:31 crc kubenswrapper[4918]: I0319 17:02:31.826427 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 17:02:31 crc kubenswrapper[4918]: I0319 17:02:31.848182 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 17:02:35 crc kubenswrapper[4918]: I0319 17:02:35.750647 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b4719eb1-a5e8-4e0e-a321-77cea020b1e0","Type":"ContainerStarted","Data":"e14e932b8a78564d08d06cb35443b0030c8fe1b01cbe2edc73b244766fee9b39"} Mar 19 17:02:35 crc kubenswrapper[4918]: I0319 17:02:35.750801 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b4719eb1-a5e8-4e0e-a321-77cea020b1e0" containerName="nova-cell1-novncproxy-novncproxy" 
containerID="cri-o://e14e932b8a78564d08d06cb35443b0030c8fe1b01cbe2edc73b244766fee9b39" gracePeriod=30 Mar 19 17:02:35 crc kubenswrapper[4918]: I0319 17:02:35.756484 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eea51a4c-c5eb-414d-a46b-cc704ff34914","Type":"ContainerStarted","Data":"678155dca62f3af94cd47aa29260a2c250c8de02f4635f4eb3dd53f1edc5dc99"} Mar 19 17:02:35 crc kubenswrapper[4918]: I0319 17:02:35.756554 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eea51a4c-c5eb-414d-a46b-cc704ff34914","Type":"ContainerStarted","Data":"e4b1deb5f9ec17d88f8faeca43bebdc0f63a5c169dffa055025e565417423a6e"} Mar 19 17:02:35 crc kubenswrapper[4918]: I0319 17:02:35.759085 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4bd4b65c-9a93-42a1-a838-fde61211037f","Type":"ContainerStarted","Data":"1eacb64c055a23176e5f6fca80a7d73daed2863297d92209894f71ae890aebb4"} Mar 19 17:02:35 crc kubenswrapper[4918]: I0319 17:02:35.762370 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"971dce2f-fa89-4b8e-b09e-a0b44248d66b","Type":"ContainerStarted","Data":"d3a32b683fbfce12a7b89cd6dbe30e2e35aa87ea7d50cdf58439a2872ed30cbc"} Mar 19 17:02:35 crc kubenswrapper[4918]: I0319 17:02:35.762420 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"971dce2f-fa89-4b8e-b09e-a0b44248d66b","Type":"ContainerStarted","Data":"aa09541cb9cb1224ea203b852b5ab78eac1967c3e6cbd89133614ebfdaa6bb84"} Mar 19 17:02:35 crc kubenswrapper[4918]: I0319 17:02:35.762564 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="971dce2f-fa89-4b8e-b09e-a0b44248d66b" containerName="nova-metadata-log" containerID="cri-o://aa09541cb9cb1224ea203b852b5ab78eac1967c3e6cbd89133614ebfdaa6bb84" gracePeriod=30 Mar 19 17:02:35 crc kubenswrapper[4918]: 
I0319 17:02:35.762614 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="971dce2f-fa89-4b8e-b09e-a0b44248d66b" containerName="nova-metadata-metadata" containerID="cri-o://d3a32b683fbfce12a7b89cd6dbe30e2e35aa87ea7d50cdf58439a2872ed30cbc" gracePeriod=30 Mar 19 17:02:35 crc kubenswrapper[4918]: I0319 17:02:35.774098 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=4.042297172 podStartE2EDuration="8.774071305s" podCreationTimestamp="2026-03-19 17:02:27 +0000 UTC" firstStartedPulling="2026-03-19 17:02:29.720940588 +0000 UTC m=+1361.843139836" lastFinishedPulling="2026-03-19 17:02:34.452714721 +0000 UTC m=+1366.574913969" observedRunningTime="2026-03-19 17:02:35.769349415 +0000 UTC m=+1367.891548673" watchObservedRunningTime="2026-03-19 17:02:35.774071305 +0000 UTC m=+1367.896270563" Mar 19 17:02:35 crc kubenswrapper[4918]: I0319 17:02:35.795768 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=4.081697028 podStartE2EDuration="8.795746156s" podCreationTimestamp="2026-03-19 17:02:27 +0000 UTC" firstStartedPulling="2026-03-19 17:02:29.738694433 +0000 UTC m=+1361.860893671" lastFinishedPulling="2026-03-19 17:02:34.452743551 +0000 UTC m=+1366.574942799" observedRunningTime="2026-03-19 17:02:35.791553682 +0000 UTC m=+1367.913752940" watchObservedRunningTime="2026-03-19 17:02:35.795746156 +0000 UTC m=+1367.917945404" Mar 19 17:02:35 crc kubenswrapper[4918]: I0319 17:02:35.817643 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.27549352 podStartE2EDuration="8.817621354s" podCreationTimestamp="2026-03-19 17:02:27 +0000 UTC" firstStartedPulling="2026-03-19 17:02:28.908959332 +0000 UTC m=+1361.031158580" lastFinishedPulling="2026-03-19 17:02:34.451087166 +0000 UTC m=+1366.573286414" 
observedRunningTime="2026-03-19 17:02:35.808904376 +0000 UTC m=+1367.931103624" watchObservedRunningTime="2026-03-19 17:02:35.817621354 +0000 UTC m=+1367.939820622" Mar 19 17:02:35 crc kubenswrapper[4918]: I0319 17:02:35.839269 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.713584433 podStartE2EDuration="8.839243424s" podCreationTimestamp="2026-03-19 17:02:27 +0000 UTC" firstStartedPulling="2026-03-19 17:02:29.325918178 +0000 UTC m=+1361.448117416" lastFinishedPulling="2026-03-19 17:02:34.451577169 +0000 UTC m=+1366.573776407" observedRunningTime="2026-03-19 17:02:35.828417909 +0000 UTC m=+1367.950617167" watchObservedRunningTime="2026-03-19 17:02:35.839243424 +0000 UTC m=+1367.961442682" Mar 19 17:02:36 crc kubenswrapper[4918]: I0319 17:02:36.746404 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 17:02:36 crc kubenswrapper[4918]: I0319 17:02:36.771851 4918 generic.go:334] "Generic (PLEG): container finished" podID="971dce2f-fa89-4b8e-b09e-a0b44248d66b" containerID="d3a32b683fbfce12a7b89cd6dbe30e2e35aa87ea7d50cdf58439a2872ed30cbc" exitCode=0 Mar 19 17:02:36 crc kubenswrapper[4918]: I0319 17:02:36.771881 4918 generic.go:334] "Generic (PLEG): container finished" podID="971dce2f-fa89-4b8e-b09e-a0b44248d66b" containerID="aa09541cb9cb1224ea203b852b5ab78eac1967c3e6cbd89133614ebfdaa6bb84" exitCode=143 Mar 19 17:02:36 crc kubenswrapper[4918]: I0319 17:02:36.771914 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 17:02:36 crc kubenswrapper[4918]: I0319 17:02:36.771918 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"971dce2f-fa89-4b8e-b09e-a0b44248d66b","Type":"ContainerDied","Data":"d3a32b683fbfce12a7b89cd6dbe30e2e35aa87ea7d50cdf58439a2872ed30cbc"} Mar 19 17:02:36 crc kubenswrapper[4918]: I0319 17:02:36.771980 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"971dce2f-fa89-4b8e-b09e-a0b44248d66b","Type":"ContainerDied","Data":"aa09541cb9cb1224ea203b852b5ab78eac1967c3e6cbd89133614ebfdaa6bb84"} Mar 19 17:02:36 crc kubenswrapper[4918]: I0319 17:02:36.772000 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"971dce2f-fa89-4b8e-b09e-a0b44248d66b","Type":"ContainerDied","Data":"24d0bca1608e9835469de0f57c35f3f78ece6cffca579e590903de8a8c8af612"} Mar 19 17:02:36 crc kubenswrapper[4918]: I0319 17:02:36.772024 4918 scope.go:117] "RemoveContainer" containerID="d3a32b683fbfce12a7b89cd6dbe30e2e35aa87ea7d50cdf58439a2872ed30cbc" Mar 19 17:02:36 crc kubenswrapper[4918]: I0319 17:02:36.788331 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971dce2f-fa89-4b8e-b09e-a0b44248d66b-combined-ca-bundle\") pod \"971dce2f-fa89-4b8e-b09e-a0b44248d66b\" (UID: \"971dce2f-fa89-4b8e-b09e-a0b44248d66b\") " Mar 19 17:02:36 crc kubenswrapper[4918]: I0319 17:02:36.788407 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971dce2f-fa89-4b8e-b09e-a0b44248d66b-config-data\") pod \"971dce2f-fa89-4b8e-b09e-a0b44248d66b\" (UID: \"971dce2f-fa89-4b8e-b09e-a0b44248d66b\") " Mar 19 17:02:36 crc kubenswrapper[4918]: I0319 17:02:36.788573 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-x58f7\" (UniqueName: \"kubernetes.io/projected/971dce2f-fa89-4b8e-b09e-a0b44248d66b-kube-api-access-x58f7\") pod \"971dce2f-fa89-4b8e-b09e-a0b44248d66b\" (UID: \"971dce2f-fa89-4b8e-b09e-a0b44248d66b\") " Mar 19 17:02:36 crc kubenswrapper[4918]: I0319 17:02:36.788652 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/971dce2f-fa89-4b8e-b09e-a0b44248d66b-logs\") pod \"971dce2f-fa89-4b8e-b09e-a0b44248d66b\" (UID: \"971dce2f-fa89-4b8e-b09e-a0b44248d66b\") " Mar 19 17:02:36 crc kubenswrapper[4918]: I0319 17:02:36.789103 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/971dce2f-fa89-4b8e-b09e-a0b44248d66b-logs" (OuterVolumeSpecName: "logs") pod "971dce2f-fa89-4b8e-b09e-a0b44248d66b" (UID: "971dce2f-fa89-4b8e-b09e-a0b44248d66b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:02:36 crc kubenswrapper[4918]: I0319 17:02:36.789379 4918 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/971dce2f-fa89-4b8e-b09e-a0b44248d66b-logs\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:36 crc kubenswrapper[4918]: I0319 17:02:36.794113 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/971dce2f-fa89-4b8e-b09e-a0b44248d66b-kube-api-access-x58f7" (OuterVolumeSpecName: "kube-api-access-x58f7") pod "971dce2f-fa89-4b8e-b09e-a0b44248d66b" (UID: "971dce2f-fa89-4b8e-b09e-a0b44248d66b"). InnerVolumeSpecName "kube-api-access-x58f7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:02:36 crc kubenswrapper[4918]: I0319 17:02:36.795186 4918 scope.go:117] "RemoveContainer" containerID="aa09541cb9cb1224ea203b852b5ab78eac1967c3e6cbd89133614ebfdaa6bb84" Mar 19 17:02:36 crc kubenswrapper[4918]: I0319 17:02:36.829212 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/971dce2f-fa89-4b8e-b09e-a0b44248d66b-config-data" (OuterVolumeSpecName: "config-data") pod "971dce2f-fa89-4b8e-b09e-a0b44248d66b" (UID: "971dce2f-fa89-4b8e-b09e-a0b44248d66b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:02:36 crc kubenswrapper[4918]: I0319 17:02:36.890996 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971dce2f-fa89-4b8e-b09e-a0b44248d66b-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:36 crc kubenswrapper[4918]: I0319 17:02:36.891027 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x58f7\" (UniqueName: \"kubernetes.io/projected/971dce2f-fa89-4b8e-b09e-a0b44248d66b-kube-api-access-x58f7\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:36 crc kubenswrapper[4918]: I0319 17:02:36.891743 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/971dce2f-fa89-4b8e-b09e-a0b44248d66b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "971dce2f-fa89-4b8e-b09e-a0b44248d66b" (UID: "971dce2f-fa89-4b8e-b09e-a0b44248d66b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:02:36 crc kubenswrapper[4918]: I0319 17:02:36.992844 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971dce2f-fa89-4b8e-b09e-a0b44248d66b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.021639 4918 scope.go:117] "RemoveContainer" containerID="d3a32b683fbfce12a7b89cd6dbe30e2e35aa87ea7d50cdf58439a2872ed30cbc" Mar 19 17:02:37 crc kubenswrapper[4918]: E0319 17:02:37.022005 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3a32b683fbfce12a7b89cd6dbe30e2e35aa87ea7d50cdf58439a2872ed30cbc\": container with ID starting with d3a32b683fbfce12a7b89cd6dbe30e2e35aa87ea7d50cdf58439a2872ed30cbc not found: ID does not exist" containerID="d3a32b683fbfce12a7b89cd6dbe30e2e35aa87ea7d50cdf58439a2872ed30cbc" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.022035 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3a32b683fbfce12a7b89cd6dbe30e2e35aa87ea7d50cdf58439a2872ed30cbc"} err="failed to get container status \"d3a32b683fbfce12a7b89cd6dbe30e2e35aa87ea7d50cdf58439a2872ed30cbc\": rpc error: code = NotFound desc = could not find container \"d3a32b683fbfce12a7b89cd6dbe30e2e35aa87ea7d50cdf58439a2872ed30cbc\": container with ID starting with d3a32b683fbfce12a7b89cd6dbe30e2e35aa87ea7d50cdf58439a2872ed30cbc not found: ID does not exist" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.022055 4918 scope.go:117] "RemoveContainer" containerID="aa09541cb9cb1224ea203b852b5ab78eac1967c3e6cbd89133614ebfdaa6bb84" Mar 19 17:02:37 crc kubenswrapper[4918]: E0319 17:02:37.022384 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"aa09541cb9cb1224ea203b852b5ab78eac1967c3e6cbd89133614ebfdaa6bb84\": container with ID starting with aa09541cb9cb1224ea203b852b5ab78eac1967c3e6cbd89133614ebfdaa6bb84 not found: ID does not exist" containerID="aa09541cb9cb1224ea203b852b5ab78eac1967c3e6cbd89133614ebfdaa6bb84" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.022405 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa09541cb9cb1224ea203b852b5ab78eac1967c3e6cbd89133614ebfdaa6bb84"} err="failed to get container status \"aa09541cb9cb1224ea203b852b5ab78eac1967c3e6cbd89133614ebfdaa6bb84\": rpc error: code = NotFound desc = could not find container \"aa09541cb9cb1224ea203b852b5ab78eac1967c3e6cbd89133614ebfdaa6bb84\": container with ID starting with aa09541cb9cb1224ea203b852b5ab78eac1967c3e6cbd89133614ebfdaa6bb84 not found: ID does not exist" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.022418 4918 scope.go:117] "RemoveContainer" containerID="d3a32b683fbfce12a7b89cd6dbe30e2e35aa87ea7d50cdf58439a2872ed30cbc" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.022656 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3a32b683fbfce12a7b89cd6dbe30e2e35aa87ea7d50cdf58439a2872ed30cbc"} err="failed to get container status \"d3a32b683fbfce12a7b89cd6dbe30e2e35aa87ea7d50cdf58439a2872ed30cbc\": rpc error: code = NotFound desc = could not find container \"d3a32b683fbfce12a7b89cd6dbe30e2e35aa87ea7d50cdf58439a2872ed30cbc\": container with ID starting with d3a32b683fbfce12a7b89cd6dbe30e2e35aa87ea7d50cdf58439a2872ed30cbc not found: ID does not exist" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.022683 4918 scope.go:117] "RemoveContainer" containerID="aa09541cb9cb1224ea203b852b5ab78eac1967c3e6cbd89133614ebfdaa6bb84" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.022994 4918 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"aa09541cb9cb1224ea203b852b5ab78eac1967c3e6cbd89133614ebfdaa6bb84"} err="failed to get container status \"aa09541cb9cb1224ea203b852b5ab78eac1967c3e6cbd89133614ebfdaa6bb84\": rpc error: code = NotFound desc = could not find container \"aa09541cb9cb1224ea203b852b5ab78eac1967c3e6cbd89133614ebfdaa6bb84\": container with ID starting with aa09541cb9cb1224ea203b852b5ab78eac1967c3e6cbd89133614ebfdaa6bb84 not found: ID does not exist" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.113503 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.126405 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.139043 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 19 17:02:37 crc kubenswrapper[4918]: E0319 17:02:37.139565 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="971dce2f-fa89-4b8e-b09e-a0b44248d66b" containerName="nova-metadata-log" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.139589 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="971dce2f-fa89-4b8e-b09e-a0b44248d66b" containerName="nova-metadata-log" Mar 19 17:02:37 crc kubenswrapper[4918]: E0319 17:02:37.139622 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="971dce2f-fa89-4b8e-b09e-a0b44248d66b" containerName="nova-metadata-metadata" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.139630 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="971dce2f-fa89-4b8e-b09e-a0b44248d66b" containerName="nova-metadata-metadata" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.139870 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="971dce2f-fa89-4b8e-b09e-a0b44248d66b" containerName="nova-metadata-log" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.139888 4918 
memory_manager.go:354] "RemoveStaleState removing state" podUID="971dce2f-fa89-4b8e-b09e-a0b44248d66b" containerName="nova-metadata-metadata" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.141304 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.143402 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.145862 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.157087 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.196649 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e4e01fd-260f-43c4-bbac-2a3c0d327070-config-data\") pod \"nova-metadata-0\" (UID: \"7e4e01fd-260f-43c4-bbac-2a3c0d327070\") " pod="openstack/nova-metadata-0" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.196716 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e4e01fd-260f-43c4-bbac-2a3c0d327070-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7e4e01fd-260f-43c4-bbac-2a3c0d327070\") " pod="openstack/nova-metadata-0" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.196848 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e4e01fd-260f-43c4-bbac-2a3c0d327070-logs\") pod \"nova-metadata-0\" (UID: \"7e4e01fd-260f-43c4-bbac-2a3c0d327070\") " pod="openstack/nova-metadata-0" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 
17:02:37.196873 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqzld\" (UniqueName: \"kubernetes.io/projected/7e4e01fd-260f-43c4-bbac-2a3c0d327070-kube-api-access-gqzld\") pod \"nova-metadata-0\" (UID: \"7e4e01fd-260f-43c4-bbac-2a3c0d327070\") " pod="openstack/nova-metadata-0" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.196909 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e4e01fd-260f-43c4-bbac-2a3c0d327070-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7e4e01fd-260f-43c4-bbac-2a3c0d327070\") " pod="openstack/nova-metadata-0" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.298738 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e4e01fd-260f-43c4-bbac-2a3c0d327070-config-data\") pod \"nova-metadata-0\" (UID: \"7e4e01fd-260f-43c4-bbac-2a3c0d327070\") " pod="openstack/nova-metadata-0" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.298788 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e4e01fd-260f-43c4-bbac-2a3c0d327070-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7e4e01fd-260f-43c4-bbac-2a3c0d327070\") " pod="openstack/nova-metadata-0" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.298893 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e4e01fd-260f-43c4-bbac-2a3c0d327070-logs\") pod \"nova-metadata-0\" (UID: \"7e4e01fd-260f-43c4-bbac-2a3c0d327070\") " pod="openstack/nova-metadata-0" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.298913 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqzld\" (UniqueName: 
\"kubernetes.io/projected/7e4e01fd-260f-43c4-bbac-2a3c0d327070-kube-api-access-gqzld\") pod \"nova-metadata-0\" (UID: \"7e4e01fd-260f-43c4-bbac-2a3c0d327070\") " pod="openstack/nova-metadata-0" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.298937 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e4e01fd-260f-43c4-bbac-2a3c0d327070-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7e4e01fd-260f-43c4-bbac-2a3c0d327070\") " pod="openstack/nova-metadata-0" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.300223 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e4e01fd-260f-43c4-bbac-2a3c0d327070-logs\") pod \"nova-metadata-0\" (UID: \"7e4e01fd-260f-43c4-bbac-2a3c0d327070\") " pod="openstack/nova-metadata-0" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.304899 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e4e01fd-260f-43c4-bbac-2a3c0d327070-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7e4e01fd-260f-43c4-bbac-2a3c0d327070\") " pod="openstack/nova-metadata-0" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.304946 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e4e01fd-260f-43c4-bbac-2a3c0d327070-config-data\") pod \"nova-metadata-0\" (UID: \"7e4e01fd-260f-43c4-bbac-2a3c0d327070\") " pod="openstack/nova-metadata-0" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.305225 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e4e01fd-260f-43c4-bbac-2a3c0d327070-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7e4e01fd-260f-43c4-bbac-2a3c0d327070\") " pod="openstack/nova-metadata-0" Mar 19 17:02:37 crc 
kubenswrapper[4918]: I0319 17:02:37.316953 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqzld\" (UniqueName: \"kubernetes.io/projected/7e4e01fd-260f-43c4-bbac-2a3c0d327070-kube-api-access-gqzld\") pod \"nova-metadata-0\" (UID: \"7e4e01fd-260f-43c4-bbac-2a3c0d327070\") " pod="openstack/nova-metadata-0" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.457317 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 17:02:37 crc kubenswrapper[4918]: I0319 17:02:37.972995 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 17:02:38 crc kubenswrapper[4918]: I0319 17:02:38.032175 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 17:02:38 crc kubenswrapper[4918]: I0319 17:02:38.032474 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 17:02:38 crc kubenswrapper[4918]: I0319 17:02:38.413852 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 19 17:02:38 crc kubenswrapper[4918]: I0319 17:02:38.414173 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 19 17:02:38 crc kubenswrapper[4918]: I0319 17:02:38.453617 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:02:38 crc kubenswrapper[4918]: I0319 17:02:38.459209 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 19 17:02:38 crc kubenswrapper[4918]: I0319 17:02:38.514633 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78cd565959-mfkf8" Mar 19 17:02:38 crc kubenswrapper[4918]: I0319 17:02:38.605967 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="971dce2f-fa89-4b8e-b09e-a0b44248d66b" path="/var/lib/kubelet/pods/971dce2f-fa89-4b8e-b09e-a0b44248d66b/volumes" Mar 19 17:02:38 crc kubenswrapper[4918]: I0319 17:02:38.606660 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-l85q9"] Mar 19 17:02:38 crc kubenswrapper[4918]: I0319 17:02:38.606926 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67bdc55879-l85q9" podUID="04163635-8e0b-4bf7-abf5-3504d0e391a8" containerName="dnsmasq-dns" containerID="cri-o://4c45522f586492cd2568e2d7c8ce8793fed76539da03d3301382bcf2a5feb13e" gracePeriod=10 Mar 19 17:02:38 crc kubenswrapper[4918]: I0319 17:02:38.797994 4918 generic.go:334] "Generic (PLEG): container finished" podID="04163635-8e0b-4bf7-abf5-3504d0e391a8" containerID="4c45522f586492cd2568e2d7c8ce8793fed76539da03d3301382bcf2a5feb13e" exitCode=0 Mar 19 17:02:38 crc kubenswrapper[4918]: I0319 17:02:38.798049 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-l85q9" event={"ID":"04163635-8e0b-4bf7-abf5-3504d0e391a8","Type":"ContainerDied","Data":"4c45522f586492cd2568e2d7c8ce8793fed76539da03d3301382bcf2a5feb13e"} Mar 19 17:02:38 crc kubenswrapper[4918]: I0319 17:02:38.800449 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7e4e01fd-260f-43c4-bbac-2a3c0d327070","Type":"ContainerStarted","Data":"673a6dc80e67c739f7500e9491f6fb7d52f709175c55bc20908132441374cf43"} Mar 19 17:02:38 crc kubenswrapper[4918]: I0319 17:02:38.800501 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7e4e01fd-260f-43c4-bbac-2a3c0d327070","Type":"ContainerStarted","Data":"515c7765c77bdaf82385e5a44b02bdef063c98562ab2d54635831c5a862d2327"} Mar 19 17:02:38 crc kubenswrapper[4918]: I0319 17:02:38.800513 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"7e4e01fd-260f-43c4-bbac-2a3c0d327070","Type":"ContainerStarted","Data":"1a0c47411fb2e5fb9b7b57c83c49957bcabbe9c153fb56894bdd969548f00c68"} Mar 19 17:02:38 crc kubenswrapper[4918]: I0319 17:02:38.810596 4918 generic.go:334] "Generic (PLEG): container finished" podID="780adca3-416f-431d-915e-7a546cfeae43" containerID="7b11558f97432b5091ce795b44cd33d301bd98634d90250482b412f94899f255" exitCode=0 Mar 19 17:02:38 crc kubenswrapper[4918]: I0319 17:02:38.810867 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-nkvm2" event={"ID":"780adca3-416f-431d-915e-7a546cfeae43","Type":"ContainerDied","Data":"7b11558f97432b5091ce795b44cd33d301bd98634d90250482b412f94899f255"} Mar 19 17:02:38 crc kubenswrapper[4918]: I0319 17:02:38.843356 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.843331814 podStartE2EDuration="1.843331814s" podCreationTimestamp="2026-03-19 17:02:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:02:38.830596366 +0000 UTC m=+1370.952795614" watchObservedRunningTime="2026-03-19 17:02:38.843331814 +0000 UTC m=+1370.965531062" Mar 19 17:02:38 crc kubenswrapper[4918]: I0319 17:02:38.857061 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 19 17:02:39 crc kubenswrapper[4918]: I0319 17:02:39.072708 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="eea51a4c-c5eb-414d-a46b-cc704ff34914" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.219:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 17:02:39 crc kubenswrapper[4918]: I0319 17:02:39.114658 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="eea51a4c-c5eb-414d-a46b-cc704ff34914" 
containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.219:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 17:02:39 crc kubenswrapper[4918]: I0319 17:02:39.459156 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-l85q9" Mar 19 17:02:39 crc kubenswrapper[4918]: I0319 17:02:39.555664 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-ovsdbserver-nb\") pod \"04163635-8e0b-4bf7-abf5-3504d0e391a8\" (UID: \"04163635-8e0b-4bf7-abf5-3504d0e391a8\") " Mar 19 17:02:39 crc kubenswrapper[4918]: I0319 17:02:39.555723 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-ovsdbserver-sb\") pod \"04163635-8e0b-4bf7-abf5-3504d0e391a8\" (UID: \"04163635-8e0b-4bf7-abf5-3504d0e391a8\") " Mar 19 17:02:39 crc kubenswrapper[4918]: I0319 17:02:39.555821 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-dns-svc\") pod \"04163635-8e0b-4bf7-abf5-3504d0e391a8\" (UID: \"04163635-8e0b-4bf7-abf5-3504d0e391a8\") " Mar 19 17:02:39 crc kubenswrapper[4918]: I0319 17:02:39.555878 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-dns-swift-storage-0\") pod \"04163635-8e0b-4bf7-abf5-3504d0e391a8\" (UID: \"04163635-8e0b-4bf7-abf5-3504d0e391a8\") " Mar 19 17:02:39 crc kubenswrapper[4918]: I0319 17:02:39.555972 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlsms\" (UniqueName: 
\"kubernetes.io/projected/04163635-8e0b-4bf7-abf5-3504d0e391a8-kube-api-access-dlsms\") pod \"04163635-8e0b-4bf7-abf5-3504d0e391a8\" (UID: \"04163635-8e0b-4bf7-abf5-3504d0e391a8\") " Mar 19 17:02:39 crc kubenswrapper[4918]: I0319 17:02:39.556000 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-config\") pod \"04163635-8e0b-4bf7-abf5-3504d0e391a8\" (UID: \"04163635-8e0b-4bf7-abf5-3504d0e391a8\") " Mar 19 17:02:39 crc kubenswrapper[4918]: I0319 17:02:39.576973 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04163635-8e0b-4bf7-abf5-3504d0e391a8-kube-api-access-dlsms" (OuterVolumeSpecName: "kube-api-access-dlsms") pod "04163635-8e0b-4bf7-abf5-3504d0e391a8" (UID: "04163635-8e0b-4bf7-abf5-3504d0e391a8"). InnerVolumeSpecName "kube-api-access-dlsms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:02:39 crc kubenswrapper[4918]: I0319 17:02:39.615859 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "04163635-8e0b-4bf7-abf5-3504d0e391a8" (UID: "04163635-8e0b-4bf7-abf5-3504d0e391a8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:02:39 crc kubenswrapper[4918]: I0319 17:02:39.631984 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "04163635-8e0b-4bf7-abf5-3504d0e391a8" (UID: "04163635-8e0b-4bf7-abf5-3504d0e391a8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:02:39 crc kubenswrapper[4918]: I0319 17:02:39.636047 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "04163635-8e0b-4bf7-abf5-3504d0e391a8" (UID: "04163635-8e0b-4bf7-abf5-3504d0e391a8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:02:39 crc kubenswrapper[4918]: I0319 17:02:39.640289 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "04163635-8e0b-4bf7-abf5-3504d0e391a8" (UID: "04163635-8e0b-4bf7-abf5-3504d0e391a8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:02:39 crc kubenswrapper[4918]: I0319 17:02:39.652381 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-config" (OuterVolumeSpecName: "config") pod "04163635-8e0b-4bf7-abf5-3504d0e391a8" (UID: "04163635-8e0b-4bf7-abf5-3504d0e391a8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:02:39 crc kubenswrapper[4918]: I0319 17:02:39.658222 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlsms\" (UniqueName: \"kubernetes.io/projected/04163635-8e0b-4bf7-abf5-3504d0e391a8-kube-api-access-dlsms\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:39 crc kubenswrapper[4918]: I0319 17:02:39.658261 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-config\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:39 crc kubenswrapper[4918]: I0319 17:02:39.658275 4918 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:39 crc kubenswrapper[4918]: I0319 17:02:39.658285 4918 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:39 crc kubenswrapper[4918]: I0319 17:02:39.658298 4918 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:39 crc kubenswrapper[4918]: I0319 17:02:39.658309 4918 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04163635-8e0b-4bf7-abf5-3504d0e391a8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:39 crc kubenswrapper[4918]: I0319 17:02:39.826555 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-l85q9" event={"ID":"04163635-8e0b-4bf7-abf5-3504d0e391a8","Type":"ContainerDied","Data":"149c32acb9b9918f417eb460413778840c8e230113e72b6bd42202f1620b25bf"} Mar 19 17:02:39 crc 
kubenswrapper[4918]: I0319 17:02:39.826635 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-l85q9" Mar 19 17:02:39 crc kubenswrapper[4918]: I0319 17:02:39.826643 4918 scope.go:117] "RemoveContainer" containerID="4c45522f586492cd2568e2d7c8ce8793fed76539da03d3301382bcf2a5feb13e" Mar 19 17:02:39 crc kubenswrapper[4918]: I0319 17:02:39.879343 4918 scope.go:117] "RemoveContainer" containerID="aca1fac95bd48f10c168b69e293e5b406f71eb72d9b1b0b22c1b0f3e325e2817" Mar 19 17:02:39 crc kubenswrapper[4918]: I0319 17:02:39.895502 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-l85q9"] Mar 19 17:02:39 crc kubenswrapper[4918]: I0319 17:02:39.905759 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-l85q9"] Mar 19 17:02:40 crc kubenswrapper[4918]: I0319 17:02:40.526014 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-nkvm2" Mar 19 17:02:40 crc kubenswrapper[4918]: I0319 17:02:40.577141 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/780adca3-416f-431d-915e-7a546cfeae43-combined-ca-bundle\") pod \"780adca3-416f-431d-915e-7a546cfeae43\" (UID: \"780adca3-416f-431d-915e-7a546cfeae43\") " Mar 19 17:02:40 crc kubenswrapper[4918]: I0319 17:02:40.577337 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/780adca3-416f-431d-915e-7a546cfeae43-scripts\") pod \"780adca3-416f-431d-915e-7a546cfeae43\" (UID: \"780adca3-416f-431d-915e-7a546cfeae43\") " Mar 19 17:02:40 crc kubenswrapper[4918]: I0319 17:02:40.577489 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/780adca3-416f-431d-915e-7a546cfeae43-config-data\") pod 
\"780adca3-416f-431d-915e-7a546cfeae43\" (UID: \"780adca3-416f-431d-915e-7a546cfeae43\") " Mar 19 17:02:40 crc kubenswrapper[4918]: I0319 17:02:40.577542 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkwfk\" (UniqueName: \"kubernetes.io/projected/780adca3-416f-431d-915e-7a546cfeae43-kube-api-access-hkwfk\") pod \"780adca3-416f-431d-915e-7a546cfeae43\" (UID: \"780adca3-416f-431d-915e-7a546cfeae43\") " Mar 19 17:02:40 crc kubenswrapper[4918]: I0319 17:02:40.582689 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/780adca3-416f-431d-915e-7a546cfeae43-scripts" (OuterVolumeSpecName: "scripts") pod "780adca3-416f-431d-915e-7a546cfeae43" (UID: "780adca3-416f-431d-915e-7a546cfeae43"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:02:40 crc kubenswrapper[4918]: I0319 17:02:40.584374 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/780adca3-416f-431d-915e-7a546cfeae43-kube-api-access-hkwfk" (OuterVolumeSpecName: "kube-api-access-hkwfk") pod "780adca3-416f-431d-915e-7a546cfeae43" (UID: "780adca3-416f-431d-915e-7a546cfeae43"). InnerVolumeSpecName "kube-api-access-hkwfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:02:40 crc kubenswrapper[4918]: I0319 17:02:40.604976 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04163635-8e0b-4bf7-abf5-3504d0e391a8" path="/var/lib/kubelet/pods/04163635-8e0b-4bf7-abf5-3504d0e391a8/volumes" Mar 19 17:02:40 crc kubenswrapper[4918]: I0319 17:02:40.622487 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/780adca3-416f-431d-915e-7a546cfeae43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "780adca3-416f-431d-915e-7a546cfeae43" (UID: "780adca3-416f-431d-915e-7a546cfeae43"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:02:40 crc kubenswrapper[4918]: I0319 17:02:40.626027 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/780adca3-416f-431d-915e-7a546cfeae43-config-data" (OuterVolumeSpecName: "config-data") pod "780adca3-416f-431d-915e-7a546cfeae43" (UID: "780adca3-416f-431d-915e-7a546cfeae43"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:02:40 crc kubenswrapper[4918]: I0319 17:02:40.680752 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/780adca3-416f-431d-915e-7a546cfeae43-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:40 crc kubenswrapper[4918]: I0319 17:02:40.680922 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/780adca3-416f-431d-915e-7a546cfeae43-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:40 crc kubenswrapper[4918]: I0319 17:02:40.680999 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/780adca3-416f-431d-915e-7a546cfeae43-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:40 crc kubenswrapper[4918]: I0319 17:02:40.681060 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkwfk\" (UniqueName: \"kubernetes.io/projected/780adca3-416f-431d-915e-7a546cfeae43-kube-api-access-hkwfk\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:40 crc kubenswrapper[4918]: I0319 17:02:40.862362 4918 generic.go:334] "Generic (PLEG): container finished" podID="c4db6685-8155-4a95-af9f-3292270736d8" containerID="9b9d34fa71438f864c8513534831a69723212935ac49c153660d3e267d8c5b60" exitCode=0 Mar 19 17:02:40 crc kubenswrapper[4918]: I0319 17:02:40.862446 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xjm4z" 
event={"ID":"c4db6685-8155-4a95-af9f-3292270736d8","Type":"ContainerDied","Data":"9b9d34fa71438f864c8513534831a69723212935ac49c153660d3e267d8c5b60"} Mar 19 17:02:40 crc kubenswrapper[4918]: I0319 17:02:40.896454 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-nkvm2" event={"ID":"780adca3-416f-431d-915e-7a546cfeae43","Type":"ContainerDied","Data":"fea50192a81c68448dca8d7fb78213af262af87ef514ca6000913ee7f1575926"} Mar 19 17:02:40 crc kubenswrapper[4918]: I0319 17:02:40.896499 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fea50192a81c68448dca8d7fb78213af262af87ef514ca6000913ee7f1575926" Mar 19 17:02:40 crc kubenswrapper[4918]: I0319 17:02:40.896521 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-nkvm2" Mar 19 17:02:40 crc kubenswrapper[4918]: I0319 17:02:40.977171 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 17:02:40 crc kubenswrapper[4918]: I0319 17:02:40.979127 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="eea51a4c-c5eb-414d-a46b-cc704ff34914" containerName="nova-api-api" containerID="cri-o://678155dca62f3af94cd47aa29260a2c250c8de02f4635f4eb3dd53f1edc5dc99" gracePeriod=30 Mar 19 17:02:40 crc kubenswrapper[4918]: I0319 17:02:40.979639 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="eea51a4c-c5eb-414d-a46b-cc704ff34914" containerName="nova-api-log" containerID="cri-o://e4b1deb5f9ec17d88f8faeca43bebdc0f63a5c169dffa055025e565417423a6e" gracePeriod=30 Mar 19 17:02:40 crc kubenswrapper[4918]: I0319 17:02:40.990588 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 17:02:40 crc kubenswrapper[4918]: I0319 17:02:40.990761 4918 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-scheduler-0" podUID="4bd4b65c-9a93-42a1-a838-fde61211037f" containerName="nova-scheduler-scheduler" containerID="cri-o://1eacb64c055a23176e5f6fca80a7d73daed2863297d92209894f71ae890aebb4" gracePeriod=30 Mar 19 17:02:41 crc kubenswrapper[4918]: I0319 17:02:41.049708 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 17:02:41 crc kubenswrapper[4918]: I0319 17:02:41.050214 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7e4e01fd-260f-43c4-bbac-2a3c0d327070" containerName="nova-metadata-log" containerID="cri-o://515c7765c77bdaf82385e5a44b02bdef063c98562ab2d54635831c5a862d2327" gracePeriod=30 Mar 19 17:02:41 crc kubenswrapper[4918]: I0319 17:02:41.050758 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7e4e01fd-260f-43c4-bbac-2a3c0d327070" containerName="nova-metadata-metadata" containerID="cri-o://673a6dc80e67c739f7500e9491f6fb7d52f709175c55bc20908132441374cf43" gracePeriod=30 Mar 19 17:02:41 crc kubenswrapper[4918]: I0319 17:02:41.920459 4918 generic.go:334] "Generic (PLEG): container finished" podID="7e4e01fd-260f-43c4-bbac-2a3c0d327070" containerID="673a6dc80e67c739f7500e9491f6fb7d52f709175c55bc20908132441374cf43" exitCode=0 Mar 19 17:02:41 crc kubenswrapper[4918]: I0319 17:02:41.922285 4918 generic.go:334] "Generic (PLEG): container finished" podID="7e4e01fd-260f-43c4-bbac-2a3c0d327070" containerID="515c7765c77bdaf82385e5a44b02bdef063c98562ab2d54635831c5a862d2327" exitCode=143 Mar 19 17:02:41 crc kubenswrapper[4918]: I0319 17:02:41.920511 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7e4e01fd-260f-43c4-bbac-2a3c0d327070","Type":"ContainerDied","Data":"673a6dc80e67c739f7500e9491f6fb7d52f709175c55bc20908132441374cf43"} Mar 19 17:02:41 crc kubenswrapper[4918]: I0319 17:02:41.922462 4918 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-metadata-0" event={"ID":"7e4e01fd-260f-43c4-bbac-2a3c0d327070","Type":"ContainerDied","Data":"515c7765c77bdaf82385e5a44b02bdef063c98562ab2d54635831c5a862d2327"} Mar 19 17:02:41 crc kubenswrapper[4918]: I0319 17:02:41.922480 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7e4e01fd-260f-43c4-bbac-2a3c0d327070","Type":"ContainerDied","Data":"1a0c47411fb2e5fb9b7b57c83c49957bcabbe9c153fb56894bdd969548f00c68"} Mar 19 17:02:41 crc kubenswrapper[4918]: I0319 17:02:41.922504 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a0c47411fb2e5fb9b7b57c83c49957bcabbe9c153fb56894bdd969548f00c68" Mar 19 17:02:41 crc kubenswrapper[4918]: I0319 17:02:41.926165 4918 generic.go:334] "Generic (PLEG): container finished" podID="eea51a4c-c5eb-414d-a46b-cc704ff34914" containerID="e4b1deb5f9ec17d88f8faeca43bebdc0f63a5c169dffa055025e565417423a6e" exitCode=143 Mar 19 17:02:41 crc kubenswrapper[4918]: I0319 17:02:41.926307 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eea51a4c-c5eb-414d-a46b-cc704ff34914","Type":"ContainerDied","Data":"e4b1deb5f9ec17d88f8faeca43bebdc0f63a5c169dffa055025e565417423a6e"} Mar 19 17:02:41 crc kubenswrapper[4918]: I0319 17:02:41.961743 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.011423 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e4e01fd-260f-43c4-bbac-2a3c0d327070-config-data\") pod \"7e4e01fd-260f-43c4-bbac-2a3c0d327070\" (UID: \"7e4e01fd-260f-43c4-bbac-2a3c0d327070\") " Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.011580 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e4e01fd-260f-43c4-bbac-2a3c0d327070-logs\") pod \"7e4e01fd-260f-43c4-bbac-2a3c0d327070\" (UID: \"7e4e01fd-260f-43c4-bbac-2a3c0d327070\") " Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.011666 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e4e01fd-260f-43c4-bbac-2a3c0d327070-nova-metadata-tls-certs\") pod \"7e4e01fd-260f-43c4-bbac-2a3c0d327070\" (UID: \"7e4e01fd-260f-43c4-bbac-2a3c0d327070\") " Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.011741 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e4e01fd-260f-43c4-bbac-2a3c0d327070-combined-ca-bundle\") pod \"7e4e01fd-260f-43c4-bbac-2a3c0d327070\" (UID: \"7e4e01fd-260f-43c4-bbac-2a3c0d327070\") " Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.011768 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqzld\" (UniqueName: \"kubernetes.io/projected/7e4e01fd-260f-43c4-bbac-2a3c0d327070-kube-api-access-gqzld\") pod \"7e4e01fd-260f-43c4-bbac-2a3c0d327070\" (UID: \"7e4e01fd-260f-43c4-bbac-2a3c0d327070\") " Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.012966 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/7e4e01fd-260f-43c4-bbac-2a3c0d327070-logs" (OuterVolumeSpecName: "logs") pod "7e4e01fd-260f-43c4-bbac-2a3c0d327070" (UID: "7e4e01fd-260f-43c4-bbac-2a3c0d327070"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.030242 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e4e01fd-260f-43c4-bbac-2a3c0d327070-kube-api-access-gqzld" (OuterVolumeSpecName: "kube-api-access-gqzld") pod "7e4e01fd-260f-43c4-bbac-2a3c0d327070" (UID: "7e4e01fd-260f-43c4-bbac-2a3c0d327070"). InnerVolumeSpecName "kube-api-access-gqzld". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.094351 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e4e01fd-260f-43c4-bbac-2a3c0d327070-config-data" (OuterVolumeSpecName: "config-data") pod "7e4e01fd-260f-43c4-bbac-2a3c0d327070" (UID: "7e4e01fd-260f-43c4-bbac-2a3c0d327070"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.112262 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e4e01fd-260f-43c4-bbac-2a3c0d327070-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "7e4e01fd-260f-43c4-bbac-2a3c0d327070" (UID: "7e4e01fd-260f-43c4-bbac-2a3c0d327070"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.112313 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e4e01fd-260f-43c4-bbac-2a3c0d327070-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e4e01fd-260f-43c4-bbac-2a3c0d327070" (UID: "7e4e01fd-260f-43c4-bbac-2a3c0d327070"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.114685 4918 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e4e01fd-260f-43c4-bbac-2a3c0d327070-logs\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.114709 4918 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e4e01fd-260f-43c4-bbac-2a3c0d327070-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.114719 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e4e01fd-260f-43c4-bbac-2a3c0d327070-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.114728 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqzld\" (UniqueName: \"kubernetes.io/projected/7e4e01fd-260f-43c4-bbac-2a3c0d327070-kube-api-access-gqzld\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.114737 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e4e01fd-260f-43c4-bbac-2a3c0d327070-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.626110 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xjm4z" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.724250 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x8tj\" (UniqueName: \"kubernetes.io/projected/c4db6685-8155-4a95-af9f-3292270736d8-kube-api-access-2x8tj\") pod \"c4db6685-8155-4a95-af9f-3292270736d8\" (UID: \"c4db6685-8155-4a95-af9f-3292270736d8\") " Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.724306 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4db6685-8155-4a95-af9f-3292270736d8-config-data\") pod \"c4db6685-8155-4a95-af9f-3292270736d8\" (UID: \"c4db6685-8155-4a95-af9f-3292270736d8\") " Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.724366 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4db6685-8155-4a95-af9f-3292270736d8-scripts\") pod \"c4db6685-8155-4a95-af9f-3292270736d8\" (UID: \"c4db6685-8155-4a95-af9f-3292270736d8\") " Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.732714 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4db6685-8155-4a95-af9f-3292270736d8-scripts" (OuterVolumeSpecName: "scripts") pod "c4db6685-8155-4a95-af9f-3292270736d8" (UID: "c4db6685-8155-4a95-af9f-3292270736d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.732731 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4db6685-8155-4a95-af9f-3292270736d8-kube-api-access-2x8tj" (OuterVolumeSpecName: "kube-api-access-2x8tj") pod "c4db6685-8155-4a95-af9f-3292270736d8" (UID: "c4db6685-8155-4a95-af9f-3292270736d8"). InnerVolumeSpecName "kube-api-access-2x8tj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.763063 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4db6685-8155-4a95-af9f-3292270736d8-config-data" (OuterVolumeSpecName: "config-data") pod "c4db6685-8155-4a95-af9f-3292270736d8" (UID: "c4db6685-8155-4a95-af9f-3292270736d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.826266 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4db6685-8155-4a95-af9f-3292270736d8-combined-ca-bundle\") pod \"c4db6685-8155-4a95-af9f-3292270736d8\" (UID: \"c4db6685-8155-4a95-af9f-3292270736d8\") " Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.827074 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x8tj\" (UniqueName: \"kubernetes.io/projected/c4db6685-8155-4a95-af9f-3292270736d8-kube-api-access-2x8tj\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.827095 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4db6685-8155-4a95-af9f-3292270736d8-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.827104 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4db6685-8155-4a95-af9f-3292270736d8-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.857157 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4db6685-8155-4a95-af9f-3292270736d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4db6685-8155-4a95-af9f-3292270736d8" (UID: "c4db6685-8155-4a95-af9f-3292270736d8"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.933106 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4db6685-8155-4a95-af9f-3292270736d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.938227 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.938970 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xjm4z" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.938955 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xjm4z" event={"ID":"c4db6685-8155-4a95-af9f-3292270736d8","Type":"ContainerDied","Data":"77fa06fbfbb5cd3023f48357c10c957c6a0b97bac709832d7e2ef9beaf275339"} Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.939151 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77fa06fbfbb5cd3023f48357c10c957c6a0b97bac709832d7e2ef9beaf275339" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.985275 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 19 17:02:42 crc kubenswrapper[4918]: E0319 17:02:42.985770 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e4e01fd-260f-43c4-bbac-2a3c0d327070" containerName="nova-metadata-log" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.985788 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e4e01fd-260f-43c4-bbac-2a3c0d327070" containerName="nova-metadata-log" Mar 19 17:02:42 crc kubenswrapper[4918]: E0319 17:02:42.985802 4918 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7e4e01fd-260f-43c4-bbac-2a3c0d327070" containerName="nova-metadata-metadata" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.985808 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e4e01fd-260f-43c4-bbac-2a3c0d327070" containerName="nova-metadata-metadata" Mar 19 17:02:42 crc kubenswrapper[4918]: E0319 17:02:42.985823 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04163635-8e0b-4bf7-abf5-3504d0e391a8" containerName="dnsmasq-dns" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.985829 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="04163635-8e0b-4bf7-abf5-3504d0e391a8" containerName="dnsmasq-dns" Mar 19 17:02:42 crc kubenswrapper[4918]: E0319 17:02:42.985848 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="780adca3-416f-431d-915e-7a546cfeae43" containerName="nova-manage" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.985854 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="780adca3-416f-431d-915e-7a546cfeae43" containerName="nova-manage" Mar 19 17:02:42 crc kubenswrapper[4918]: E0319 17:02:42.985863 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04163635-8e0b-4bf7-abf5-3504d0e391a8" containerName="init" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.985869 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="04163635-8e0b-4bf7-abf5-3504d0e391a8" containerName="init" Mar 19 17:02:42 crc kubenswrapper[4918]: E0319 17:02:42.985893 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4db6685-8155-4a95-af9f-3292270736d8" containerName="nova-cell1-conductor-db-sync" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.985900 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4db6685-8155-4a95-af9f-3292270736d8" containerName="nova-cell1-conductor-db-sync" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.986078 4918 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="780adca3-416f-431d-915e-7a546cfeae43" containerName="nova-manage" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.986094 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="04163635-8e0b-4bf7-abf5-3504d0e391a8" containerName="dnsmasq-dns" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.986106 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4db6685-8155-4a95-af9f-3292270736d8" containerName="nova-cell1-conductor-db-sync" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.986113 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e4e01fd-260f-43c4-bbac-2a3c0d327070" containerName="nova-metadata-log" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.986124 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e4e01fd-260f-43c4-bbac-2a3c0d327070" containerName="nova-metadata-metadata" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.988907 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 19 17:02:42 crc kubenswrapper[4918]: I0319 17:02:42.992644 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.002167 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.035388 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be298c7d-0f1b-44c9-ac1f-3e2accac7bdc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"be298c7d-0f1b-44c9-ac1f-3e2accac7bdc\") " pod="openstack/nova-cell1-conductor-0" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.035580 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wtmj\" (UniqueName: \"kubernetes.io/projected/be298c7d-0f1b-44c9-ac1f-3e2accac7bdc-kube-api-access-4wtmj\") pod \"nova-cell1-conductor-0\" (UID: \"be298c7d-0f1b-44c9-ac1f-3e2accac7bdc\") " pod="openstack/nova-cell1-conductor-0" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.035802 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be298c7d-0f1b-44c9-ac1f-3e2accac7bdc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"be298c7d-0f1b-44c9-ac1f-3e2accac7bdc\") " pod="openstack/nova-cell1-conductor-0" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.043092 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.056461 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.078797 4918 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-metadata-0"] Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.080628 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.083302 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.083381 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.102340 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.143404 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kqxn\" (UniqueName: \"kubernetes.io/projected/b6c864e7-f676-4c0d-894b-c31c175fccd2-kube-api-access-2kqxn\") pod \"nova-metadata-0\" (UID: \"b6c864e7-f676-4c0d-894b-c31c175fccd2\") " pod="openstack/nova-metadata-0" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.144074 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wtmj\" (UniqueName: \"kubernetes.io/projected/be298c7d-0f1b-44c9-ac1f-3e2accac7bdc-kube-api-access-4wtmj\") pod \"nova-cell1-conductor-0\" (UID: \"be298c7d-0f1b-44c9-ac1f-3e2accac7bdc\") " pod="openstack/nova-cell1-conductor-0" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.144264 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6c864e7-f676-4c0d-894b-c31c175fccd2-config-data\") pod \"nova-metadata-0\" (UID: \"b6c864e7-f676-4c0d-894b-c31c175fccd2\") " pod="openstack/nova-metadata-0" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.144360 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be298c7d-0f1b-44c9-ac1f-3e2accac7bdc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"be298c7d-0f1b-44c9-ac1f-3e2accac7bdc\") " pod="openstack/nova-cell1-conductor-0" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.144466 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6c864e7-f676-4c0d-894b-c31c175fccd2-logs\") pod \"nova-metadata-0\" (UID: \"b6c864e7-f676-4c0d-894b-c31c175fccd2\") " pod="openstack/nova-metadata-0" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.144545 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6c864e7-f676-4c0d-894b-c31c175fccd2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b6c864e7-f676-4c0d-894b-c31c175fccd2\") " pod="openstack/nova-metadata-0" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.144619 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c864e7-f676-4c0d-894b-c31c175fccd2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b6c864e7-f676-4c0d-894b-c31c175fccd2\") " pod="openstack/nova-metadata-0" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.144679 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be298c7d-0f1b-44c9-ac1f-3e2accac7bdc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"be298c7d-0f1b-44c9-ac1f-3e2accac7bdc\") " pod="openstack/nova-cell1-conductor-0" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.151715 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/be298c7d-0f1b-44c9-ac1f-3e2accac7bdc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"be298c7d-0f1b-44c9-ac1f-3e2accac7bdc\") " pod="openstack/nova-cell1-conductor-0" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.164888 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be298c7d-0f1b-44c9-ac1f-3e2accac7bdc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"be298c7d-0f1b-44c9-ac1f-3e2accac7bdc\") " pod="openstack/nova-cell1-conductor-0" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.167657 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wtmj\" (UniqueName: \"kubernetes.io/projected/be298c7d-0f1b-44c9-ac1f-3e2accac7bdc-kube-api-access-4wtmj\") pod \"nova-cell1-conductor-0\" (UID: \"be298c7d-0f1b-44c9-ac1f-3e2accac7bdc\") " pod="openstack/nova-cell1-conductor-0" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.246581 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6c864e7-f676-4c0d-894b-c31c175fccd2-logs\") pod \"nova-metadata-0\" (UID: \"b6c864e7-f676-4c0d-894b-c31c175fccd2\") " pod="openstack/nova-metadata-0" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.246682 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6c864e7-f676-4c0d-894b-c31c175fccd2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b6c864e7-f676-4c0d-894b-c31c175fccd2\") " pod="openstack/nova-metadata-0" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.246748 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c864e7-f676-4c0d-894b-c31c175fccd2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"b6c864e7-f676-4c0d-894b-c31c175fccd2\") " pod="openstack/nova-metadata-0" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.246842 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kqxn\" (UniqueName: \"kubernetes.io/projected/b6c864e7-f676-4c0d-894b-c31c175fccd2-kube-api-access-2kqxn\") pod \"nova-metadata-0\" (UID: \"b6c864e7-f676-4c0d-894b-c31c175fccd2\") " pod="openstack/nova-metadata-0" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.246952 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6c864e7-f676-4c0d-894b-c31c175fccd2-config-data\") pod \"nova-metadata-0\" (UID: \"b6c864e7-f676-4c0d-894b-c31c175fccd2\") " pod="openstack/nova-metadata-0" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.248662 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6c864e7-f676-4c0d-894b-c31c175fccd2-logs\") pod \"nova-metadata-0\" (UID: \"b6c864e7-f676-4c0d-894b-c31c175fccd2\") " pod="openstack/nova-metadata-0" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.253276 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c864e7-f676-4c0d-894b-c31c175fccd2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b6c864e7-f676-4c0d-894b-c31c175fccd2\") " pod="openstack/nova-metadata-0" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.253569 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6c864e7-f676-4c0d-894b-c31c175fccd2-config-data\") pod \"nova-metadata-0\" (UID: \"b6c864e7-f676-4c0d-894b-c31c175fccd2\") " pod="openstack/nova-metadata-0" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.256400 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6c864e7-f676-4c0d-894b-c31c175fccd2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b6c864e7-f676-4c0d-894b-c31c175fccd2\") " pod="openstack/nova-metadata-0" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.287980 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kqxn\" (UniqueName: \"kubernetes.io/projected/b6c864e7-f676-4c0d-894b-c31c175fccd2-kube-api-access-2kqxn\") pod \"nova-metadata-0\" (UID: \"b6c864e7-f676-4c0d-894b-c31c175fccd2\") " pod="openstack/nova-metadata-0" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.305490 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.401580 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 17:02:43 crc kubenswrapper[4918]: E0319 17:02:43.413881 4918 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1eacb64c055a23176e5f6fca80a7d73daed2863297d92209894f71ae890aebb4 is running failed: container process not found" containerID="1eacb64c055a23176e5f6fca80a7d73daed2863297d92209894f71ae890aebb4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 17:02:43 crc kubenswrapper[4918]: E0319 17:02:43.414362 4918 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1eacb64c055a23176e5f6fca80a7d73daed2863297d92209894f71ae890aebb4 is running failed: container process not found" containerID="1eacb64c055a23176e5f6fca80a7d73daed2863297d92209894f71ae890aebb4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 17:02:43 crc kubenswrapper[4918]: E0319 17:02:43.418338 4918 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1eacb64c055a23176e5f6fca80a7d73daed2863297d92209894f71ae890aebb4 is running failed: container process not found" containerID="1eacb64c055a23176e5f6fca80a7d73daed2863297d92209894f71ae890aebb4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 17:02:43 crc kubenswrapper[4918]: E0319 17:02:43.418424 4918 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1eacb64c055a23176e5f6fca80a7d73daed2863297d92209894f71ae890aebb4 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="4bd4b65c-9a93-42a1-a838-fde61211037f" containerName="nova-scheduler-scheduler" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.949869 4918 generic.go:334] "Generic (PLEG): container finished" podID="4bd4b65c-9a93-42a1-a838-fde61211037f" containerID="1eacb64c055a23176e5f6fca80a7d73daed2863297d92209894f71ae890aebb4" exitCode=0 Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.949981 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4bd4b65c-9a93-42a1-a838-fde61211037f","Type":"ContainerDied","Data":"1eacb64c055a23176e5f6fca80a7d73daed2863297d92209894f71ae890aebb4"} Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.950091 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4bd4b65c-9a93-42a1-a838-fde61211037f","Type":"ContainerDied","Data":"8b2fd9d4358eeffcd283c37b38eb5a32834f946af37c24a3259f4125a7cfaadc"} Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.950106 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b2fd9d4358eeffcd283c37b38eb5a32834f946af37c24a3259f4125a7cfaadc" Mar 19 17:02:43 crc kubenswrapper[4918]: I0319 17:02:43.963181 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 17:02:44 crc kubenswrapper[4918]: I0319 17:02:44.066130 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gfn2\" (UniqueName: \"kubernetes.io/projected/4bd4b65c-9a93-42a1-a838-fde61211037f-kube-api-access-8gfn2\") pod \"4bd4b65c-9a93-42a1-a838-fde61211037f\" (UID: \"4bd4b65c-9a93-42a1-a838-fde61211037f\") " Mar 19 17:02:44 crc kubenswrapper[4918]: I0319 17:02:44.066265 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd4b65c-9a93-42a1-a838-fde61211037f-config-data\") pod \"4bd4b65c-9a93-42a1-a838-fde61211037f\" (UID: \"4bd4b65c-9a93-42a1-a838-fde61211037f\") " Mar 19 17:02:44 crc kubenswrapper[4918]: I0319 17:02:44.066304 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd4b65c-9a93-42a1-a838-fde61211037f-combined-ca-bundle\") pod \"4bd4b65c-9a93-42a1-a838-fde61211037f\" (UID: \"4bd4b65c-9a93-42a1-a838-fde61211037f\") " Mar 19 17:02:44 crc kubenswrapper[4918]: I0319 17:02:44.077734 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bd4b65c-9a93-42a1-a838-fde61211037f-kube-api-access-8gfn2" (OuterVolumeSpecName: "kube-api-access-8gfn2") pod "4bd4b65c-9a93-42a1-a838-fde61211037f" (UID: "4bd4b65c-9a93-42a1-a838-fde61211037f"). InnerVolumeSpecName "kube-api-access-8gfn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:02:44 crc kubenswrapper[4918]: I0319 17:02:44.126797 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bd4b65c-9a93-42a1-a838-fde61211037f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bd4b65c-9a93-42a1-a838-fde61211037f" (UID: "4bd4b65c-9a93-42a1-a838-fde61211037f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:02:44 crc kubenswrapper[4918]: I0319 17:02:44.133813 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bd4b65c-9a93-42a1-a838-fde61211037f-config-data" (OuterVolumeSpecName: "config-data") pod "4bd4b65c-9a93-42a1-a838-fde61211037f" (UID: "4bd4b65c-9a93-42a1-a838-fde61211037f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:02:44 crc kubenswrapper[4918]: I0319 17:02:44.175360 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gfn2\" (UniqueName: \"kubernetes.io/projected/4bd4b65c-9a93-42a1-a838-fde61211037f-kube-api-access-8gfn2\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:44 crc kubenswrapper[4918]: I0319 17:02:44.175395 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd4b65c-9a93-42a1-a838-fde61211037f-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:44 crc kubenswrapper[4918]: I0319 17:02:44.175405 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd4b65c-9a93-42a1-a838-fde61211037f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:44 crc kubenswrapper[4918]: I0319 17:02:44.183546 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 19 17:02:44 crc kubenswrapper[4918]: I0319 17:02:44.204230 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 17:02:44 crc kubenswrapper[4918]: I0319 17:02:44.599805 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e4e01fd-260f-43c4-bbac-2a3c0d327070" path="/var/lib/kubelet/pods/7e4e01fd-260f-43c4-bbac-2a3c0d327070/volumes" Mar 19 17:02:44 crc kubenswrapper[4918]: I0319 17:02:44.961053 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"b6c864e7-f676-4c0d-894b-c31c175fccd2","Type":"ContainerStarted","Data":"8ffea604ee8670992f9c21a0358aa00f778c4ad7836fc1c4ce084b9c929d6415"} Mar 19 17:02:44 crc kubenswrapper[4918]: I0319 17:02:44.961401 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6c864e7-f676-4c0d-894b-c31c175fccd2","Type":"ContainerStarted","Data":"1f2a5b41eac6f7be33f5468c45045c789902f582422311f5dbe3dc455552bf73"} Mar 19 17:02:44 crc kubenswrapper[4918]: I0319 17:02:44.961419 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6c864e7-f676-4c0d-894b-c31c175fccd2","Type":"ContainerStarted","Data":"23199fbd7c22d1dc4e4d2ddcae5322966d9c310b55587c8526117d149618d677"} Mar 19 17:02:44 crc kubenswrapper[4918]: I0319 17:02:44.964650 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"be298c7d-0f1b-44c9-ac1f-3e2accac7bdc","Type":"ContainerStarted","Data":"25018d20ae24ff29602b17f8d960a7780f5aff763a51377b8d691f79ca80a311"} Mar 19 17:02:44 crc kubenswrapper[4918]: I0319 17:02:44.964679 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 17:02:44 crc kubenswrapper[4918]: I0319 17:02:44.964718 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"be298c7d-0f1b-44c9-ac1f-3e2accac7bdc","Type":"ContainerStarted","Data":"78d65ca1b5a7115211de5dc14b10f8d3e5745c564c9336e9350a3ff64f9c6f09"} Mar 19 17:02:44 crc kubenswrapper[4918]: I0319 17:02:44.988271 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.988250837 podStartE2EDuration="1.988250837s" podCreationTimestamp="2026-03-19 17:02:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:02:44.976835005 +0000 UTC m=+1377.099034253" watchObservedRunningTime="2026-03-19 17:02:44.988250837 +0000 UTC m=+1377.110450085" Mar 19 17:02:45 crc kubenswrapper[4918]: I0319 17:02:45.006638 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 17:02:45 crc kubenswrapper[4918]: I0319 17:02:45.022046 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 17:02:45 crc kubenswrapper[4918]: I0319 17:02:45.028155 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.028135756 podStartE2EDuration="3.028135756s" podCreationTimestamp="2026-03-19 17:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:02:45.012390416 +0000 UTC m=+1377.134589664" watchObservedRunningTime="2026-03-19 17:02:45.028135756 +0000 UTC m=+1377.150335004" Mar 19 17:02:45 crc kubenswrapper[4918]: I0319 17:02:45.044472 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 17:02:45 crc kubenswrapper[4918]: E0319 
17:02:45.044998 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bd4b65c-9a93-42a1-a838-fde61211037f" containerName="nova-scheduler-scheduler" Mar 19 17:02:45 crc kubenswrapper[4918]: I0319 17:02:45.045024 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bd4b65c-9a93-42a1-a838-fde61211037f" containerName="nova-scheduler-scheduler" Mar 19 17:02:45 crc kubenswrapper[4918]: I0319 17:02:45.045272 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bd4b65c-9a93-42a1-a838-fde61211037f" containerName="nova-scheduler-scheduler" Mar 19 17:02:45 crc kubenswrapper[4918]: I0319 17:02:45.046199 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 17:02:45 crc kubenswrapper[4918]: I0319 17:02:45.049974 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 19 17:02:45 crc kubenswrapper[4918]: I0319 17:02:45.053763 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 17:02:45 crc kubenswrapper[4918]: I0319 17:02:45.205493 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7861f8ee-6769-49fa-b7ab-33d6ebb50eee-config-data\") pod \"nova-scheduler-0\" (UID: \"7861f8ee-6769-49fa-b7ab-33d6ebb50eee\") " pod="openstack/nova-scheduler-0" Mar 19 17:02:45 crc kubenswrapper[4918]: I0319 17:02:45.205563 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sklql\" (UniqueName: \"kubernetes.io/projected/7861f8ee-6769-49fa-b7ab-33d6ebb50eee-kube-api-access-sklql\") pod \"nova-scheduler-0\" (UID: \"7861f8ee-6769-49fa-b7ab-33d6ebb50eee\") " pod="openstack/nova-scheduler-0" Mar 19 17:02:45 crc kubenswrapper[4918]: I0319 17:02:45.205634 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7861f8ee-6769-49fa-b7ab-33d6ebb50eee-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7861f8ee-6769-49fa-b7ab-33d6ebb50eee\") " pod="openstack/nova-scheduler-0" Mar 19 17:02:45 crc kubenswrapper[4918]: I0319 17:02:45.307692 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7861f8ee-6769-49fa-b7ab-33d6ebb50eee-config-data\") pod \"nova-scheduler-0\" (UID: \"7861f8ee-6769-49fa-b7ab-33d6ebb50eee\") " pod="openstack/nova-scheduler-0" Mar 19 17:02:45 crc kubenswrapper[4918]: I0319 17:02:45.308056 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sklql\" (UniqueName: \"kubernetes.io/projected/7861f8ee-6769-49fa-b7ab-33d6ebb50eee-kube-api-access-sklql\") pod \"nova-scheduler-0\" (UID: \"7861f8ee-6769-49fa-b7ab-33d6ebb50eee\") " pod="openstack/nova-scheduler-0" Mar 19 17:02:45 crc kubenswrapper[4918]: I0319 17:02:45.308114 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7861f8ee-6769-49fa-b7ab-33d6ebb50eee-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7861f8ee-6769-49fa-b7ab-33d6ebb50eee\") " pod="openstack/nova-scheduler-0" Mar 19 17:02:45 crc kubenswrapper[4918]: I0319 17:02:45.312481 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7861f8ee-6769-49fa-b7ab-33d6ebb50eee-config-data\") pod \"nova-scheduler-0\" (UID: \"7861f8ee-6769-49fa-b7ab-33d6ebb50eee\") " pod="openstack/nova-scheduler-0" Mar 19 17:02:45 crc kubenswrapper[4918]: I0319 17:02:45.329161 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7861f8ee-6769-49fa-b7ab-33d6ebb50eee-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"7861f8ee-6769-49fa-b7ab-33d6ebb50eee\") " pod="openstack/nova-scheduler-0" Mar 19 17:02:45 crc kubenswrapper[4918]: I0319 17:02:45.329210 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sklql\" (UniqueName: \"kubernetes.io/projected/7861f8ee-6769-49fa-b7ab-33d6ebb50eee-kube-api-access-sklql\") pod \"nova-scheduler-0\" (UID: \"7861f8ee-6769-49fa-b7ab-33d6ebb50eee\") " pod="openstack/nova-scheduler-0" Mar 19 17:02:45 crc kubenswrapper[4918]: I0319 17:02:45.429963 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 17:02:45 crc kubenswrapper[4918]: I0319 17:02:45.981638 4918 generic.go:334] "Generic (PLEG): container finished" podID="eea51a4c-c5eb-414d-a46b-cc704ff34914" containerID="678155dca62f3af94cd47aa29260a2c250c8de02f4635f4eb3dd53f1edc5dc99" exitCode=0 Mar 19 17:02:45 crc kubenswrapper[4918]: I0319 17:02:45.981703 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eea51a4c-c5eb-414d-a46b-cc704ff34914","Type":"ContainerDied","Data":"678155dca62f3af94cd47aa29260a2c250c8de02f4635f4eb3dd53f1edc5dc99"} Mar 19 17:02:45 crc kubenswrapper[4918]: I0319 17:02:45.983055 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 19 17:02:46 crc kubenswrapper[4918]: I0319 17:02:46.032810 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 17:02:46 crc kubenswrapper[4918]: I0319 17:02:46.032864 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 17:02:46 crc kubenswrapper[4918]: I0319 17:02:46.282878 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 17:02:46 crc kubenswrapper[4918]: I0319 17:02:46.359717 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 17:02:46 crc kubenswrapper[4918]: I0319 17:02:46.432737 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eea51a4c-c5eb-414d-a46b-cc704ff34914-logs\") pod \"eea51a4c-c5eb-414d-a46b-cc704ff34914\" (UID: \"eea51a4c-c5eb-414d-a46b-cc704ff34914\") " Mar 19 17:02:46 crc kubenswrapper[4918]: I0319 17:02:46.432822 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea51a4c-c5eb-414d-a46b-cc704ff34914-config-data\") pod \"eea51a4c-c5eb-414d-a46b-cc704ff34914\" (UID: \"eea51a4c-c5eb-414d-a46b-cc704ff34914\") " Mar 19 17:02:46 crc kubenswrapper[4918]: I0319 17:02:46.432922 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56sls\" (UniqueName: \"kubernetes.io/projected/eea51a4c-c5eb-414d-a46b-cc704ff34914-kube-api-access-56sls\") pod \"eea51a4c-c5eb-414d-a46b-cc704ff34914\" (UID: \"eea51a4c-c5eb-414d-a46b-cc704ff34914\") " Mar 19 17:02:46 crc kubenswrapper[4918]: I0319 17:02:46.433025 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eea51a4c-c5eb-414d-a46b-cc704ff34914-combined-ca-bundle\") pod \"eea51a4c-c5eb-414d-a46b-cc704ff34914\" (UID: \"eea51a4c-c5eb-414d-a46b-cc704ff34914\") " Mar 19 17:02:46 crc kubenswrapper[4918]: I0319 17:02:46.434496 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eea51a4c-c5eb-414d-a46b-cc704ff34914-logs" (OuterVolumeSpecName: "logs") pod "eea51a4c-c5eb-414d-a46b-cc704ff34914" (UID: "eea51a4c-c5eb-414d-a46b-cc704ff34914"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:02:46 crc kubenswrapper[4918]: I0319 17:02:46.441682 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eea51a4c-c5eb-414d-a46b-cc704ff34914-kube-api-access-56sls" (OuterVolumeSpecName: "kube-api-access-56sls") pod "eea51a4c-c5eb-414d-a46b-cc704ff34914" (UID: "eea51a4c-c5eb-414d-a46b-cc704ff34914"). InnerVolumeSpecName "kube-api-access-56sls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:02:46 crc kubenswrapper[4918]: I0319 17:02:46.461175 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eea51a4c-c5eb-414d-a46b-cc704ff34914-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eea51a4c-c5eb-414d-a46b-cc704ff34914" (UID: "eea51a4c-c5eb-414d-a46b-cc704ff34914"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:02:46 crc kubenswrapper[4918]: I0319 17:02:46.487196 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eea51a4c-c5eb-414d-a46b-cc704ff34914-config-data" (OuterVolumeSpecName: "config-data") pod "eea51a4c-c5eb-414d-a46b-cc704ff34914" (UID: "eea51a4c-c5eb-414d-a46b-cc704ff34914"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:02:46 crc kubenswrapper[4918]: I0319 17:02:46.536068 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eea51a4c-c5eb-414d-a46b-cc704ff34914-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:46 crc kubenswrapper[4918]: I0319 17:02:46.536110 4918 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eea51a4c-c5eb-414d-a46b-cc704ff34914-logs\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:46 crc kubenswrapper[4918]: I0319 17:02:46.536124 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea51a4c-c5eb-414d-a46b-cc704ff34914-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:46 crc kubenswrapper[4918]: I0319 17:02:46.536135 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56sls\" (UniqueName: \"kubernetes.io/projected/eea51a4c-c5eb-414d-a46b-cc704ff34914-kube-api-access-56sls\") on node \"crc\" DevicePath \"\"" Mar 19 17:02:46 crc kubenswrapper[4918]: I0319 17:02:46.600182 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bd4b65c-9a93-42a1-a838-fde61211037f" path="/var/lib/kubelet/pods/4bd4b65c-9a93-42a1-a838-fde61211037f/volumes" Mar 19 17:02:46 crc kubenswrapper[4918]: I0319 17:02:46.998777 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7861f8ee-6769-49fa-b7ab-33d6ebb50eee","Type":"ContainerStarted","Data":"5255d8ce79f76b5cbfc2464afb1d9521f2e258dda49013e6b173551ed1c40aae"} Mar 19 17:02:46 crc kubenswrapper[4918]: I0319 17:02:46.999142 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7861f8ee-6769-49fa-b7ab-33d6ebb50eee","Type":"ContainerStarted","Data":"b58db43748df310e2b077fcecdcb25ba1b9b2fff34756a1fab9eebc4ffccb99f"} Mar 19 17:02:47 crc 
kubenswrapper[4918]: I0319 17:02:47.002957 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 17:02:47 crc kubenswrapper[4918]: I0319 17:02:47.003435 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eea51a4c-c5eb-414d-a46b-cc704ff34914","Type":"ContainerDied","Data":"9bb089b2474ddbe5dc51d1fa5beeaf5891a6ff83c69f82b6d3eea38f4b722428"} Mar 19 17:02:47 crc kubenswrapper[4918]: I0319 17:02:47.003470 4918 scope.go:117] "RemoveContainer" containerID="678155dca62f3af94cd47aa29260a2c250c8de02f4635f4eb3dd53f1edc5dc99" Mar 19 17:02:47 crc kubenswrapper[4918]: I0319 17:02:47.030644 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.030628386 podStartE2EDuration="2.030628386s" podCreationTimestamp="2026-03-19 17:02:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:02:47.014935627 +0000 UTC m=+1379.137134875" watchObservedRunningTime="2026-03-19 17:02:47.030628386 +0000 UTC m=+1379.152827634" Mar 19 17:02:47 crc kubenswrapper[4918]: I0319 17:02:47.064890 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 17:02:47 crc kubenswrapper[4918]: I0319 17:02:47.075022 4918 scope.go:117] "RemoveContainer" containerID="e4b1deb5f9ec17d88f8faeca43bebdc0f63a5c169dffa055025e565417423a6e" Mar 19 17:02:47 crc kubenswrapper[4918]: I0319 17:02:47.108598 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 19 17:02:47 crc kubenswrapper[4918]: I0319 17:02:47.121068 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 19 17:02:47 crc kubenswrapper[4918]: E0319 17:02:47.121554 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea51a4c-c5eb-414d-a46b-cc704ff34914" containerName="nova-api-api" 
Mar 19 17:02:47 crc kubenswrapper[4918]: I0319 17:02:47.121568 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea51a4c-c5eb-414d-a46b-cc704ff34914" containerName="nova-api-api" Mar 19 17:02:47 crc kubenswrapper[4918]: E0319 17:02:47.121614 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea51a4c-c5eb-414d-a46b-cc704ff34914" containerName="nova-api-log" Mar 19 17:02:47 crc kubenswrapper[4918]: I0319 17:02:47.121620 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea51a4c-c5eb-414d-a46b-cc704ff34914" containerName="nova-api-log" Mar 19 17:02:47 crc kubenswrapper[4918]: I0319 17:02:47.122236 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea51a4c-c5eb-414d-a46b-cc704ff34914" containerName="nova-api-log" Mar 19 17:02:47 crc kubenswrapper[4918]: I0319 17:02:47.122266 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea51a4c-c5eb-414d-a46b-cc704ff34914" containerName="nova-api-api" Mar 19 17:02:47 crc kubenswrapper[4918]: I0319 17:02:47.123943 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 17:02:47 crc kubenswrapper[4918]: I0319 17:02:47.126052 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 17:02:47 crc kubenswrapper[4918]: I0319 17:02:47.133417 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 17:02:47 crc kubenswrapper[4918]: I0319 17:02:47.149178 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9af7f82f-d552-4383-8c74-6cec9b797315-logs\") pod \"nova-api-0\" (UID: \"9af7f82f-d552-4383-8c74-6cec9b797315\") " pod="openstack/nova-api-0" Mar 19 17:02:47 crc kubenswrapper[4918]: I0319 17:02:47.149229 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8jjv\" (UniqueName: \"kubernetes.io/projected/9af7f82f-d552-4383-8c74-6cec9b797315-kube-api-access-s8jjv\") pod \"nova-api-0\" (UID: \"9af7f82f-d552-4383-8c74-6cec9b797315\") " pod="openstack/nova-api-0" Mar 19 17:02:47 crc kubenswrapper[4918]: I0319 17:02:47.149258 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9af7f82f-d552-4383-8c74-6cec9b797315-config-data\") pod \"nova-api-0\" (UID: \"9af7f82f-d552-4383-8c74-6cec9b797315\") " pod="openstack/nova-api-0" Mar 19 17:02:47 crc kubenswrapper[4918]: I0319 17:02:47.149567 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9af7f82f-d552-4383-8c74-6cec9b797315-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9af7f82f-d552-4383-8c74-6cec9b797315\") " pod="openstack/nova-api-0" Mar 19 17:02:47 crc kubenswrapper[4918]: I0319 17:02:47.250807 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9af7f82f-d552-4383-8c74-6cec9b797315-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9af7f82f-d552-4383-8c74-6cec9b797315\") " pod="openstack/nova-api-0" Mar 19 17:02:47 crc kubenswrapper[4918]: I0319 17:02:47.250945 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9af7f82f-d552-4383-8c74-6cec9b797315-logs\") pod \"nova-api-0\" (UID: \"9af7f82f-d552-4383-8c74-6cec9b797315\") " pod="openstack/nova-api-0" Mar 19 17:02:47 crc kubenswrapper[4918]: I0319 17:02:47.250991 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8jjv\" (UniqueName: \"kubernetes.io/projected/9af7f82f-d552-4383-8c74-6cec9b797315-kube-api-access-s8jjv\") pod \"nova-api-0\" (UID: \"9af7f82f-d552-4383-8c74-6cec9b797315\") " pod="openstack/nova-api-0" Mar 19 17:02:47 crc kubenswrapper[4918]: I0319 17:02:47.251019 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9af7f82f-d552-4383-8c74-6cec9b797315-config-data\") pod \"nova-api-0\" (UID: \"9af7f82f-d552-4383-8c74-6cec9b797315\") " pod="openstack/nova-api-0" Mar 19 17:02:47 crc kubenswrapper[4918]: I0319 17:02:47.251851 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9af7f82f-d552-4383-8c74-6cec9b797315-logs\") pod \"nova-api-0\" (UID: \"9af7f82f-d552-4383-8c74-6cec9b797315\") " pod="openstack/nova-api-0" Mar 19 17:02:47 crc kubenswrapper[4918]: I0319 17:02:47.257631 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9af7f82f-d552-4383-8c74-6cec9b797315-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9af7f82f-d552-4383-8c74-6cec9b797315\") " pod="openstack/nova-api-0" Mar 19 17:02:47 crc kubenswrapper[4918]: I0319 17:02:47.268855 
4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9af7f82f-d552-4383-8c74-6cec9b797315-config-data\") pod \"nova-api-0\" (UID: \"9af7f82f-d552-4383-8c74-6cec9b797315\") " pod="openstack/nova-api-0" Mar 19 17:02:47 crc kubenswrapper[4918]: I0319 17:02:47.276206 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8jjv\" (UniqueName: \"kubernetes.io/projected/9af7f82f-d552-4383-8c74-6cec9b797315-kube-api-access-s8jjv\") pod \"nova-api-0\" (UID: \"9af7f82f-d552-4383-8c74-6cec9b797315\") " pod="openstack/nova-api-0" Mar 19 17:02:47 crc kubenswrapper[4918]: I0319 17:02:47.498110 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 17:02:47 crc kubenswrapper[4918]: I0319 17:02:47.957002 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 17:02:47 crc kubenswrapper[4918]: W0319 17:02:47.960624 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9af7f82f_d552_4383_8c74_6cec9b797315.slice/crio-9be60290bba021a4de5e8aebdd028ee7f5d2d88a3e6b7bf959f32eff538fb2f7 WatchSource:0}: Error finding container 9be60290bba021a4de5e8aebdd028ee7f5d2d88a3e6b7bf959f32eff538fb2f7: Status 404 returned error can't find the container with id 9be60290bba021a4de5e8aebdd028ee7f5d2d88a3e6b7bf959f32eff538fb2f7 Mar 19 17:02:48 crc kubenswrapper[4918]: I0319 17:02:48.012270 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9af7f82f-d552-4383-8c74-6cec9b797315","Type":"ContainerStarted","Data":"9be60290bba021a4de5e8aebdd028ee7f5d2d88a3e6b7bf959f32eff538fb2f7"} Mar 19 17:02:48 crc kubenswrapper[4918]: I0319 17:02:48.622028 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eea51a4c-c5eb-414d-a46b-cc704ff34914" 
path="/var/lib/kubelet/pods/eea51a4c-c5eb-414d-a46b-cc704ff34914/volumes" Mar 19 17:02:49 crc kubenswrapper[4918]: I0319 17:02:49.026761 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9af7f82f-d552-4383-8c74-6cec9b797315","Type":"ContainerStarted","Data":"27763c918591e01faed23425a23c9b1e942f5f18c4510068a08d73ecc0774f45"} Mar 19 17:02:49 crc kubenswrapper[4918]: I0319 17:02:49.026815 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9af7f82f-d552-4383-8c74-6cec9b797315","Type":"ContainerStarted","Data":"49a3ec0b3ca439663fa4d2d5e229cb93e8709cc9ade0632d8221f6add41ebd9a"} Mar 19 17:02:49 crc kubenswrapper[4918]: I0319 17:02:49.045877 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.045857314 podStartE2EDuration="2.045857314s" podCreationTimestamp="2026-03-19 17:02:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:02:49.041486215 +0000 UTC m=+1381.163685463" watchObservedRunningTime="2026-03-19 17:02:49.045857314 +0000 UTC m=+1381.168056562" Mar 19 17:02:50 crc kubenswrapper[4918]: I0319 17:02:50.430342 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 19 17:02:53 crc kubenswrapper[4918]: I0319 17:02:53.126303 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 19 17:02:53 crc kubenswrapper[4918]: I0319 17:02:53.347312 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 19 17:02:53 crc kubenswrapper[4918]: I0319 17:02:53.402791 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 17:02:53 crc kubenswrapper[4918]: I0319 17:02:53.402854 4918 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 17:02:54 crc kubenswrapper[4918]: I0319 17:02:54.413718 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b6c864e7-f676-4c0d-894b-c31c175fccd2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.227:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 17:02:54 crc kubenswrapper[4918]: I0319 17:02:54.413725 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b6c864e7-f676-4c0d-894b-c31c175fccd2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.227:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 17:02:55 crc kubenswrapper[4918]: I0319 17:02:55.430324 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 19 17:02:55 crc kubenswrapper[4918]: I0319 17:02:55.463393 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 19 17:02:56 crc kubenswrapper[4918]: I0319 17:02:56.132204 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 19 17:02:57 crc kubenswrapper[4918]: I0319 17:02:57.498409 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 17:02:57 crc kubenswrapper[4918]: I0319 17:02:57.499465 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 17:02:58 crc kubenswrapper[4918]: I0319 17:02:58.212023 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 19 17:02:58 crc kubenswrapper[4918]: I0319 17:02:58.212086 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:02:58 crc kubenswrapper[4918]: I0319 17:02:58.212136 4918 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 17:02:58 crc kubenswrapper[4918]: I0319 17:02:58.213002 4918 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c06f60b9e3990852ac6fc7b59da3fe3cda8e2a2ae81b8e586f6da8fc956569f8"} pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 17:02:58 crc kubenswrapper[4918]: I0319 17:02:58.213080 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" containerID="cri-o://c06f60b9e3990852ac6fc7b59da3fe3cda8e2a2ae81b8e586f6da8fc956569f8" gracePeriod=600 Mar 19 17:02:58 crc kubenswrapper[4918]: I0319 17:02:58.581739 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9af7f82f-d552-4383-8c74-6cec9b797315" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.229:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 17:02:58 crc kubenswrapper[4918]: I0319 17:02:58.581778 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="9af7f82f-d552-4383-8c74-6cec9b797315" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.229:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 17:02:59 crc kubenswrapper[4918]: I0319 17:02:59.131278 4918 generic.go:334] "Generic (PLEG): container finished" podID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerID="c06f60b9e3990852ac6fc7b59da3fe3cda8e2a2ae81b8e586f6da8fc956569f8" exitCode=0 Mar 19 17:02:59 crc kubenswrapper[4918]: I0319 17:02:59.131329 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerDied","Data":"c06f60b9e3990852ac6fc7b59da3fe3cda8e2a2ae81b8e586f6da8fc956569f8"} Mar 19 17:02:59 crc kubenswrapper[4918]: I0319 17:02:59.131363 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerStarted","Data":"de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440"} Mar 19 17:02:59 crc kubenswrapper[4918]: I0319 17:02:59.131385 4918 scope.go:117] "RemoveContainer" containerID="d175bcf8fa8bff1bfb04d3a219eb7c4c6847a1adae22fbf62149bc4b8894f0f0" Mar 19 17:02:59 crc kubenswrapper[4918]: I0319 17:02:59.693010 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nzvrq"] Mar 19 17:02:59 crc kubenswrapper[4918]: I0319 17:02:59.695427 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nzvrq" Mar 19 17:02:59 crc kubenswrapper[4918]: I0319 17:02:59.704973 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nzvrq"] Mar 19 17:02:59 crc kubenswrapper[4918]: I0319 17:02:59.789057 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fa4107c-ea0f-4101-9fd7-e7914e076b81-catalog-content\") pod \"community-operators-nzvrq\" (UID: \"2fa4107c-ea0f-4101-9fd7-e7914e076b81\") " pod="openshift-marketplace/community-operators-nzvrq" Mar 19 17:02:59 crc kubenswrapper[4918]: I0319 17:02:59.789477 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzjf7\" (UniqueName: \"kubernetes.io/projected/2fa4107c-ea0f-4101-9fd7-e7914e076b81-kube-api-access-tzjf7\") pod \"community-operators-nzvrq\" (UID: \"2fa4107c-ea0f-4101-9fd7-e7914e076b81\") " pod="openshift-marketplace/community-operators-nzvrq" Mar 19 17:02:59 crc kubenswrapper[4918]: I0319 17:02:59.789829 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fa4107c-ea0f-4101-9fd7-e7914e076b81-utilities\") pod \"community-operators-nzvrq\" (UID: \"2fa4107c-ea0f-4101-9fd7-e7914e076b81\") " pod="openshift-marketplace/community-operators-nzvrq" Mar 19 17:02:59 crc kubenswrapper[4918]: I0319 17:02:59.892041 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fa4107c-ea0f-4101-9fd7-e7914e076b81-catalog-content\") pod \"community-operators-nzvrq\" (UID: \"2fa4107c-ea0f-4101-9fd7-e7914e076b81\") " pod="openshift-marketplace/community-operators-nzvrq" Mar 19 17:02:59 crc kubenswrapper[4918]: I0319 17:02:59.892146 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tzjf7\" (UniqueName: \"kubernetes.io/projected/2fa4107c-ea0f-4101-9fd7-e7914e076b81-kube-api-access-tzjf7\") pod \"community-operators-nzvrq\" (UID: \"2fa4107c-ea0f-4101-9fd7-e7914e076b81\") " pod="openshift-marketplace/community-operators-nzvrq" Mar 19 17:02:59 crc kubenswrapper[4918]: I0319 17:02:59.892242 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fa4107c-ea0f-4101-9fd7-e7914e076b81-utilities\") pod \"community-operators-nzvrq\" (UID: \"2fa4107c-ea0f-4101-9fd7-e7914e076b81\") " pod="openshift-marketplace/community-operators-nzvrq" Mar 19 17:02:59 crc kubenswrapper[4918]: I0319 17:02:59.892679 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fa4107c-ea0f-4101-9fd7-e7914e076b81-utilities\") pod \"community-operators-nzvrq\" (UID: \"2fa4107c-ea0f-4101-9fd7-e7914e076b81\") " pod="openshift-marketplace/community-operators-nzvrq" Mar 19 17:02:59 crc kubenswrapper[4918]: I0319 17:02:59.892897 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fa4107c-ea0f-4101-9fd7-e7914e076b81-catalog-content\") pod \"community-operators-nzvrq\" (UID: \"2fa4107c-ea0f-4101-9fd7-e7914e076b81\") " pod="openshift-marketplace/community-operators-nzvrq" Mar 19 17:02:59 crc kubenswrapper[4918]: I0319 17:02:59.923913 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzjf7\" (UniqueName: \"kubernetes.io/projected/2fa4107c-ea0f-4101-9fd7-e7914e076b81-kube-api-access-tzjf7\") pod \"community-operators-nzvrq\" (UID: \"2fa4107c-ea0f-4101-9fd7-e7914e076b81\") " pod="openshift-marketplace/community-operators-nzvrq" Mar 19 17:03:00 crc kubenswrapper[4918]: I0319 17:03:00.020440 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nzvrq" Mar 19 17:03:00 crc kubenswrapper[4918]: I0319 17:03:00.498877 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nzvrq"] Mar 19 17:03:00 crc kubenswrapper[4918]: I0319 17:03:00.781971 4918 scope.go:117] "RemoveContainer" containerID="fcbc594eafaba70bde98144b561c69eddfd531a5c6fba68f94194ebd03008415" Mar 19 17:03:00 crc kubenswrapper[4918]: I0319 17:03:00.891149 4918 scope.go:117] "RemoveContainer" containerID="0dd40ef54209db15302b074f3cc2825467077b5363dc9578c4d4d5af0929829c" Mar 19 17:03:01 crc kubenswrapper[4918]: I0319 17:03:01.170292 4918 generic.go:334] "Generic (PLEG): container finished" podID="2fa4107c-ea0f-4101-9fd7-e7914e076b81" containerID="99012c0586e191a0d633b73064d766e8df52bb49efa0c125f56d41818c65041a" exitCode=0 Mar 19 17:03:01 crc kubenswrapper[4918]: I0319 17:03:01.170370 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nzvrq" event={"ID":"2fa4107c-ea0f-4101-9fd7-e7914e076b81","Type":"ContainerDied","Data":"99012c0586e191a0d633b73064d766e8df52bb49efa0c125f56d41818c65041a"} Mar 19 17:03:01 crc kubenswrapper[4918]: I0319 17:03:01.170445 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nzvrq" event={"ID":"2fa4107c-ea0f-4101-9fd7-e7914e076b81","Type":"ContainerStarted","Data":"32d266ecf1af7e4db412ae1a35e80f65d7cc83883b9c81c2b6960c7d1eaf036f"} Mar 19 17:03:01 crc kubenswrapper[4918]: I0319 17:03:01.172960 4918 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 17:03:01 crc kubenswrapper[4918]: I0319 17:03:01.402998 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 17:03:01 crc kubenswrapper[4918]: I0319 17:03:01.403331 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-metadata-0" Mar 19 17:03:03 crc kubenswrapper[4918]: I0319 17:03:03.189903 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nzvrq" event={"ID":"2fa4107c-ea0f-4101-9fd7-e7914e076b81","Type":"ContainerStarted","Data":"d39b63308690044d37063ccd52a7f28f6d2bfb89a06e83ddef1cf73add51b1f0"} Mar 19 17:03:03 crc kubenswrapper[4918]: I0319 17:03:03.410385 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 19 17:03:03 crc kubenswrapper[4918]: I0319 17:03:03.413419 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 19 17:03:03 crc kubenswrapper[4918]: I0319 17:03:03.418840 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 19 17:03:04 crc kubenswrapper[4918]: I0319 17:03:04.201402 4918 generic.go:334] "Generic (PLEG): container finished" podID="2fa4107c-ea0f-4101-9fd7-e7914e076b81" containerID="d39b63308690044d37063ccd52a7f28f6d2bfb89a06e83ddef1cf73add51b1f0" exitCode=0 Mar 19 17:03:04 crc kubenswrapper[4918]: I0319 17:03:04.201511 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nzvrq" event={"ID":"2fa4107c-ea0f-4101-9fd7-e7914e076b81","Type":"ContainerDied","Data":"d39b63308690044d37063ccd52a7f28f6d2bfb89a06e83ddef1cf73add51b1f0"} Mar 19 17:03:04 crc kubenswrapper[4918]: I0319 17:03:04.212131 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 19 17:03:05 crc kubenswrapper[4918]: I0319 17:03:05.215482 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nzvrq" event={"ID":"2fa4107c-ea0f-4101-9fd7-e7914e076b81","Type":"ContainerStarted","Data":"19d69215e8a88e1bf62270e9b84caedf66e7d07a4d4d0ab6185264b8f160ca2f"} Mar 19 17:03:05 crc kubenswrapper[4918]: I0319 17:03:05.247030 
4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nzvrq" podStartSLOduration=2.487270731 podStartE2EDuration="6.247010951s" podCreationTimestamp="2026-03-19 17:02:59 +0000 UTC" firstStartedPulling="2026-03-19 17:03:01.172681297 +0000 UTC m=+1393.294880545" lastFinishedPulling="2026-03-19 17:03:04.932421517 +0000 UTC m=+1397.054620765" observedRunningTime="2026-03-19 17:03:05.236713569 +0000 UTC m=+1397.358912817" watchObservedRunningTime="2026-03-19 17:03:05.247010951 +0000 UTC m=+1397.369210199" Mar 19 17:03:05 crc kubenswrapper[4918]: I0319 17:03:05.499076 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 17:03:05 crc kubenswrapper[4918]: I0319 17:03:05.499144 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 17:03:06 crc kubenswrapper[4918]: I0319 17:03:06.223329 4918 generic.go:334] "Generic (PLEG): container finished" podID="b4719eb1-a5e8-4e0e-a321-77cea020b1e0" containerID="e14e932b8a78564d08d06cb35443b0030c8fe1b01cbe2edc73b244766fee9b39" exitCode=137 Mar 19 17:03:06 crc kubenswrapper[4918]: I0319 17:03:06.223421 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b4719eb1-a5e8-4e0e-a321-77cea020b1e0","Type":"ContainerDied","Data":"e14e932b8a78564d08d06cb35443b0030c8fe1b01cbe2edc73b244766fee9b39"} Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.145667 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.235392 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b4719eb1-a5e8-4e0e-a321-77cea020b1e0","Type":"ContainerDied","Data":"f98d109253c1ba79820e8190a2914000e0d58d279c18fd41a609b92f495e6876"} Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.235446 4918 scope.go:117] "RemoveContainer" containerID="e14e932b8a78564d08d06cb35443b0030c8fe1b01cbe2edc73b244766fee9b39" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.235462 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.276977 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4719eb1-a5e8-4e0e-a321-77cea020b1e0-combined-ca-bundle\") pod \"b4719eb1-a5e8-4e0e-a321-77cea020b1e0\" (UID: \"b4719eb1-a5e8-4e0e-a321-77cea020b1e0\") " Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.277148 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4719eb1-a5e8-4e0e-a321-77cea020b1e0-config-data\") pod \"b4719eb1-a5e8-4e0e-a321-77cea020b1e0\" (UID: \"b4719eb1-a5e8-4e0e-a321-77cea020b1e0\") " Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.277231 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc95b\" (UniqueName: \"kubernetes.io/projected/b4719eb1-a5e8-4e0e-a321-77cea020b1e0-kube-api-access-cc95b\") pod \"b4719eb1-a5e8-4e0e-a321-77cea020b1e0\" (UID: \"b4719eb1-a5e8-4e0e-a321-77cea020b1e0\") " Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.304818 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b4719eb1-a5e8-4e0e-a321-77cea020b1e0-kube-api-access-cc95b" (OuterVolumeSpecName: "kube-api-access-cc95b") pod "b4719eb1-a5e8-4e0e-a321-77cea020b1e0" (UID: "b4719eb1-a5e8-4e0e-a321-77cea020b1e0"). InnerVolumeSpecName "kube-api-access-cc95b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.379979 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4719eb1-a5e8-4e0e-a321-77cea020b1e0-config-data" (OuterVolumeSpecName: "config-data") pod "b4719eb1-a5e8-4e0e-a321-77cea020b1e0" (UID: "b4719eb1-a5e8-4e0e-a321-77cea020b1e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.381412 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc95b\" (UniqueName: \"kubernetes.io/projected/b4719eb1-a5e8-4e0e-a321-77cea020b1e0-kube-api-access-cc95b\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.381441 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4719eb1-a5e8-4e0e-a321-77cea020b1e0-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.386294 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4719eb1-a5e8-4e0e-a321-77cea020b1e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4719eb1-a5e8-4e0e-a321-77cea020b1e0" (UID: "b4719eb1-a5e8-4e0e-a321-77cea020b1e0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.484773 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4719eb1-a5e8-4e0e-a321-77cea020b1e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.503760 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.505380 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.507798 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.569562 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.583256 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.645573 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 17:03:07 crc kubenswrapper[4918]: E0319 17:03:07.646099 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4719eb1-a5e8-4e0e-a321-77cea020b1e0" containerName="nova-cell1-novncproxy-novncproxy" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.646122 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4719eb1-a5e8-4e0e-a321-77cea020b1e0" containerName="nova-cell1-novncproxy-novncproxy" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.646421 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4719eb1-a5e8-4e0e-a321-77cea020b1e0" containerName="nova-cell1-novncproxy-novncproxy" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 
17:03:07.647208 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.651043 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.651183 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.651420 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.656891 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.806090 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8c29b03-6e93-446c-911f-af6e2e3ca36b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f8c29b03-6e93-446c-911f-af6e2e3ca36b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.806832 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzswq\" (UniqueName: \"kubernetes.io/projected/f8c29b03-6e93-446c-911f-af6e2e3ca36b-kube-api-access-fzswq\") pod \"nova-cell1-novncproxy-0\" (UID: \"f8c29b03-6e93-446c-911f-af6e2e3ca36b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.806927 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8c29b03-6e93-446c-911f-af6e2e3ca36b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f8c29b03-6e93-446c-911f-af6e2e3ca36b\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.807064 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8c29b03-6e93-446c-911f-af6e2e3ca36b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f8c29b03-6e93-446c-911f-af6e2e3ca36b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.807156 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8c29b03-6e93-446c-911f-af6e2e3ca36b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f8c29b03-6e93-446c-911f-af6e2e3ca36b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.909566 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzswq\" (UniqueName: \"kubernetes.io/projected/f8c29b03-6e93-446c-911f-af6e2e3ca36b-kube-api-access-fzswq\") pod \"nova-cell1-novncproxy-0\" (UID: \"f8c29b03-6e93-446c-911f-af6e2e3ca36b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.909622 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8c29b03-6e93-446c-911f-af6e2e3ca36b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f8c29b03-6e93-446c-911f-af6e2e3ca36b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.909696 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8c29b03-6e93-446c-911f-af6e2e3ca36b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f8c29b03-6e93-446c-911f-af6e2e3ca36b\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.909733 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8c29b03-6e93-446c-911f-af6e2e3ca36b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f8c29b03-6e93-446c-911f-af6e2e3ca36b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.909859 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8c29b03-6e93-446c-911f-af6e2e3ca36b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f8c29b03-6e93-446c-911f-af6e2e3ca36b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.914059 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8c29b03-6e93-446c-911f-af6e2e3ca36b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f8c29b03-6e93-446c-911f-af6e2e3ca36b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.914681 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8c29b03-6e93-446c-911f-af6e2e3ca36b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f8c29b03-6e93-446c-911f-af6e2e3ca36b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.914767 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8c29b03-6e93-446c-911f-af6e2e3ca36b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f8c29b03-6e93-446c-911f-af6e2e3ca36b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.916114 4918 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8c29b03-6e93-446c-911f-af6e2e3ca36b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f8c29b03-6e93-446c-911f-af6e2e3ca36b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.927602 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzswq\" (UniqueName: \"kubernetes.io/projected/f8c29b03-6e93-446c-911f-af6e2e3ca36b-kube-api-access-fzswq\") pod \"nova-cell1-novncproxy-0\" (UID: \"f8c29b03-6e93-446c-911f-af6e2e3ca36b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:03:07 crc kubenswrapper[4918]: I0319 17:03:07.968694 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:03:08 crc kubenswrapper[4918]: I0319 17:03:08.266040 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 19 17:03:08 crc kubenswrapper[4918]: I0319 17:03:08.445971 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-8xw6t"] Mar 19 17:03:08 crc kubenswrapper[4918]: I0319 17:03:08.447946 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" Mar 19 17:03:08 crc kubenswrapper[4918]: I0319 17:03:08.487699 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-8xw6t"] Mar 19 17:03:08 crc kubenswrapper[4918]: I0319 17:03:08.555782 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 17:03:08 crc kubenswrapper[4918]: I0319 17:03:08.622786 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4719eb1-a5e8-4e0e-a321-77cea020b1e0" path="/var/lib/kubelet/pods/b4719eb1-a5e8-4e0e-a321-77cea020b1e0/volumes" Mar 19 17:03:08 crc kubenswrapper[4918]: I0319 17:03:08.628249 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-8xw6t\" (UID: \"bb03378b-77b3-44c8-97eb-558868797b23\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" Mar 19 17:03:08 crc kubenswrapper[4918]: I0319 17:03:08.628292 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-8xw6t\" (UID: \"bb03378b-77b3-44c8-97eb-558868797b23\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" Mar 19 17:03:08 crc kubenswrapper[4918]: I0319 17:03:08.628333 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftmdv\" (UniqueName: \"kubernetes.io/projected/bb03378b-77b3-44c8-97eb-558868797b23-kube-api-access-ftmdv\") pod \"dnsmasq-dns-5fd9b586ff-8xw6t\" (UID: \"bb03378b-77b3-44c8-97eb-558868797b23\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" Mar 19 17:03:08 crc kubenswrapper[4918]: I0319 17:03:08.628369 4918 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-config\") pod \"dnsmasq-dns-5fd9b586ff-8xw6t\" (UID: \"bb03378b-77b3-44c8-97eb-558868797b23\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" Mar 19 17:03:08 crc kubenswrapper[4918]: I0319 17:03:08.628440 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-8xw6t\" (UID: \"bb03378b-77b3-44c8-97eb-558868797b23\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" Mar 19 17:03:08 crc kubenswrapper[4918]: I0319 17:03:08.628482 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-8xw6t\" (UID: \"bb03378b-77b3-44c8-97eb-558868797b23\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" Mar 19 17:03:08 crc kubenswrapper[4918]: I0319 17:03:08.730184 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-8xw6t\" (UID: \"bb03378b-77b3-44c8-97eb-558868797b23\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" Mar 19 17:03:08 crc kubenswrapper[4918]: I0319 17:03:08.730242 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-8xw6t\" (UID: \"bb03378b-77b3-44c8-97eb-558868797b23\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" Mar 19 17:03:08 crc kubenswrapper[4918]: I0319 17:03:08.730290 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ftmdv\" (UniqueName: \"kubernetes.io/projected/bb03378b-77b3-44c8-97eb-558868797b23-kube-api-access-ftmdv\") pod \"dnsmasq-dns-5fd9b586ff-8xw6t\" (UID: \"bb03378b-77b3-44c8-97eb-558868797b23\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" Mar 19 17:03:08 crc kubenswrapper[4918]: I0319 17:03:08.730338 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-config\") pod \"dnsmasq-dns-5fd9b586ff-8xw6t\" (UID: \"bb03378b-77b3-44c8-97eb-558868797b23\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" Mar 19 17:03:08 crc kubenswrapper[4918]: I0319 17:03:08.730765 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-8xw6t\" (UID: \"bb03378b-77b3-44c8-97eb-558868797b23\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" Mar 19 17:03:08 crc kubenswrapper[4918]: I0319 17:03:08.730820 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-8xw6t\" (UID: \"bb03378b-77b3-44c8-97eb-558868797b23\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" Mar 19 17:03:08 crc kubenswrapper[4918]: I0319 17:03:08.731197 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-config\") pod \"dnsmasq-dns-5fd9b586ff-8xw6t\" (UID: \"bb03378b-77b3-44c8-97eb-558868797b23\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" Mar 19 17:03:08 crc kubenswrapper[4918]: I0319 17:03:08.731225 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-8xw6t\" (UID: \"bb03378b-77b3-44c8-97eb-558868797b23\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" Mar 19 17:03:08 crc kubenswrapper[4918]: I0319 17:03:08.731852 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-8xw6t\" (UID: \"bb03378b-77b3-44c8-97eb-558868797b23\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" Mar 19 17:03:08 crc kubenswrapper[4918]: I0319 17:03:08.732705 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-8xw6t\" (UID: \"bb03378b-77b3-44c8-97eb-558868797b23\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" Mar 19 17:03:08 crc kubenswrapper[4918]: I0319 17:03:08.732789 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-8xw6t\" (UID: \"bb03378b-77b3-44c8-97eb-558868797b23\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" Mar 19 17:03:08 crc kubenswrapper[4918]: I0319 17:03:08.747459 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftmdv\" (UniqueName: \"kubernetes.io/projected/bb03378b-77b3-44c8-97eb-558868797b23-kube-api-access-ftmdv\") pod \"dnsmasq-dns-5fd9b586ff-8xw6t\" (UID: \"bb03378b-77b3-44c8-97eb-558868797b23\") " pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" Mar 19 17:03:08 crc kubenswrapper[4918]: I0319 17:03:08.772149 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" Mar 19 17:03:09 crc kubenswrapper[4918]: I0319 17:03:09.270285 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-8xw6t"] Mar 19 17:03:09 crc kubenswrapper[4918]: I0319 17:03:09.276649 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f8c29b03-6e93-446c-911f-af6e2e3ca36b","Type":"ContainerStarted","Data":"0c575c759efb61eebe9df1548b6b4ac5cf4230eaaccc55e1f6b6f91193e11144"} Mar 19 17:03:09 crc kubenswrapper[4918]: I0319 17:03:09.276686 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f8c29b03-6e93-446c-911f-af6e2e3ca36b","Type":"ContainerStarted","Data":"fe4983761fb67a8b0eb71bba6ab86450c5c5038e83159aa631a6fdd9cc05b568"} Mar 19 17:03:09 crc kubenswrapper[4918]: I0319 17:03:09.293325 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.293307888 podStartE2EDuration="2.293307888s" podCreationTimestamp="2026-03-19 17:03:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:03:09.289213746 +0000 UTC m=+1401.411412994" watchObservedRunningTime="2026-03-19 17:03:09.293307888 +0000 UTC m=+1401.415507136" Mar 19 17:03:10 crc kubenswrapper[4918]: I0319 17:03:10.021544 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nzvrq" Mar 19 17:03:10 crc kubenswrapper[4918]: I0319 17:03:10.021844 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nzvrq" Mar 19 17:03:10 crc kubenswrapper[4918]: I0319 17:03:10.080606 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nzvrq" Mar 19 
17:03:10 crc kubenswrapper[4918]: I0319 17:03:10.283159 4918 generic.go:334] "Generic (PLEG): container finished" podID="bb03378b-77b3-44c8-97eb-558868797b23" containerID="a54debd2d584f5dd7a64dbe4333b64015d3eb45944209d5ab905d10610eb1ef0" exitCode=0 Mar 19 17:03:10 crc kubenswrapper[4918]: I0319 17:03:10.283464 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" event={"ID":"bb03378b-77b3-44c8-97eb-558868797b23","Type":"ContainerDied","Data":"a54debd2d584f5dd7a64dbe4333b64015d3eb45944209d5ab905d10610eb1ef0"} Mar 19 17:03:10 crc kubenswrapper[4918]: I0319 17:03:10.283584 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" event={"ID":"bb03378b-77b3-44c8-97eb-558868797b23","Type":"ContainerStarted","Data":"1a288324b81aa1645f5a582b075c14b40c894adf3cfec812d0fe0cd7f25170ed"} Mar 19 17:03:10 crc kubenswrapper[4918]: I0319 17:03:10.367475 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nzvrq" Mar 19 17:03:10 crc kubenswrapper[4918]: I0319 17:03:10.458784 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nzvrq"] Mar 19 17:03:11 crc kubenswrapper[4918]: I0319 17:03:11.129849 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:03:11 crc kubenswrapper[4918]: I0319 17:03:11.130385 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="355bfabe-8824-4ee1-9bf5-8c808a15fa70" containerName="ceilometer-central-agent" containerID="cri-o://2332da299a6016bc1e4e3225394e23c2e0a60415941caa5b25083eaaa19ba94a" gracePeriod=30 Mar 19 17:03:11 crc kubenswrapper[4918]: I0319 17:03:11.130448 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="355bfabe-8824-4ee1-9bf5-8c808a15fa70" containerName="sg-core" 
containerID="cri-o://dcc05602faa287a2f356b257bcdfb941489ebba134aa714375178be7494c2e40" gracePeriod=30 Mar 19 17:03:11 crc kubenswrapper[4918]: I0319 17:03:11.130490 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="355bfabe-8824-4ee1-9bf5-8c808a15fa70" containerName="ceilometer-notification-agent" containerID="cri-o://dc17de0ca55333c2eb30c00cdbb4fe25b567faf113a8e1b2b604547e126de418" gracePeriod=30 Mar 19 17:03:11 crc kubenswrapper[4918]: I0319 17:03:11.130491 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="355bfabe-8824-4ee1-9bf5-8c808a15fa70" containerName="proxy-httpd" containerID="cri-o://a13c72570122bbc97a9092eaa5200338f30c37b9c161c6bd8ed94e2986bf37bd" gracePeriod=30 Mar 19 17:03:11 crc kubenswrapper[4918]: I0319 17:03:11.294579 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" event={"ID":"bb03378b-77b3-44c8-97eb-558868797b23","Type":"ContainerStarted","Data":"1416871aa3f03c85b78ec34a93f66e562966e8e95eb12cd7a99f7c5d4afb93dc"} Mar 19 17:03:11 crc kubenswrapper[4918]: I0319 17:03:11.294716 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" Mar 19 17:03:11 crc kubenswrapper[4918]: I0319 17:03:11.297695 4918 generic.go:334] "Generic (PLEG): container finished" podID="355bfabe-8824-4ee1-9bf5-8c808a15fa70" containerID="dcc05602faa287a2f356b257bcdfb941489ebba134aa714375178be7494c2e40" exitCode=2 Mar 19 17:03:11 crc kubenswrapper[4918]: I0319 17:03:11.297755 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"355bfabe-8824-4ee1-9bf5-8c808a15fa70","Type":"ContainerDied","Data":"dcc05602faa287a2f356b257bcdfb941489ebba134aa714375178be7494c2e40"} Mar 19 17:03:11 crc kubenswrapper[4918]: I0319 17:03:11.370631 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" podStartSLOduration=3.370608272 podStartE2EDuration="3.370608272s" podCreationTimestamp="2026-03-19 17:03:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:03:11.323390133 +0000 UTC m=+1403.445589381" watchObservedRunningTime="2026-03-19 17:03:11.370608272 +0000 UTC m=+1403.492807520" Mar 19 17:03:11 crc kubenswrapper[4918]: I0319 17:03:11.373408 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 17:03:11 crc kubenswrapper[4918]: I0319 17:03:11.373615 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9af7f82f-d552-4383-8c74-6cec9b797315" containerName="nova-api-log" containerID="cri-o://49a3ec0b3ca439663fa4d2d5e229cb93e8709cc9ade0632d8221f6add41ebd9a" gracePeriod=30 Mar 19 17:03:11 crc kubenswrapper[4918]: I0319 17:03:11.373656 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9af7f82f-d552-4383-8c74-6cec9b797315" containerName="nova-api-api" containerID="cri-o://27763c918591e01faed23425a23c9b1e942f5f18c4510068a08d73ecc0774f45" gracePeriod=30 Mar 19 17:03:12 crc kubenswrapper[4918]: I0319 17:03:12.312851 4918 generic.go:334] "Generic (PLEG): container finished" podID="355bfabe-8824-4ee1-9bf5-8c808a15fa70" containerID="a13c72570122bbc97a9092eaa5200338f30c37b9c161c6bd8ed94e2986bf37bd" exitCode=0 Mar 19 17:03:12 crc kubenswrapper[4918]: I0319 17:03:12.313096 4918 generic.go:334] "Generic (PLEG): container finished" podID="355bfabe-8824-4ee1-9bf5-8c808a15fa70" containerID="dc17de0ca55333c2eb30c00cdbb4fe25b567faf113a8e1b2b604547e126de418" exitCode=0 Mar 19 17:03:12 crc kubenswrapper[4918]: I0319 17:03:12.313105 4918 generic.go:334] "Generic (PLEG): container finished" podID="355bfabe-8824-4ee1-9bf5-8c808a15fa70" 
containerID="2332da299a6016bc1e4e3225394e23c2e0a60415941caa5b25083eaaa19ba94a" exitCode=0 Mar 19 17:03:12 crc kubenswrapper[4918]: I0319 17:03:12.313163 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"355bfabe-8824-4ee1-9bf5-8c808a15fa70","Type":"ContainerDied","Data":"a13c72570122bbc97a9092eaa5200338f30c37b9c161c6bd8ed94e2986bf37bd"} Mar 19 17:03:12 crc kubenswrapper[4918]: I0319 17:03:12.313189 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"355bfabe-8824-4ee1-9bf5-8c808a15fa70","Type":"ContainerDied","Data":"dc17de0ca55333c2eb30c00cdbb4fe25b567faf113a8e1b2b604547e126de418"} Mar 19 17:03:12 crc kubenswrapper[4918]: I0319 17:03:12.313199 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"355bfabe-8824-4ee1-9bf5-8c808a15fa70","Type":"ContainerDied","Data":"2332da299a6016bc1e4e3225394e23c2e0a60415941caa5b25083eaaa19ba94a"} Mar 19 17:03:12 crc kubenswrapper[4918]: I0319 17:03:12.315721 4918 generic.go:334] "Generic (PLEG): container finished" podID="9af7f82f-d552-4383-8c74-6cec9b797315" containerID="49a3ec0b3ca439663fa4d2d5e229cb93e8709cc9ade0632d8221f6add41ebd9a" exitCode=143 Mar 19 17:03:12 crc kubenswrapper[4918]: I0319 17:03:12.315812 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9af7f82f-d552-4383-8c74-6cec9b797315","Type":"ContainerDied","Data":"49a3ec0b3ca439663fa4d2d5e229cb93e8709cc9ade0632d8221f6add41ebd9a"} Mar 19 17:03:12 crc kubenswrapper[4918]: I0319 17:03:12.316617 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nzvrq" podUID="2fa4107c-ea0f-4101-9fd7-e7914e076b81" containerName="registry-server" containerID="cri-o://19d69215e8a88e1bf62270e9b84caedf66e7d07a4d4d0ab6185264b8f160ca2f" gracePeriod=2 Mar 19 17:03:12 crc kubenswrapper[4918]: I0319 17:03:12.680038 4918 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:03:12 crc kubenswrapper[4918]: I0319 17:03:12.847117 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-combined-ca-bundle\") pod \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " Mar 19 17:03:12 crc kubenswrapper[4918]: I0319 17:03:12.847209 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-sg-core-conf-yaml\") pod \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " Mar 19 17:03:12 crc kubenswrapper[4918]: I0319 17:03:12.847235 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w79d5\" (UniqueName: \"kubernetes.io/projected/355bfabe-8824-4ee1-9bf5-8c808a15fa70-kube-api-access-w79d5\") pod \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " Mar 19 17:03:12 crc kubenswrapper[4918]: I0319 17:03:12.847410 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/355bfabe-8824-4ee1-9bf5-8c808a15fa70-run-httpd\") pod \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " Mar 19 17:03:12 crc kubenswrapper[4918]: I0319 17:03:12.847441 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-scripts\") pod \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " Mar 19 17:03:12 crc kubenswrapper[4918]: I0319 17:03:12.847465 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-ceilometer-tls-certs\") pod \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " Mar 19 17:03:12 crc kubenswrapper[4918]: I0319 17:03:12.847544 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-config-data\") pod \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " Mar 19 17:03:12 crc kubenswrapper[4918]: I0319 17:03:12.847579 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/355bfabe-8824-4ee1-9bf5-8c808a15fa70-log-httpd\") pod \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\" (UID: \"355bfabe-8824-4ee1-9bf5-8c808a15fa70\") " Mar 19 17:03:12 crc kubenswrapper[4918]: I0319 17:03:12.848479 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/355bfabe-8824-4ee1-9bf5-8c808a15fa70-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "355bfabe-8824-4ee1-9bf5-8c808a15fa70" (UID: "355bfabe-8824-4ee1-9bf5-8c808a15fa70"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:03:12 crc kubenswrapper[4918]: I0319 17:03:12.850322 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/355bfabe-8824-4ee1-9bf5-8c808a15fa70-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "355bfabe-8824-4ee1-9bf5-8c808a15fa70" (UID: "355bfabe-8824-4ee1-9bf5-8c808a15fa70"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:03:12 crc kubenswrapper[4918]: I0319 17:03:12.861694 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-scripts" (OuterVolumeSpecName: "scripts") pod "355bfabe-8824-4ee1-9bf5-8c808a15fa70" (UID: "355bfabe-8824-4ee1-9bf5-8c808a15fa70"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:03:12 crc kubenswrapper[4918]: I0319 17:03:12.909939 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/355bfabe-8824-4ee1-9bf5-8c808a15fa70-kube-api-access-w79d5" (OuterVolumeSpecName: "kube-api-access-w79d5") pod "355bfabe-8824-4ee1-9bf5-8c808a15fa70" (UID: "355bfabe-8824-4ee1-9bf5-8c808a15fa70"). InnerVolumeSpecName "kube-api-access-w79d5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:03:12 crc kubenswrapper[4918]: I0319 17:03:12.953679 4918 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/355bfabe-8824-4ee1-9bf5-8c808a15fa70-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:12 crc kubenswrapper[4918]: I0319 17:03:12.953934 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:12 crc kubenswrapper[4918]: I0319 17:03:12.953945 4918 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/355bfabe-8824-4ee1-9bf5-8c808a15fa70-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:12 crc kubenswrapper[4918]: I0319 17:03:12.953954 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w79d5\" (UniqueName: \"kubernetes.io/projected/355bfabe-8824-4ee1-9bf5-8c808a15fa70-kube-api-access-w79d5\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:12 
crc kubenswrapper[4918]: I0319 17:03:12.966820 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "355bfabe-8824-4ee1-9bf5-8c808a15fa70" (UID: "355bfabe-8824-4ee1-9bf5-8c808a15fa70"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:03:12 crc kubenswrapper[4918]: I0319 17:03:12.969663 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.056159 4918 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.089634 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-config-data" (OuterVolumeSpecName: "config-data") pod "355bfabe-8824-4ee1-9bf5-8c808a15fa70" (UID: "355bfabe-8824-4ee1-9bf5-8c808a15fa70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.093600 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "355bfabe-8824-4ee1-9bf5-8c808a15fa70" (UID: "355bfabe-8824-4ee1-9bf5-8c808a15fa70"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.120457 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "355bfabe-8824-4ee1-9bf5-8c808a15fa70" (UID: "355bfabe-8824-4ee1-9bf5-8c808a15fa70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.160370 4918 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.160396 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.160407 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355bfabe-8824-4ee1-9bf5-8c808a15fa70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.280010 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nzvrq" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.347728 4918 generic.go:334] "Generic (PLEG): container finished" podID="2fa4107c-ea0f-4101-9fd7-e7914e076b81" containerID="19d69215e8a88e1bf62270e9b84caedf66e7d07a4d4d0ab6185264b8f160ca2f" exitCode=0 Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.347785 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nzvrq" event={"ID":"2fa4107c-ea0f-4101-9fd7-e7914e076b81","Type":"ContainerDied","Data":"19d69215e8a88e1bf62270e9b84caedf66e7d07a4d4d0ab6185264b8f160ca2f"} Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.347817 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nzvrq" event={"ID":"2fa4107c-ea0f-4101-9fd7-e7914e076b81","Type":"ContainerDied","Data":"32d266ecf1af7e4db412ae1a35e80f65d7cc83883b9c81c2b6960c7d1eaf036f"} Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.347834 4918 scope.go:117] "RemoveContainer" containerID="19d69215e8a88e1bf62270e9b84caedf66e7d07a4d4d0ab6185264b8f160ca2f" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.347953 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nzvrq" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.357512 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"355bfabe-8824-4ee1-9bf5-8c808a15fa70","Type":"ContainerDied","Data":"9d5501e5f7e95842c8cb93da7e7c18df2c93aae8316c74f1ce4194d4b1cc7b00"} Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.357634 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.382744 4918 scope.go:117] "RemoveContainer" containerID="d39b63308690044d37063ccd52a7f28f6d2bfb89a06e83ddef1cf73add51b1f0" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.393756 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.414550 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.420189 4918 scope.go:117] "RemoveContainer" containerID="99012c0586e191a0d633b73064d766e8df52bb49efa0c125f56d41818c65041a" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.441677 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.469503 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fa4107c-ea0f-4101-9fd7-e7914e076b81-catalog-content\") pod \"2fa4107c-ea0f-4101-9fd7-e7914e076b81\" (UID: \"2fa4107c-ea0f-4101-9fd7-e7914e076b81\") " Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.469709 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzjf7\" (UniqueName: \"kubernetes.io/projected/2fa4107c-ea0f-4101-9fd7-e7914e076b81-kube-api-access-tzjf7\") pod \"2fa4107c-ea0f-4101-9fd7-e7914e076b81\" (UID: \"2fa4107c-ea0f-4101-9fd7-e7914e076b81\") " Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.469737 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fa4107c-ea0f-4101-9fd7-e7914e076b81-utilities\") pod \"2fa4107c-ea0f-4101-9fd7-e7914e076b81\" (UID: \"2fa4107c-ea0f-4101-9fd7-e7914e076b81\") " Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.472733 4918 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fa4107c-ea0f-4101-9fd7-e7914e076b81-utilities" (OuterVolumeSpecName: "utilities") pod "2fa4107c-ea0f-4101-9fd7-e7914e076b81" (UID: "2fa4107c-ea0f-4101-9fd7-e7914e076b81"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:03:13 crc kubenswrapper[4918]: E0319 17:03:13.486914 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="355bfabe-8824-4ee1-9bf5-8c808a15fa70" containerName="proxy-httpd" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.486941 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="355bfabe-8824-4ee1-9bf5-8c808a15fa70" containerName="proxy-httpd" Mar 19 17:03:13 crc kubenswrapper[4918]: E0319 17:03:13.486986 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="355bfabe-8824-4ee1-9bf5-8c808a15fa70" containerName="ceilometer-notification-agent" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.486993 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="355bfabe-8824-4ee1-9bf5-8c808a15fa70" containerName="ceilometer-notification-agent" Mar 19 17:03:13 crc kubenswrapper[4918]: E0319 17:03:13.487013 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="355bfabe-8824-4ee1-9bf5-8c808a15fa70" containerName="sg-core" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.487019 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="355bfabe-8824-4ee1-9bf5-8c808a15fa70" containerName="sg-core" Mar 19 17:03:13 crc kubenswrapper[4918]: E0319 17:03:13.487044 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="355bfabe-8824-4ee1-9bf5-8c808a15fa70" containerName="ceilometer-central-agent" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.487050 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="355bfabe-8824-4ee1-9bf5-8c808a15fa70" containerName="ceilometer-central-agent" Mar 19 17:03:13 crc 
kubenswrapper[4918]: E0319 17:03:13.487066 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fa4107c-ea0f-4101-9fd7-e7914e076b81" containerName="extract-content" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.487072 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fa4107c-ea0f-4101-9fd7-e7914e076b81" containerName="extract-content" Mar 19 17:03:13 crc kubenswrapper[4918]: E0319 17:03:13.487088 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fa4107c-ea0f-4101-9fd7-e7914e076b81" containerName="extract-utilities" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.487095 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fa4107c-ea0f-4101-9fd7-e7914e076b81" containerName="extract-utilities" Mar 19 17:03:13 crc kubenswrapper[4918]: E0319 17:03:13.487107 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fa4107c-ea0f-4101-9fd7-e7914e076b81" containerName="registry-server" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.487113 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fa4107c-ea0f-4101-9fd7-e7914e076b81" containerName="registry-server" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.487447 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="355bfabe-8824-4ee1-9bf5-8c808a15fa70" containerName="ceilometer-central-agent" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.487459 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="355bfabe-8824-4ee1-9bf5-8c808a15fa70" containerName="sg-core" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.487480 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fa4107c-ea0f-4101-9fd7-e7914e076b81" containerName="registry-server" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.487499 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="355bfabe-8824-4ee1-9bf5-8c808a15fa70" containerName="proxy-httpd" Mar 19 17:03:13 
crc kubenswrapper[4918]: I0319 17:03:13.489404 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="355bfabe-8824-4ee1-9bf5-8c808a15fa70" containerName="ceilometer-notification-agent" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.492315 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.492421 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.503966 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fa4107c-ea0f-4101-9fd7-e7914e076b81-kube-api-access-tzjf7" (OuterVolumeSpecName: "kube-api-access-tzjf7") pod "2fa4107c-ea0f-4101-9fd7-e7914e076b81" (UID: "2fa4107c-ea0f-4101-9fd7-e7914e076b81"). InnerVolumeSpecName "kube-api-access-tzjf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.504457 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.504717 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.504900 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.520742 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fa4107c-ea0f-4101-9fd7-e7914e076b81-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2fa4107c-ea0f-4101-9fd7-e7914e076b81" (UID: "2fa4107c-ea0f-4101-9fd7-e7914e076b81"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.555794 4918 scope.go:117] "RemoveContainer" containerID="19d69215e8a88e1bf62270e9b84caedf66e7d07a4d4d0ab6185264b8f160ca2f" Mar 19 17:03:13 crc kubenswrapper[4918]: E0319 17:03:13.556163 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19d69215e8a88e1bf62270e9b84caedf66e7d07a4d4d0ab6185264b8f160ca2f\": container with ID starting with 19d69215e8a88e1bf62270e9b84caedf66e7d07a4d4d0ab6185264b8f160ca2f not found: ID does not exist" containerID="19d69215e8a88e1bf62270e9b84caedf66e7d07a4d4d0ab6185264b8f160ca2f" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.556190 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19d69215e8a88e1bf62270e9b84caedf66e7d07a4d4d0ab6185264b8f160ca2f"} err="failed to get container status \"19d69215e8a88e1bf62270e9b84caedf66e7d07a4d4d0ab6185264b8f160ca2f\": rpc error: code = NotFound desc = could not find container \"19d69215e8a88e1bf62270e9b84caedf66e7d07a4d4d0ab6185264b8f160ca2f\": container with ID starting with 19d69215e8a88e1bf62270e9b84caedf66e7d07a4d4d0ab6185264b8f160ca2f not found: ID does not exist" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.556211 4918 scope.go:117] "RemoveContainer" containerID="d39b63308690044d37063ccd52a7f28f6d2bfb89a06e83ddef1cf73add51b1f0" Mar 19 17:03:13 crc kubenswrapper[4918]: E0319 17:03:13.556701 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d39b63308690044d37063ccd52a7f28f6d2bfb89a06e83ddef1cf73add51b1f0\": container with ID starting with d39b63308690044d37063ccd52a7f28f6d2bfb89a06e83ddef1cf73add51b1f0 not found: ID does not exist" containerID="d39b63308690044d37063ccd52a7f28f6d2bfb89a06e83ddef1cf73add51b1f0" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.556815 
4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d39b63308690044d37063ccd52a7f28f6d2bfb89a06e83ddef1cf73add51b1f0"} err="failed to get container status \"d39b63308690044d37063ccd52a7f28f6d2bfb89a06e83ddef1cf73add51b1f0\": rpc error: code = NotFound desc = could not find container \"d39b63308690044d37063ccd52a7f28f6d2bfb89a06e83ddef1cf73add51b1f0\": container with ID starting with d39b63308690044d37063ccd52a7f28f6d2bfb89a06e83ddef1cf73add51b1f0 not found: ID does not exist" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.556828 4918 scope.go:117] "RemoveContainer" containerID="99012c0586e191a0d633b73064d766e8df52bb49efa0c125f56d41818c65041a" Mar 19 17:03:13 crc kubenswrapper[4918]: E0319 17:03:13.559587 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99012c0586e191a0d633b73064d766e8df52bb49efa0c125f56d41818c65041a\": container with ID starting with 99012c0586e191a0d633b73064d766e8df52bb49efa0c125f56d41818c65041a not found: ID does not exist" containerID="99012c0586e191a0d633b73064d766e8df52bb49efa0c125f56d41818c65041a" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.559619 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99012c0586e191a0d633b73064d766e8df52bb49efa0c125f56d41818c65041a"} err="failed to get container status \"99012c0586e191a0d633b73064d766e8df52bb49efa0c125f56d41818c65041a\": rpc error: code = NotFound desc = could not find container \"99012c0586e191a0d633b73064d766e8df52bb49efa0c125f56d41818c65041a\": container with ID starting with 99012c0586e191a0d633b73064d766e8df52bb49efa0c125f56d41818c65041a not found: ID does not exist" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.559638 4918 scope.go:117] "RemoveContainer" containerID="a13c72570122bbc97a9092eaa5200338f30c37b9c161c6bd8ed94e2986bf37bd" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 
17:03:13.582287 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzjf7\" (UniqueName: \"kubernetes.io/projected/2fa4107c-ea0f-4101-9fd7-e7914e076b81-kube-api-access-tzjf7\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.582312 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fa4107c-ea0f-4101-9fd7-e7914e076b81-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.582340 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fa4107c-ea0f-4101-9fd7-e7914e076b81-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.604657 4918 scope.go:117] "RemoveContainer" containerID="dcc05602faa287a2f356b257bcdfb941489ebba134aa714375178be7494c2e40" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.634439 4918 scope.go:117] "RemoveContainer" containerID="dc17de0ca55333c2eb30c00cdbb4fe25b567faf113a8e1b2b604547e126de418" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.664695 4918 scope.go:117] "RemoveContainer" containerID="2332da299a6016bc1e4e3225394e23c2e0a60415941caa5b25083eaaa19ba94a" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.685405 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " pod="openstack/ceilometer-0" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.685466 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-run-httpd\") pod \"ceilometer-0\" (UID: 
\"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " pod="openstack/ceilometer-0" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.685569 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-scripts\") pod \"ceilometer-0\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " pod="openstack/ceilometer-0" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.685588 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " pod="openstack/ceilometer-0" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.685647 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " pod="openstack/ceilometer-0" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.685674 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-config-data\") pod \"ceilometer-0\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " pod="openstack/ceilometer-0" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.685699 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgrk2\" (UniqueName: \"kubernetes.io/projected/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-kube-api-access-pgrk2\") pod \"ceilometer-0\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " pod="openstack/ceilometer-0" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 
17:03:13.685754 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-log-httpd\") pod \"ceilometer-0\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " pod="openstack/ceilometer-0" Mar 19 17:03:13 crc kubenswrapper[4918]: E0319 17:03:13.705745 4918 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fa4107c_ea0f_4101_9fd7_e7914e076b81.slice\": RecentStats: unable to find data in memory cache]" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.725500 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nzvrq"] Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.739253 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nzvrq"] Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.787207 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgrk2\" (UniqueName: \"kubernetes.io/projected/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-kube-api-access-pgrk2\") pod \"ceilometer-0\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " pod="openstack/ceilometer-0" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.787300 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-log-httpd\") pod \"ceilometer-0\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " pod="openstack/ceilometer-0" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.787813 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-log-httpd\") pod \"ceilometer-0\" (UID: 
\"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " pod="openstack/ceilometer-0" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.787869 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " pod="openstack/ceilometer-0" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.787886 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-run-httpd\") pod \"ceilometer-0\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " pod="openstack/ceilometer-0" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.787972 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-scripts\") pod \"ceilometer-0\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " pod="openstack/ceilometer-0" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.787995 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " pod="openstack/ceilometer-0" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.788081 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " pod="openstack/ceilometer-0" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.788201 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-config-data\") pod \"ceilometer-0\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " pod="openstack/ceilometer-0" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.788862 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-run-httpd\") pod \"ceilometer-0\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " pod="openstack/ceilometer-0" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.791759 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-scripts\") pod \"ceilometer-0\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " pod="openstack/ceilometer-0" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.811798 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " pod="openstack/ceilometer-0" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.811964 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " pod="openstack/ceilometer-0" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.812113 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-config-data\") pod \"ceilometer-0\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " pod="openstack/ceilometer-0" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.812752 4918 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " pod="openstack/ceilometer-0" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.816787 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgrk2\" (UniqueName: \"kubernetes.io/projected/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-kube-api-access-pgrk2\") pod \"ceilometer-0\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " pod="openstack/ceilometer-0" Mar 19 17:03:13 crc kubenswrapper[4918]: I0319 17:03:13.900956 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:03:14 crc kubenswrapper[4918]: I0319 17:03:14.428498 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:03:14 crc kubenswrapper[4918]: I0319 17:03:14.441821 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:03:14 crc kubenswrapper[4918]: I0319 17:03:14.598910 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fa4107c-ea0f-4101-9fd7-e7914e076b81" path="/var/lib/kubelet/pods/2fa4107c-ea0f-4101-9fd7-e7914e076b81/volumes" Mar 19 17:03:14 crc kubenswrapper[4918]: I0319 17:03:14.600009 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="355bfabe-8824-4ee1-9bf5-8c808a15fa70" path="/var/lib/kubelet/pods/355bfabe-8824-4ee1-9bf5-8c808a15fa70/volumes" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.338069 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.378382 4918 generic.go:334] "Generic (PLEG): container finished" podID="9af7f82f-d552-4383-8c74-6cec9b797315" containerID="27763c918591e01faed23425a23c9b1e942f5f18c4510068a08d73ecc0774f45" exitCode=0 Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.378450 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9af7f82f-d552-4383-8c74-6cec9b797315","Type":"ContainerDied","Data":"27763c918591e01faed23425a23c9b1e942f5f18c4510068a08d73ecc0774f45"} Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.378478 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9af7f82f-d552-4383-8c74-6cec9b797315","Type":"ContainerDied","Data":"9be60290bba021a4de5e8aebdd028ee7f5d2d88a3e6b7bf959f32eff538fb2f7"} Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.378493 4918 scope.go:117] "RemoveContainer" containerID="27763c918591e01faed23425a23c9b1e942f5f18c4510068a08d73ecc0774f45" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.378626 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.381650 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e","Type":"ContainerStarted","Data":"49eacfec3c64fb5adf21222e8e3d25c3837769eeefdf8f893c363b6e49acc5a6"} Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.408194 4918 scope.go:117] "RemoveContainer" containerID="49a3ec0b3ca439663fa4d2d5e229cb93e8709cc9ade0632d8221f6add41ebd9a" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.424295 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8jjv\" (UniqueName: \"kubernetes.io/projected/9af7f82f-d552-4383-8c74-6cec9b797315-kube-api-access-s8jjv\") pod \"9af7f82f-d552-4383-8c74-6cec9b797315\" (UID: \"9af7f82f-d552-4383-8c74-6cec9b797315\") " Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.426806 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9af7f82f-d552-4383-8c74-6cec9b797315-config-data\") pod \"9af7f82f-d552-4383-8c74-6cec9b797315\" (UID: \"9af7f82f-d552-4383-8c74-6cec9b797315\") " Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.426845 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9af7f82f-d552-4383-8c74-6cec9b797315-combined-ca-bundle\") pod \"9af7f82f-d552-4383-8c74-6cec9b797315\" (UID: \"9af7f82f-d552-4383-8c74-6cec9b797315\") " Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.426919 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9af7f82f-d552-4383-8c74-6cec9b797315-logs\") pod \"9af7f82f-d552-4383-8c74-6cec9b797315\" (UID: \"9af7f82f-d552-4383-8c74-6cec9b797315\") " Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.428182 
4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9af7f82f-d552-4383-8c74-6cec9b797315-logs" (OuterVolumeSpecName: "logs") pod "9af7f82f-d552-4383-8c74-6cec9b797315" (UID: "9af7f82f-d552-4383-8c74-6cec9b797315"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.430342 4918 scope.go:117] "RemoveContainer" containerID="27763c918591e01faed23425a23c9b1e942f5f18c4510068a08d73ecc0774f45" Mar 19 17:03:15 crc kubenswrapper[4918]: E0319 17:03:15.434371 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27763c918591e01faed23425a23c9b1e942f5f18c4510068a08d73ecc0774f45\": container with ID starting with 27763c918591e01faed23425a23c9b1e942f5f18c4510068a08d73ecc0774f45 not found: ID does not exist" containerID="27763c918591e01faed23425a23c9b1e942f5f18c4510068a08d73ecc0774f45" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.434901 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27763c918591e01faed23425a23c9b1e942f5f18c4510068a08d73ecc0774f45"} err="failed to get container status \"27763c918591e01faed23425a23c9b1e942f5f18c4510068a08d73ecc0774f45\": rpc error: code = NotFound desc = could not find container \"27763c918591e01faed23425a23c9b1e942f5f18c4510068a08d73ecc0774f45\": container with ID starting with 27763c918591e01faed23425a23c9b1e942f5f18c4510068a08d73ecc0774f45 not found: ID does not exist" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.434984 4918 scope.go:117] "RemoveContainer" containerID="49a3ec0b3ca439663fa4d2d5e229cb93e8709cc9ade0632d8221f6add41ebd9a" Mar 19 17:03:15 crc kubenswrapper[4918]: E0319 17:03:15.435274 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"49a3ec0b3ca439663fa4d2d5e229cb93e8709cc9ade0632d8221f6add41ebd9a\": container with ID starting with 49a3ec0b3ca439663fa4d2d5e229cb93e8709cc9ade0632d8221f6add41ebd9a not found: ID does not exist" containerID="49a3ec0b3ca439663fa4d2d5e229cb93e8709cc9ade0632d8221f6add41ebd9a" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.435302 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49a3ec0b3ca439663fa4d2d5e229cb93e8709cc9ade0632d8221f6add41ebd9a"} err="failed to get container status \"49a3ec0b3ca439663fa4d2d5e229cb93e8709cc9ade0632d8221f6add41ebd9a\": rpc error: code = NotFound desc = could not find container \"49a3ec0b3ca439663fa4d2d5e229cb93e8709cc9ade0632d8221f6add41ebd9a\": container with ID starting with 49a3ec0b3ca439663fa4d2d5e229cb93e8709cc9ade0632d8221f6add41ebd9a not found: ID does not exist" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.452725 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9af7f82f-d552-4383-8c74-6cec9b797315-kube-api-access-s8jjv" (OuterVolumeSpecName: "kube-api-access-s8jjv") pod "9af7f82f-d552-4383-8c74-6cec9b797315" (UID: "9af7f82f-d552-4383-8c74-6cec9b797315"). InnerVolumeSpecName "kube-api-access-s8jjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.492385 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9af7f82f-d552-4383-8c74-6cec9b797315-config-data" (OuterVolumeSpecName: "config-data") pod "9af7f82f-d552-4383-8c74-6cec9b797315" (UID: "9af7f82f-d552-4383-8c74-6cec9b797315"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.509713 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9af7f82f-d552-4383-8c74-6cec9b797315-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9af7f82f-d552-4383-8c74-6cec9b797315" (UID: "9af7f82f-d552-4383-8c74-6cec9b797315"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.530512 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9af7f82f-d552-4383-8c74-6cec9b797315-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.530609 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9af7f82f-d552-4383-8c74-6cec9b797315-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.530620 4918 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9af7f82f-d552-4383-8c74-6cec9b797315-logs\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.530628 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8jjv\" (UniqueName: \"kubernetes.io/projected/9af7f82f-d552-4383-8c74-6cec9b797315-kube-api-access-s8jjv\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.715376 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.729661 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.743080 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] 
Mar 19 17:03:15 crc kubenswrapper[4918]: E0319 17:03:15.743683 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9af7f82f-d552-4383-8c74-6cec9b797315" containerName="nova-api-api" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.743706 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af7f82f-d552-4383-8c74-6cec9b797315" containerName="nova-api-api" Mar 19 17:03:15 crc kubenswrapper[4918]: E0319 17:03:15.743737 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9af7f82f-d552-4383-8c74-6cec9b797315" containerName="nova-api-log" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.743744 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af7f82f-d552-4383-8c74-6cec9b797315" containerName="nova-api-log" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.743979 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="9af7f82f-d552-4383-8c74-6cec9b797315" containerName="nova-api-log" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.744005 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="9af7f82f-d552-4383-8c74-6cec9b797315" containerName="nova-api-api" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.745409 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.748771 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.748885 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.752814 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.756714 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.836788 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4755aa2d-fc74-4d06-a4cc-9a185df5068c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\") " pod="openstack/nova-api-0" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.836895 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xb58\" (UniqueName: \"kubernetes.io/projected/4755aa2d-fc74-4d06-a4cc-9a185df5068c-kube-api-access-7xb58\") pod \"nova-api-0\" (UID: \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\") " pod="openstack/nova-api-0" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.836987 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4755aa2d-fc74-4d06-a4cc-9a185df5068c-config-data\") pod \"nova-api-0\" (UID: \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\") " pod="openstack/nova-api-0" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.837070 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4755aa2d-fc74-4d06-a4cc-9a185df5068c-public-tls-certs\") pod \"nova-api-0\" (UID: \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\") " pod="openstack/nova-api-0" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.837116 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4755aa2d-fc74-4d06-a4cc-9a185df5068c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\") " pod="openstack/nova-api-0" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.837169 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4755aa2d-fc74-4d06-a4cc-9a185df5068c-logs\") pod \"nova-api-0\" (UID: \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\") " pod="openstack/nova-api-0" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.938973 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xb58\" (UniqueName: \"kubernetes.io/projected/4755aa2d-fc74-4d06-a4cc-9a185df5068c-kube-api-access-7xb58\") pod \"nova-api-0\" (UID: \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\") " pod="openstack/nova-api-0" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.939066 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4755aa2d-fc74-4d06-a4cc-9a185df5068c-config-data\") pod \"nova-api-0\" (UID: \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\") " pod="openstack/nova-api-0" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.939135 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4755aa2d-fc74-4d06-a4cc-9a185df5068c-public-tls-certs\") pod \"nova-api-0\" (UID: \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\") " 
pod="openstack/nova-api-0" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.939169 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4755aa2d-fc74-4d06-a4cc-9a185df5068c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\") " pod="openstack/nova-api-0" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.939215 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4755aa2d-fc74-4d06-a4cc-9a185df5068c-logs\") pod \"nova-api-0\" (UID: \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\") " pod="openstack/nova-api-0" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.939254 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4755aa2d-fc74-4d06-a4cc-9a185df5068c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\") " pod="openstack/nova-api-0" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.939877 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4755aa2d-fc74-4d06-a4cc-9a185df5068c-logs\") pod \"nova-api-0\" (UID: \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\") " pod="openstack/nova-api-0" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.942936 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4755aa2d-fc74-4d06-a4cc-9a185df5068c-config-data\") pod \"nova-api-0\" (UID: \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\") " pod="openstack/nova-api-0" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.943090 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4755aa2d-fc74-4d06-a4cc-9a185df5068c-public-tls-certs\") pod 
\"nova-api-0\" (UID: \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\") " pod="openstack/nova-api-0" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.943335 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4755aa2d-fc74-4d06-a4cc-9a185df5068c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\") " pod="openstack/nova-api-0" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.944336 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4755aa2d-fc74-4d06-a4cc-9a185df5068c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\") " pod="openstack/nova-api-0" Mar 19 17:03:15 crc kubenswrapper[4918]: I0319 17:03:15.960091 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xb58\" (UniqueName: \"kubernetes.io/projected/4755aa2d-fc74-4d06-a4cc-9a185df5068c-kube-api-access-7xb58\") pod \"nova-api-0\" (UID: \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\") " pod="openstack/nova-api-0" Mar 19 17:03:16 crc kubenswrapper[4918]: I0319 17:03:16.062294 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 17:03:16 crc kubenswrapper[4918]: I0319 17:03:16.396300 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e","Type":"ContainerStarted","Data":"8283b0349263f00dfd0c1051342e78e13c1311b680db3a95885f5d2ffbb5c3d2"} Mar 19 17:03:16 crc kubenswrapper[4918]: I0319 17:03:16.597429 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9af7f82f-d552-4383-8c74-6cec9b797315" path="/var/lib/kubelet/pods/9af7f82f-d552-4383-8c74-6cec9b797315/volumes" Mar 19 17:03:16 crc kubenswrapper[4918]: I0319 17:03:16.601668 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 17:03:17 crc kubenswrapper[4918]: I0319 17:03:17.410214 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e","Type":"ContainerStarted","Data":"1f95ac2f8d659d5778fc3ed0566cf1732b0a27c48ffa7197c132a9d7580f80fb"} Mar 19 17:03:17 crc kubenswrapper[4918]: I0319 17:03:17.413187 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4755aa2d-fc74-4d06-a4cc-9a185df5068c","Type":"ContainerStarted","Data":"3808d87ac79a77d49e39b5b5281130f71ef37c9240f472174ad713b9edb96034"} Mar 19 17:03:17 crc kubenswrapper[4918]: I0319 17:03:17.413246 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4755aa2d-fc74-4d06-a4cc-9a185df5068c","Type":"ContainerStarted","Data":"811b55828b1016901fa27a953a00d10f187168686182e4842c96e9065a56807e"} Mar 19 17:03:17 crc kubenswrapper[4918]: I0319 17:03:17.413260 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4755aa2d-fc74-4d06-a4cc-9a185df5068c","Type":"ContainerStarted","Data":"26469251e5d0c033ab9cc2388811d13575817a7cb4c60ed4d70689670d5a9063"} Mar 19 17:03:17 crc kubenswrapper[4918]: I0319 17:03:17.440630 
4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.4406082700000002 podStartE2EDuration="2.44060827s" podCreationTimestamp="2026-03-19 17:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:03:17.431884981 +0000 UTC m=+1409.554084229" watchObservedRunningTime="2026-03-19 17:03:17.44060827 +0000 UTC m=+1409.562807518" Mar 19 17:03:17 crc kubenswrapper[4918]: I0319 17:03:17.969718 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:03:17 crc kubenswrapper[4918]: I0319 17:03:17.999870 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:03:18 crc kubenswrapper[4918]: I0319 17:03:18.447468 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 19 17:03:18 crc kubenswrapper[4918]: I0319 17:03:18.651401 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-q4qqh"] Mar 19 17:03:18 crc kubenswrapper[4918]: I0319 17:03:18.653320 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q4qqh" Mar 19 17:03:18 crc kubenswrapper[4918]: I0319 17:03:18.658795 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 19 17:03:18 crc kubenswrapper[4918]: I0319 17:03:18.665973 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 19 17:03:18 crc kubenswrapper[4918]: I0319 17:03:18.673821 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-q4qqh"] Mar 19 17:03:18 crc kubenswrapper[4918]: I0319 17:03:18.773326 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" Mar 19 17:03:18 crc kubenswrapper[4918]: I0319 17:03:18.795087 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83433c2c-ddcb-4f8d-ba54-e3dda42d12f7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-q4qqh\" (UID: \"83433c2c-ddcb-4f8d-ba54-e3dda42d12f7\") " pod="openstack/nova-cell1-cell-mapping-q4qqh" Mar 19 17:03:18 crc kubenswrapper[4918]: I0319 17:03:18.795205 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrqpj\" (UniqueName: \"kubernetes.io/projected/83433c2c-ddcb-4f8d-ba54-e3dda42d12f7-kube-api-access-rrqpj\") pod \"nova-cell1-cell-mapping-q4qqh\" (UID: \"83433c2c-ddcb-4f8d-ba54-e3dda42d12f7\") " pod="openstack/nova-cell1-cell-mapping-q4qqh" Mar 19 17:03:18 crc kubenswrapper[4918]: I0319 17:03:18.795262 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83433c2c-ddcb-4f8d-ba54-e3dda42d12f7-scripts\") pod \"nova-cell1-cell-mapping-q4qqh\" (UID: \"83433c2c-ddcb-4f8d-ba54-e3dda42d12f7\") " pod="openstack/nova-cell1-cell-mapping-q4qqh" Mar 
19 17:03:18 crc kubenswrapper[4918]: I0319 17:03:18.795394 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83433c2c-ddcb-4f8d-ba54-e3dda42d12f7-config-data\") pod \"nova-cell1-cell-mapping-q4qqh\" (UID: \"83433c2c-ddcb-4f8d-ba54-e3dda42d12f7\") " pod="openstack/nova-cell1-cell-mapping-q4qqh" Mar 19 17:03:18 crc kubenswrapper[4918]: I0319 17:03:18.863898 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-mfkf8"] Mar 19 17:03:18 crc kubenswrapper[4918]: I0319 17:03:18.864130 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78cd565959-mfkf8" podUID="436a6713-1e4a-474f-80e8-793f725561da" containerName="dnsmasq-dns" containerID="cri-o://c37c634d9828512c0eb66601822616af02d77d6cc36a2c1e1f94024918a2cd00" gracePeriod=10 Mar 19 17:03:18 crc kubenswrapper[4918]: I0319 17:03:18.897369 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83433c2c-ddcb-4f8d-ba54-e3dda42d12f7-config-data\") pod \"nova-cell1-cell-mapping-q4qqh\" (UID: \"83433c2c-ddcb-4f8d-ba54-e3dda42d12f7\") " pod="openstack/nova-cell1-cell-mapping-q4qqh" Mar 19 17:03:18 crc kubenswrapper[4918]: I0319 17:03:18.897618 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83433c2c-ddcb-4f8d-ba54-e3dda42d12f7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-q4qqh\" (UID: \"83433c2c-ddcb-4f8d-ba54-e3dda42d12f7\") " pod="openstack/nova-cell1-cell-mapping-q4qqh" Mar 19 17:03:18 crc kubenswrapper[4918]: I0319 17:03:18.897681 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrqpj\" (UniqueName: \"kubernetes.io/projected/83433c2c-ddcb-4f8d-ba54-e3dda42d12f7-kube-api-access-rrqpj\") pod 
\"nova-cell1-cell-mapping-q4qqh\" (UID: \"83433c2c-ddcb-4f8d-ba54-e3dda42d12f7\") " pod="openstack/nova-cell1-cell-mapping-q4qqh" Mar 19 17:03:18 crc kubenswrapper[4918]: I0319 17:03:18.897718 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83433c2c-ddcb-4f8d-ba54-e3dda42d12f7-scripts\") pod \"nova-cell1-cell-mapping-q4qqh\" (UID: \"83433c2c-ddcb-4f8d-ba54-e3dda42d12f7\") " pod="openstack/nova-cell1-cell-mapping-q4qqh" Mar 19 17:03:18 crc kubenswrapper[4918]: I0319 17:03:18.906451 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83433c2c-ddcb-4f8d-ba54-e3dda42d12f7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-q4qqh\" (UID: \"83433c2c-ddcb-4f8d-ba54-e3dda42d12f7\") " pod="openstack/nova-cell1-cell-mapping-q4qqh" Mar 19 17:03:18 crc kubenswrapper[4918]: I0319 17:03:18.906487 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83433c2c-ddcb-4f8d-ba54-e3dda42d12f7-scripts\") pod \"nova-cell1-cell-mapping-q4qqh\" (UID: \"83433c2c-ddcb-4f8d-ba54-e3dda42d12f7\") " pod="openstack/nova-cell1-cell-mapping-q4qqh" Mar 19 17:03:18 crc kubenswrapper[4918]: I0319 17:03:18.906733 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83433c2c-ddcb-4f8d-ba54-e3dda42d12f7-config-data\") pod \"nova-cell1-cell-mapping-q4qqh\" (UID: \"83433c2c-ddcb-4f8d-ba54-e3dda42d12f7\") " pod="openstack/nova-cell1-cell-mapping-q4qqh" Mar 19 17:03:18 crc kubenswrapper[4918]: I0319 17:03:18.927051 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrqpj\" (UniqueName: \"kubernetes.io/projected/83433c2c-ddcb-4f8d-ba54-e3dda42d12f7-kube-api-access-rrqpj\") pod \"nova-cell1-cell-mapping-q4qqh\" (UID: \"83433c2c-ddcb-4f8d-ba54-e3dda42d12f7\") " 
pod="openstack/nova-cell1-cell-mapping-q4qqh" Mar 19 17:03:18 crc kubenswrapper[4918]: I0319 17:03:18.983998 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q4qqh" Mar 19 17:03:19 crc kubenswrapper[4918]: I0319 17:03:19.441926 4918 generic.go:334] "Generic (PLEG): container finished" podID="436a6713-1e4a-474f-80e8-793f725561da" containerID="c37c634d9828512c0eb66601822616af02d77d6cc36a2c1e1f94024918a2cd00" exitCode=0 Mar 19 17:03:19 crc kubenswrapper[4918]: I0319 17:03:19.442872 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-mfkf8" event={"ID":"436a6713-1e4a-474f-80e8-793f725561da","Type":"ContainerDied","Data":"c37c634d9828512c0eb66601822616af02d77d6cc36a2c1e1f94024918a2cd00"} Mar 19 17:03:19 crc kubenswrapper[4918]: I0319 17:03:19.601642 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-q4qqh"] Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.311018 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-mfkf8" Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.436339 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-ovsdbserver-sb\") pod \"436a6713-1e4a-474f-80e8-793f725561da\" (UID: \"436a6713-1e4a-474f-80e8-793f725561da\") " Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.436419 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-dns-svc\") pod \"436a6713-1e4a-474f-80e8-793f725561da\" (UID: \"436a6713-1e4a-474f-80e8-793f725561da\") " Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.436446 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtjcj\" (UniqueName: \"kubernetes.io/projected/436a6713-1e4a-474f-80e8-793f725561da-kube-api-access-gtjcj\") pod \"436a6713-1e4a-474f-80e8-793f725561da\" (UID: \"436a6713-1e4a-474f-80e8-793f725561da\") " Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.436506 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-config\") pod \"436a6713-1e4a-474f-80e8-793f725561da\" (UID: \"436a6713-1e4a-474f-80e8-793f725561da\") " Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.436567 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-dns-swift-storage-0\") pod \"436a6713-1e4a-474f-80e8-793f725561da\" (UID: \"436a6713-1e4a-474f-80e8-793f725561da\") " Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.436703 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-ovsdbserver-nb\") pod \"436a6713-1e4a-474f-80e8-793f725561da\" (UID: \"436a6713-1e4a-474f-80e8-793f725561da\") " Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.509907 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/436a6713-1e4a-474f-80e8-793f725561da-kube-api-access-gtjcj" (OuterVolumeSpecName: "kube-api-access-gtjcj") pod "436a6713-1e4a-474f-80e8-793f725561da" (UID: "436a6713-1e4a-474f-80e8-793f725561da"). InnerVolumeSpecName "kube-api-access-gtjcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.538985 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtjcj\" (UniqueName: \"kubernetes.io/projected/436a6713-1e4a-474f-80e8-793f725561da-kube-api-access-gtjcj\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.542378 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-config" (OuterVolumeSpecName: "config") pod "436a6713-1e4a-474f-80e8-793f725561da" (UID: "436a6713-1e4a-474f-80e8-793f725561da"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.547252 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e","Type":"ContainerStarted","Data":"e263df42abda3a1af695f033f9759b1e6d27deeb988c1bdf164e6cc66fe785bd"} Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.548035 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "436a6713-1e4a-474f-80e8-793f725561da" (UID: "436a6713-1e4a-474f-80e8-793f725561da"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.550143 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q4qqh" event={"ID":"83433c2c-ddcb-4f8d-ba54-e3dda42d12f7","Type":"ContainerStarted","Data":"6dea809c472c381d224dc1a3f8c04322c480ea79674d6d778a8bb2a9b3ced58d"} Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.551236 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q4qqh" event={"ID":"83433c2c-ddcb-4f8d-ba54-e3dda42d12f7","Type":"ContainerStarted","Data":"92b3a1cd771a02107f6bc0f6d24bc0e1eb96aca06a3da938a731b7670da82b15"} Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.552266 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-mfkf8" event={"ID":"436a6713-1e4a-474f-80e8-793f725561da","Type":"ContainerDied","Data":"e8da49ea1e8ef48a248b25f4098e21909075bc5bf67e669a10c47fd01b3355c4"} Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.552320 4918 scope.go:117] "RemoveContainer" containerID="c37c634d9828512c0eb66601822616af02d77d6cc36a2c1e1f94024918a2cd00" Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.552708 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-mfkf8" Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.559106 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "436a6713-1e4a-474f-80e8-793f725561da" (UID: "436a6713-1e4a-474f-80e8-793f725561da"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.579202 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-q4qqh" podStartSLOduration=2.579183412 podStartE2EDuration="2.579183412s" podCreationTimestamp="2026-03-19 17:03:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:03:20.568424568 +0000 UTC m=+1412.690623816" watchObservedRunningTime="2026-03-19 17:03:20.579183412 +0000 UTC m=+1412.701382660" Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.585204 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "436a6713-1e4a-474f-80e8-793f725561da" (UID: "436a6713-1e4a-474f-80e8-793f725561da"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.589946 4918 scope.go:117] "RemoveContainer" containerID="9e5ff868ec777b6f5031bee3735d3ce46037cddcb23729ae2221942d8ccbba37" Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.591297 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "436a6713-1e4a-474f-80e8-793f725561da" (UID: "436a6713-1e4a-474f-80e8-793f725561da"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.640591 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-config\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.640725 4918 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.640783 4918 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.640855 4918 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.640907 4918 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/436a6713-1e4a-474f-80e8-793f725561da-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.882617 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-mfkf8"] Mar 19 17:03:20 crc kubenswrapper[4918]: I0319 17:03:20.887405 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-mfkf8"] Mar 19 17:03:22 crc kubenswrapper[4918]: I0319 17:03:22.597433 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="436a6713-1e4a-474f-80e8-793f725561da" path="/var/lib/kubelet/pods/436a6713-1e4a-474f-80e8-793f725561da/volumes" Mar 19 17:03:25 crc kubenswrapper[4918]: 
I0319 17:03:25.621148 4918 generic.go:334] "Generic (PLEG): container finished" podID="83433c2c-ddcb-4f8d-ba54-e3dda42d12f7" containerID="6dea809c472c381d224dc1a3f8c04322c480ea79674d6d778a8bb2a9b3ced58d" exitCode=0 Mar 19 17:03:25 crc kubenswrapper[4918]: I0319 17:03:25.621224 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q4qqh" event={"ID":"83433c2c-ddcb-4f8d-ba54-e3dda42d12f7","Type":"ContainerDied","Data":"6dea809c472c381d224dc1a3f8c04322c480ea79674d6d778a8bb2a9b3ced58d"} Mar 19 17:03:26 crc kubenswrapper[4918]: I0319 17:03:26.063453 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 17:03:26 crc kubenswrapper[4918]: I0319 17:03:26.063505 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 17:03:26 crc kubenswrapper[4918]: I0319 17:03:26.546073 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-882pt"] Mar 19 17:03:26 crc kubenswrapper[4918]: E0319 17:03:26.546501 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="436a6713-1e4a-474f-80e8-793f725561da" containerName="init" Mar 19 17:03:26 crc kubenswrapper[4918]: I0319 17:03:26.546578 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="436a6713-1e4a-474f-80e8-793f725561da" containerName="init" Mar 19 17:03:26 crc kubenswrapper[4918]: E0319 17:03:26.546629 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="436a6713-1e4a-474f-80e8-793f725561da" containerName="dnsmasq-dns" Mar 19 17:03:26 crc kubenswrapper[4918]: I0319 17:03:26.546635 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="436a6713-1e4a-474f-80e8-793f725561da" containerName="dnsmasq-dns" Mar 19 17:03:26 crc kubenswrapper[4918]: I0319 17:03:26.546828 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="436a6713-1e4a-474f-80e8-793f725561da" containerName="dnsmasq-dns" Mar 19 
17:03:26 crc kubenswrapper[4918]: I0319 17:03:26.548585 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-882pt" Mar 19 17:03:26 crc kubenswrapper[4918]: I0319 17:03:26.558092 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-882pt"] Mar 19 17:03:26 crc kubenswrapper[4918]: I0319 17:03:26.658029 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e859955-3304-49f6-a1e7-0f2d0eea96ca-catalog-content\") pod \"redhat-marketplace-882pt\" (UID: \"2e859955-3304-49f6-a1e7-0f2d0eea96ca\") " pod="openshift-marketplace/redhat-marketplace-882pt" Mar 19 17:03:26 crc kubenswrapper[4918]: I0319 17:03:26.658292 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e859955-3304-49f6-a1e7-0f2d0eea96ca-utilities\") pod \"redhat-marketplace-882pt\" (UID: \"2e859955-3304-49f6-a1e7-0f2d0eea96ca\") " pod="openshift-marketplace/redhat-marketplace-882pt" Mar 19 17:03:26 crc kubenswrapper[4918]: I0319 17:03:26.658550 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49ksz\" (UniqueName: \"kubernetes.io/projected/2e859955-3304-49f6-a1e7-0f2d0eea96ca-kube-api-access-49ksz\") pod \"redhat-marketplace-882pt\" (UID: \"2e859955-3304-49f6-a1e7-0f2d0eea96ca\") " pod="openshift-marketplace/redhat-marketplace-882pt" Mar 19 17:03:26 crc kubenswrapper[4918]: I0319 17:03:26.760937 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49ksz\" (UniqueName: \"kubernetes.io/projected/2e859955-3304-49f6-a1e7-0f2d0eea96ca-kube-api-access-49ksz\") pod \"redhat-marketplace-882pt\" (UID: \"2e859955-3304-49f6-a1e7-0f2d0eea96ca\") " 
pod="openshift-marketplace/redhat-marketplace-882pt" Mar 19 17:03:26 crc kubenswrapper[4918]: I0319 17:03:26.761119 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e859955-3304-49f6-a1e7-0f2d0eea96ca-catalog-content\") pod \"redhat-marketplace-882pt\" (UID: \"2e859955-3304-49f6-a1e7-0f2d0eea96ca\") " pod="openshift-marketplace/redhat-marketplace-882pt" Mar 19 17:03:26 crc kubenswrapper[4918]: I0319 17:03:26.761198 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e859955-3304-49f6-a1e7-0f2d0eea96ca-utilities\") pod \"redhat-marketplace-882pt\" (UID: \"2e859955-3304-49f6-a1e7-0f2d0eea96ca\") " pod="openshift-marketplace/redhat-marketplace-882pt" Mar 19 17:03:26 crc kubenswrapper[4918]: I0319 17:03:26.766959 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e859955-3304-49f6-a1e7-0f2d0eea96ca-utilities\") pod \"redhat-marketplace-882pt\" (UID: \"2e859955-3304-49f6-a1e7-0f2d0eea96ca\") " pod="openshift-marketplace/redhat-marketplace-882pt" Mar 19 17:03:26 crc kubenswrapper[4918]: I0319 17:03:26.767115 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e859955-3304-49f6-a1e7-0f2d0eea96ca-catalog-content\") pod \"redhat-marketplace-882pt\" (UID: \"2e859955-3304-49f6-a1e7-0f2d0eea96ca\") " pod="openshift-marketplace/redhat-marketplace-882pt" Mar 19 17:03:26 crc kubenswrapper[4918]: I0319 17:03:26.794507 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49ksz\" (UniqueName: \"kubernetes.io/projected/2e859955-3304-49f6-a1e7-0f2d0eea96ca-kube-api-access-49ksz\") pod \"redhat-marketplace-882pt\" (UID: \"2e859955-3304-49f6-a1e7-0f2d0eea96ca\") " pod="openshift-marketplace/redhat-marketplace-882pt" Mar 19 
17:03:26 crc kubenswrapper[4918]: I0319 17:03:26.870740 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-882pt" Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.073669 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4755aa2d-fc74-4d06-a4cc-9a185df5068c" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.234:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.073702 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4755aa2d-fc74-4d06-a4cc-9a185df5068c" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.234:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.448453 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q4qqh" Mar 19 17:03:27 crc kubenswrapper[4918]: W0319 17:03:27.456261 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e859955_3304_49f6_a1e7_0f2d0eea96ca.slice/crio-9ab2fa9556dfbe84863284aa453d9b4ba5c33b251ee0b8d1080f22ddc39f427c WatchSource:0}: Error finding container 9ab2fa9556dfbe84863284aa453d9b4ba5c33b251ee0b8d1080f22ddc39f427c: Status 404 returned error can't find the container with id 9ab2fa9556dfbe84863284aa453d9b4ba5c33b251ee0b8d1080f22ddc39f427c Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.456820 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-882pt"] Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.482068 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83433c2c-ddcb-4f8d-ba54-e3dda42d12f7-combined-ca-bundle\") pod \"83433c2c-ddcb-4f8d-ba54-e3dda42d12f7\" (UID: \"83433c2c-ddcb-4f8d-ba54-e3dda42d12f7\") " Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.482222 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrqpj\" (UniqueName: \"kubernetes.io/projected/83433c2c-ddcb-4f8d-ba54-e3dda42d12f7-kube-api-access-rrqpj\") pod \"83433c2c-ddcb-4f8d-ba54-e3dda42d12f7\" (UID: \"83433c2c-ddcb-4f8d-ba54-e3dda42d12f7\") " Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.482342 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83433c2c-ddcb-4f8d-ba54-e3dda42d12f7-config-data\") pod \"83433c2c-ddcb-4f8d-ba54-e3dda42d12f7\" (UID: \"83433c2c-ddcb-4f8d-ba54-e3dda42d12f7\") " Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.482432 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/83433c2c-ddcb-4f8d-ba54-e3dda42d12f7-scripts\") pod \"83433c2c-ddcb-4f8d-ba54-e3dda42d12f7\" (UID: \"83433c2c-ddcb-4f8d-ba54-e3dda42d12f7\") " Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.490888 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83433c2c-ddcb-4f8d-ba54-e3dda42d12f7-scripts" (OuterVolumeSpecName: "scripts") pod "83433c2c-ddcb-4f8d-ba54-e3dda42d12f7" (UID: "83433c2c-ddcb-4f8d-ba54-e3dda42d12f7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.498757 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83433c2c-ddcb-4f8d-ba54-e3dda42d12f7-kube-api-access-rrqpj" (OuterVolumeSpecName: "kube-api-access-rrqpj") pod "83433c2c-ddcb-4f8d-ba54-e3dda42d12f7" (UID: "83433c2c-ddcb-4f8d-ba54-e3dda42d12f7"). InnerVolumeSpecName "kube-api-access-rrqpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.520023 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83433c2c-ddcb-4f8d-ba54-e3dda42d12f7-config-data" (OuterVolumeSpecName: "config-data") pod "83433c2c-ddcb-4f8d-ba54-e3dda42d12f7" (UID: "83433c2c-ddcb-4f8d-ba54-e3dda42d12f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.528678 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83433c2c-ddcb-4f8d-ba54-e3dda42d12f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83433c2c-ddcb-4f8d-ba54-e3dda42d12f7" (UID: "83433c2c-ddcb-4f8d-ba54-e3dda42d12f7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.585326 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83433c2c-ddcb-4f8d-ba54-e3dda42d12f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.585369 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrqpj\" (UniqueName: \"kubernetes.io/projected/83433c2c-ddcb-4f8d-ba54-e3dda42d12f7-kube-api-access-rrqpj\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.585386 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83433c2c-ddcb-4f8d-ba54-e3dda42d12f7-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.585400 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83433c2c-ddcb-4f8d-ba54-e3dda42d12f7-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.639608 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q4qqh" event={"ID":"83433c2c-ddcb-4f8d-ba54-e3dda42d12f7","Type":"ContainerDied","Data":"92b3a1cd771a02107f6bc0f6d24bc0e1eb96aca06a3da938a731b7670da82b15"} Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.639656 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92b3a1cd771a02107f6bc0f6d24bc0e1eb96aca06a3da938a731b7670da82b15" Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.639660 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q4qqh" Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.641823 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-882pt" event={"ID":"2e859955-3304-49f6-a1e7-0f2d0eea96ca","Type":"ContainerStarted","Data":"43869c142ac051046287a47ff92da5970545b11fc585e7c1eccae513d1c3bee1"} Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.641852 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-882pt" event={"ID":"2e859955-3304-49f6-a1e7-0f2d0eea96ca","Type":"ContainerStarted","Data":"9ab2fa9556dfbe84863284aa453d9b4ba5c33b251ee0b8d1080f22ddc39f427c"} Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.940166 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.940839 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4755aa2d-fc74-4d06-a4cc-9a185df5068c" containerName="nova-api-log" containerID="cri-o://811b55828b1016901fa27a953a00d10f187168686182e4842c96e9065a56807e" gracePeriod=30 Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.940903 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4755aa2d-fc74-4d06-a4cc-9a185df5068c" containerName="nova-api-api" containerID="cri-o://3808d87ac79a77d49e39b5b5281130f71ef37c9240f472174ad713b9edb96034" gracePeriod=30 Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.965188 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.965452 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="7861f8ee-6769-49fa-b7ab-33d6ebb50eee" containerName="nova-scheduler-scheduler" 
containerID="cri-o://5255d8ce79f76b5cbfc2464afb1d9521f2e258dda49013e6b173551ed1c40aae" gracePeriod=30 Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.981947 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.982192 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b6c864e7-f676-4c0d-894b-c31c175fccd2" containerName="nova-metadata-log" containerID="cri-o://1f2a5b41eac6f7be33f5468c45045c789902f582422311f5dbe3dc455552bf73" gracePeriod=30 Mar 19 17:03:27 crc kubenswrapper[4918]: I0319 17:03:27.982297 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b6c864e7-f676-4c0d-894b-c31c175fccd2" containerName="nova-metadata-metadata" containerID="cri-o://8ffea604ee8670992f9c21a0358aa00f778c4ad7836fc1c4ce084b9c929d6415" gracePeriod=30 Mar 19 17:03:28 crc kubenswrapper[4918]: I0319 17:03:28.653933 4918 generic.go:334] "Generic (PLEG): container finished" podID="b6c864e7-f676-4c0d-894b-c31c175fccd2" containerID="1f2a5b41eac6f7be33f5468c45045c789902f582422311f5dbe3dc455552bf73" exitCode=143 Mar 19 17:03:28 crc kubenswrapper[4918]: I0319 17:03:28.654006 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6c864e7-f676-4c0d-894b-c31c175fccd2","Type":"ContainerDied","Data":"1f2a5b41eac6f7be33f5468c45045c789902f582422311f5dbe3dc455552bf73"} Mar 19 17:03:28 crc kubenswrapper[4918]: I0319 17:03:28.656578 4918 generic.go:334] "Generic (PLEG): container finished" podID="4755aa2d-fc74-4d06-a4cc-9a185df5068c" containerID="811b55828b1016901fa27a953a00d10f187168686182e4842c96e9065a56807e" exitCode=143 Mar 19 17:03:28 crc kubenswrapper[4918]: I0319 17:03:28.656605 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"4755aa2d-fc74-4d06-a4cc-9a185df5068c","Type":"ContainerDied","Data":"811b55828b1016901fa27a953a00d10f187168686182e4842c96e9065a56807e"} Mar 19 17:03:28 crc kubenswrapper[4918]: I0319 17:03:28.658324 4918 generic.go:334] "Generic (PLEG): container finished" podID="2e859955-3304-49f6-a1e7-0f2d0eea96ca" containerID="43869c142ac051046287a47ff92da5970545b11fc585e7c1eccae513d1c3bee1" exitCode=0 Mar 19 17:03:28 crc kubenswrapper[4918]: I0319 17:03:28.658365 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-882pt" event={"ID":"2e859955-3304-49f6-a1e7-0f2d0eea96ca","Type":"ContainerDied","Data":"43869c142ac051046287a47ff92da5970545b11fc585e7c1eccae513d1c3bee1"} Mar 19 17:03:30 crc kubenswrapper[4918]: E0319 17:03:30.431274 4918 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5255d8ce79f76b5cbfc2464afb1d9521f2e258dda49013e6b173551ed1c40aae is running failed: container process not found" containerID="5255d8ce79f76b5cbfc2464afb1d9521f2e258dda49013e6b173551ed1c40aae" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 17:03:30 crc kubenswrapper[4918]: E0319 17:03:30.433015 4918 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5255d8ce79f76b5cbfc2464afb1d9521f2e258dda49013e6b173551ed1c40aae is running failed: container process not found" containerID="5255d8ce79f76b5cbfc2464afb1d9521f2e258dda49013e6b173551ed1c40aae" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 17:03:30 crc kubenswrapper[4918]: E0319 17:03:30.433243 4918 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5255d8ce79f76b5cbfc2464afb1d9521f2e258dda49013e6b173551ed1c40aae is running failed: container process not found" 
containerID="5255d8ce79f76b5cbfc2464afb1d9521f2e258dda49013e6b173551ed1c40aae" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 17:03:30 crc kubenswrapper[4918]: E0319 17:03:30.433278 4918 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5255d8ce79f76b5cbfc2464afb1d9521f2e258dda49013e6b173551ed1c40aae is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="7861f8ee-6769-49fa-b7ab-33d6ebb50eee" containerName="nova-scheduler-scheduler" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.481001 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.549908 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q7mzv"] Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.550598 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7861f8ee-6769-49fa-b7ab-33d6ebb50eee-combined-ca-bundle\") pod \"7861f8ee-6769-49fa-b7ab-33d6ebb50eee\" (UID: \"7861f8ee-6769-49fa-b7ab-33d6ebb50eee\") " Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.550676 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sklql\" (UniqueName: \"kubernetes.io/projected/7861f8ee-6769-49fa-b7ab-33d6ebb50eee-kube-api-access-sklql\") pod \"7861f8ee-6769-49fa-b7ab-33d6ebb50eee\" (UID: \"7861f8ee-6769-49fa-b7ab-33d6ebb50eee\") " Mar 19 17:03:30 crc kubenswrapper[4918]: E0319 17:03:30.550832 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7861f8ee-6769-49fa-b7ab-33d6ebb50eee" containerName="nova-scheduler-scheduler" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.550910 4918 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7861f8ee-6769-49fa-b7ab-33d6ebb50eee" containerName="nova-scheduler-scheduler" Mar 19 17:03:30 crc kubenswrapper[4918]: E0319 17:03:30.551017 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83433c2c-ddcb-4f8d-ba54-e3dda42d12f7" containerName="nova-manage" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.551083 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="83433c2c-ddcb-4f8d-ba54-e3dda42d12f7" containerName="nova-manage" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.551357 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="83433c2c-ddcb-4f8d-ba54-e3dda42d12f7" containerName="nova-manage" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.551454 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="7861f8ee-6769-49fa-b7ab-33d6ebb50eee" containerName="nova-scheduler-scheduler" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.550841 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7861f8ee-6769-49fa-b7ab-33d6ebb50eee-config-data\") pod \"7861f8ee-6769-49fa-b7ab-33d6ebb50eee\" (UID: \"7861f8ee-6769-49fa-b7ab-33d6ebb50eee\") " Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.555239 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q7mzv" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.558299 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7861f8ee-6769-49fa-b7ab-33d6ebb50eee-kube-api-access-sklql" (OuterVolumeSpecName: "kube-api-access-sklql") pod "7861f8ee-6769-49fa-b7ab-33d6ebb50eee" (UID: "7861f8ee-6769-49fa-b7ab-33d6ebb50eee"). InnerVolumeSpecName "kube-api-access-sklql". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.562711 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q7mzv"] Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.590341 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7861f8ee-6769-49fa-b7ab-33d6ebb50eee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7861f8ee-6769-49fa-b7ab-33d6ebb50eee" (UID: "7861f8ee-6769-49fa-b7ab-33d6ebb50eee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.620008 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7861f8ee-6769-49fa-b7ab-33d6ebb50eee-config-data" (OuterVolumeSpecName: "config-data") pod "7861f8ee-6769-49fa-b7ab-33d6ebb50eee" (UID: "7861f8ee-6769-49fa-b7ab-33d6ebb50eee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.657335 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjhtn\" (UniqueName: \"kubernetes.io/projected/7e685daf-cafd-47fe-8640-34a654d4bb62-kube-api-access-hjhtn\") pod \"certified-operators-q7mzv\" (UID: \"7e685daf-cafd-47fe-8640-34a654d4bb62\") " pod="openshift-marketplace/certified-operators-q7mzv" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.657446 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e685daf-cafd-47fe-8640-34a654d4bb62-utilities\") pod \"certified-operators-q7mzv\" (UID: \"7e685daf-cafd-47fe-8640-34a654d4bb62\") " pod="openshift-marketplace/certified-operators-q7mzv" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.657643 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e685daf-cafd-47fe-8640-34a654d4bb62-catalog-content\") pod \"certified-operators-q7mzv\" (UID: \"7e685daf-cafd-47fe-8640-34a654d4bb62\") " pod="openshift-marketplace/certified-operators-q7mzv" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.657720 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7861f8ee-6769-49fa-b7ab-33d6ebb50eee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.657736 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sklql\" (UniqueName: \"kubernetes.io/projected/7861f8ee-6769-49fa-b7ab-33d6ebb50eee-kube-api-access-sklql\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.657751 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/7861f8ee-6769-49fa-b7ab-33d6ebb50eee-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.684872 4918 generic.go:334] "Generic (PLEG): container finished" podID="7861f8ee-6769-49fa-b7ab-33d6ebb50eee" containerID="5255d8ce79f76b5cbfc2464afb1d9521f2e258dda49013e6b173551ed1c40aae" exitCode=0 Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.684910 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.684978 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7861f8ee-6769-49fa-b7ab-33d6ebb50eee","Type":"ContainerDied","Data":"5255d8ce79f76b5cbfc2464afb1d9521f2e258dda49013e6b173551ed1c40aae"} Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.685087 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7861f8ee-6769-49fa-b7ab-33d6ebb50eee","Type":"ContainerDied","Data":"b58db43748df310e2b077fcecdcb25ba1b9b2fff34756a1fab9eebc4ffccb99f"} Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.685112 4918 scope.go:117] "RemoveContainer" containerID="5255d8ce79f76b5cbfc2464afb1d9521f2e258dda49013e6b173551ed1c40aae" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.692256 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-882pt" event={"ID":"2e859955-3304-49f6-a1e7-0f2d0eea96ca","Type":"ContainerStarted","Data":"73cde1511e58b19f02b459b8de9dcb67e15d7e1793e952d87c9b7fe5c78d85aa"} Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.714795 4918 scope.go:117] "RemoveContainer" containerID="5255d8ce79f76b5cbfc2464afb1d9521f2e258dda49013e6b173551ed1c40aae" Mar 19 17:03:30 crc kubenswrapper[4918]: E0319 17:03:30.715165 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"5255d8ce79f76b5cbfc2464afb1d9521f2e258dda49013e6b173551ed1c40aae\": container with ID starting with 5255d8ce79f76b5cbfc2464afb1d9521f2e258dda49013e6b173551ed1c40aae not found: ID does not exist" containerID="5255d8ce79f76b5cbfc2464afb1d9521f2e258dda49013e6b173551ed1c40aae" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.715203 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5255d8ce79f76b5cbfc2464afb1d9521f2e258dda49013e6b173551ed1c40aae"} err="failed to get container status \"5255d8ce79f76b5cbfc2464afb1d9521f2e258dda49013e6b173551ed1c40aae\": rpc error: code = NotFound desc = could not find container \"5255d8ce79f76b5cbfc2464afb1d9521f2e258dda49013e6b173551ed1c40aae\": container with ID starting with 5255d8ce79f76b5cbfc2464afb1d9521f2e258dda49013e6b173551ed1c40aae not found: ID does not exist" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.717940 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.729786 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.740451 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.742333 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.746139 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.752132 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.761621 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e685daf-cafd-47fe-8640-34a654d4bb62-catalog-content\") pod \"certified-operators-q7mzv\" (UID: \"7e685daf-cafd-47fe-8640-34a654d4bb62\") " pod="openshift-marketplace/certified-operators-q7mzv" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.761673 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/602c7fd6-d47b-4b05-880b-7a03afb02c49-config-data\") pod \"nova-scheduler-0\" (UID: \"602c7fd6-d47b-4b05-880b-7a03afb02c49\") " pod="openstack/nova-scheduler-0" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.761752 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c48wz\" (UniqueName: \"kubernetes.io/projected/602c7fd6-d47b-4b05-880b-7a03afb02c49-kube-api-access-c48wz\") pod \"nova-scheduler-0\" (UID: \"602c7fd6-d47b-4b05-880b-7a03afb02c49\") " pod="openstack/nova-scheduler-0" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.761861 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/602c7fd6-d47b-4b05-880b-7a03afb02c49-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"602c7fd6-d47b-4b05-880b-7a03afb02c49\") " pod="openstack/nova-scheduler-0" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 
17:03:30.761955 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjhtn\" (UniqueName: \"kubernetes.io/projected/7e685daf-cafd-47fe-8640-34a654d4bb62-kube-api-access-hjhtn\") pod \"certified-operators-q7mzv\" (UID: \"7e685daf-cafd-47fe-8640-34a654d4bb62\") " pod="openshift-marketplace/certified-operators-q7mzv" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.762049 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e685daf-cafd-47fe-8640-34a654d4bb62-utilities\") pod \"certified-operators-q7mzv\" (UID: \"7e685daf-cafd-47fe-8640-34a654d4bb62\") " pod="openshift-marketplace/certified-operators-q7mzv" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.767996 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e685daf-cafd-47fe-8640-34a654d4bb62-utilities\") pod \"certified-operators-q7mzv\" (UID: \"7e685daf-cafd-47fe-8640-34a654d4bb62\") " pod="openshift-marketplace/certified-operators-q7mzv" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.768712 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e685daf-cafd-47fe-8640-34a654d4bb62-catalog-content\") pod \"certified-operators-q7mzv\" (UID: \"7e685daf-cafd-47fe-8640-34a654d4bb62\") " pod="openshift-marketplace/certified-operators-q7mzv" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.788922 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjhtn\" (UniqueName: \"kubernetes.io/projected/7e685daf-cafd-47fe-8640-34a654d4bb62-kube-api-access-hjhtn\") pod \"certified-operators-q7mzv\" (UID: \"7e685daf-cafd-47fe-8640-34a654d4bb62\") " pod="openshift-marketplace/certified-operators-q7mzv" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.863986 4918 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/602c7fd6-d47b-4b05-880b-7a03afb02c49-config-data\") pod \"nova-scheduler-0\" (UID: \"602c7fd6-d47b-4b05-880b-7a03afb02c49\") " pod="openstack/nova-scheduler-0" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.864049 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c48wz\" (UniqueName: \"kubernetes.io/projected/602c7fd6-d47b-4b05-880b-7a03afb02c49-kube-api-access-c48wz\") pod \"nova-scheduler-0\" (UID: \"602c7fd6-d47b-4b05-880b-7a03afb02c49\") " pod="openstack/nova-scheduler-0" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.864103 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/602c7fd6-d47b-4b05-880b-7a03afb02c49-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"602c7fd6-d47b-4b05-880b-7a03afb02c49\") " pod="openstack/nova-scheduler-0" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.869141 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/602c7fd6-d47b-4b05-880b-7a03afb02c49-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"602c7fd6-d47b-4b05-880b-7a03afb02c49\") " pod="openstack/nova-scheduler-0" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.869285 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/602c7fd6-d47b-4b05-880b-7a03afb02c49-config-data\") pod \"nova-scheduler-0\" (UID: \"602c7fd6-d47b-4b05-880b-7a03afb02c49\") " pod="openstack/nova-scheduler-0" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.881127 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c48wz\" (UniqueName: 
\"kubernetes.io/projected/602c7fd6-d47b-4b05-880b-7a03afb02c49-kube-api-access-c48wz\") pod \"nova-scheduler-0\" (UID: \"602c7fd6-d47b-4b05-880b-7a03afb02c49\") " pod="openstack/nova-scheduler-0" Mar 19 17:03:30 crc kubenswrapper[4918]: I0319 17:03:30.938419 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q7mzv" Mar 19 17:03:31 crc kubenswrapper[4918]: I0319 17:03:31.080847 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 17:03:31 crc kubenswrapper[4918]: I0319 17:03:31.661286 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q7mzv"] Mar 19 17:03:31 crc kubenswrapper[4918]: I0319 17:03:31.706370 4918 generic.go:334] "Generic (PLEG): container finished" podID="2e859955-3304-49f6-a1e7-0f2d0eea96ca" containerID="73cde1511e58b19f02b459b8de9dcb67e15d7e1793e952d87c9b7fe5c78d85aa" exitCode=0 Mar 19 17:03:31 crc kubenswrapper[4918]: I0319 17:03:31.706431 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-882pt" event={"ID":"2e859955-3304-49f6-a1e7-0f2d0eea96ca","Type":"ContainerDied","Data":"73cde1511e58b19f02b459b8de9dcb67e15d7e1793e952d87c9b7fe5c78d85aa"} Mar 19 17:03:31 crc kubenswrapper[4918]: I0319 17:03:31.707817 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7mzv" event={"ID":"7e685daf-cafd-47fe-8640-34a654d4bb62","Type":"ContainerStarted","Data":"1902f2e6102d5ed4085aec38e2a257b90967565d5f9ea6fa5c5a0618108777db"} Mar 19 17:03:31 crc kubenswrapper[4918]: I0319 17:03:31.712153 4918 generic.go:334] "Generic (PLEG): container finished" podID="b6c864e7-f676-4c0d-894b-c31c175fccd2" containerID="8ffea604ee8670992f9c21a0358aa00f778c4ad7836fc1c4ce084b9c929d6415" exitCode=0 Mar 19 17:03:31 crc kubenswrapper[4918]: I0319 17:03:31.712203 4918 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-metadata-0" event={"ID":"b6c864e7-f676-4c0d-894b-c31c175fccd2","Type":"ContainerDied","Data":"8ffea604ee8670992f9c21a0358aa00f778c4ad7836fc1c4ce084b9c929d6415"} Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.081505 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.270450 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.300140 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6c864e7-f676-4c0d-894b-c31c175fccd2-config-data\") pod \"b6c864e7-f676-4c0d-894b-c31c175fccd2\" (UID: \"b6c864e7-f676-4c0d-894b-c31c175fccd2\") " Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.300225 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kqxn\" (UniqueName: \"kubernetes.io/projected/b6c864e7-f676-4c0d-894b-c31c175fccd2-kube-api-access-2kqxn\") pod \"b6c864e7-f676-4c0d-894b-c31c175fccd2\" (UID: \"b6c864e7-f676-4c0d-894b-c31c175fccd2\") " Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.300250 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6c864e7-f676-4c0d-894b-c31c175fccd2-logs\") pod \"b6c864e7-f676-4c0d-894b-c31c175fccd2\" (UID: \"b6c864e7-f676-4c0d-894b-c31c175fccd2\") " Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.300357 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c864e7-f676-4c0d-894b-c31c175fccd2-combined-ca-bundle\") pod \"b6c864e7-f676-4c0d-894b-c31c175fccd2\" (UID: \"b6c864e7-f676-4c0d-894b-c31c175fccd2\") " Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.300471 4918 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6c864e7-f676-4c0d-894b-c31c175fccd2-nova-metadata-tls-certs\") pod \"b6c864e7-f676-4c0d-894b-c31c175fccd2\" (UID: \"b6c864e7-f676-4c0d-894b-c31c175fccd2\") " Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.311816 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6c864e7-f676-4c0d-894b-c31c175fccd2-kube-api-access-2kqxn" (OuterVolumeSpecName: "kube-api-access-2kqxn") pod "b6c864e7-f676-4c0d-894b-c31c175fccd2" (UID: "b6c864e7-f676-4c0d-894b-c31c175fccd2"). InnerVolumeSpecName "kube-api-access-2kqxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.314512 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6c864e7-f676-4c0d-894b-c31c175fccd2-logs" (OuterVolumeSpecName: "logs") pod "b6c864e7-f676-4c0d-894b-c31c175fccd2" (UID: "b6c864e7-f676-4c0d-894b-c31c175fccd2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.341094 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6c864e7-f676-4c0d-894b-c31c175fccd2-config-data" (OuterVolumeSpecName: "config-data") pod "b6c864e7-f676-4c0d-894b-c31c175fccd2" (UID: "b6c864e7-f676-4c0d-894b-c31c175fccd2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.403559 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6c864e7-f676-4c0d-894b-c31c175fccd2-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.403590 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kqxn\" (UniqueName: \"kubernetes.io/projected/b6c864e7-f676-4c0d-894b-c31c175fccd2-kube-api-access-2kqxn\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.403601 4918 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6c864e7-f676-4c0d-894b-c31c175fccd2-logs\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.489487 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6c864e7-f676-4c0d-894b-c31c175fccd2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6c864e7-f676-4c0d-894b-c31c175fccd2" (UID: "b6c864e7-f676-4c0d-894b-c31c175fccd2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.505765 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c864e7-f676-4c0d-894b-c31c175fccd2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.530175 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6c864e7-f676-4c0d-894b-c31c175fccd2-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b6c864e7-f676-4c0d-894b-c31c175fccd2" (UID: "b6c864e7-f676-4c0d-894b-c31c175fccd2"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.599285 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7861f8ee-6769-49fa-b7ab-33d6ebb50eee" path="/var/lib/kubelet/pods/7861f8ee-6769-49fa-b7ab-33d6ebb50eee/volumes" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.607535 4918 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6c864e7-f676-4c0d-894b-c31c175fccd2-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.724581 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"602c7fd6-d47b-4b05-880b-7a03afb02c49","Type":"ContainerStarted","Data":"1a3a98f5af64db0b4660a0c07e586eb2d65950ea7098f40ab53597f22982e8e7"} Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.724857 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"602c7fd6-d47b-4b05-880b-7a03afb02c49","Type":"ContainerStarted","Data":"c48827bb5133137a97c7445669fa7617288db8e272ec00b1f6f55ac4cf5ab815"} Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.728663 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-882pt" event={"ID":"2e859955-3304-49f6-a1e7-0f2d0eea96ca","Type":"ContainerStarted","Data":"b8cca7800568c0481ff45a44147286320c155b527f826878f72f78a8d4734516"} Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.730711 4918 generic.go:334] "Generic (PLEG): container finished" podID="7e685daf-cafd-47fe-8640-34a654d4bb62" containerID="3c673fb9bcd1a7fa6d7a90989dbd00542016118e90dbbc46e7a2b2e5a269ee22" exitCode=0 Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.730841 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7mzv" 
event={"ID":"7e685daf-cafd-47fe-8640-34a654d4bb62","Type":"ContainerDied","Data":"3c673fb9bcd1a7fa6d7a90989dbd00542016118e90dbbc46e7a2b2e5a269ee22"} Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.733294 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6c864e7-f676-4c0d-894b-c31c175fccd2","Type":"ContainerDied","Data":"23199fbd7c22d1dc4e4d2ddcae5322966d9c310b55587c8526117d149618d677"} Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.733331 4918 scope.go:117] "RemoveContainer" containerID="8ffea604ee8670992f9c21a0358aa00f778c4ad7836fc1c4ce084b9c929d6415" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.733424 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.740468 4918 generic.go:334] "Generic (PLEG): container finished" podID="4755aa2d-fc74-4d06-a4cc-9a185df5068c" containerID="3808d87ac79a77d49e39b5b5281130f71ef37c9240f472174ad713b9edb96034" exitCode=0 Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.740504 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4755aa2d-fc74-4d06-a4cc-9a185df5068c","Type":"ContainerDied","Data":"3808d87ac79a77d49e39b5b5281130f71ef37c9240f472174ad713b9edb96034"} Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.753178 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.753157233 podStartE2EDuration="2.753157233s" podCreationTimestamp="2026-03-19 17:03:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:03:32.742750319 +0000 UTC m=+1424.864949567" watchObservedRunningTime="2026-03-19 17:03:32.753157233 +0000 UTC m=+1424.875356481" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.781542 4918 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-882pt" podStartSLOduration=3.222182551 podStartE2EDuration="6.781510997s" podCreationTimestamp="2026-03-19 17:03:26 +0000 UTC" firstStartedPulling="2026-03-19 17:03:28.660123539 +0000 UTC m=+1420.782322787" lastFinishedPulling="2026-03-19 17:03:32.219451985 +0000 UTC m=+1424.341651233" observedRunningTime="2026-03-19 17:03:32.769401797 +0000 UTC m=+1424.891601045" watchObservedRunningTime="2026-03-19 17:03:32.781510997 +0000 UTC m=+1424.903710245" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.788937 4918 scope.go:117] "RemoveContainer" containerID="1f2a5b41eac6f7be33f5468c45045c789902f582422311f5dbe3dc455552bf73" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.797923 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.815192 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.829593 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 19 17:03:32 crc kubenswrapper[4918]: E0319 17:03:32.830107 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6c864e7-f676-4c0d-894b-c31c175fccd2" containerName="nova-metadata-log" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.830118 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6c864e7-f676-4c0d-894b-c31c175fccd2" containerName="nova-metadata-log" Mar 19 17:03:32 crc kubenswrapper[4918]: E0319 17:03:32.830136 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6c864e7-f676-4c0d-894b-c31c175fccd2" containerName="nova-metadata-metadata" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.830142 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6c864e7-f676-4c0d-894b-c31c175fccd2" 
containerName="nova-metadata-metadata" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.830358 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6c864e7-f676-4c0d-894b-c31c175fccd2" containerName="nova-metadata-log" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.830377 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6c864e7-f676-4c0d-894b-c31c175fccd2" containerName="nova-metadata-metadata" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.831556 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.840289 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.843610 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.856005 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.919953 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2491f737-beec-4148-b143-1c83527b477a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2491f737-beec-4148-b143-1c83527b477a\") " pod="openstack/nova-metadata-0" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.920118 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2491f737-beec-4148-b143-1c83527b477a-config-data\") pod \"nova-metadata-0\" (UID: \"2491f737-beec-4148-b143-1c83527b477a\") " pod="openstack/nova-metadata-0" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.920178 4918 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2491f737-beec-4148-b143-1c83527b477a-logs\") pod \"nova-metadata-0\" (UID: \"2491f737-beec-4148-b143-1c83527b477a\") " pod="openstack/nova-metadata-0" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.920202 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2491f737-beec-4148-b143-1c83527b477a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2491f737-beec-4148-b143-1c83527b477a\") " pod="openstack/nova-metadata-0" Mar 19 17:03:32 crc kubenswrapper[4918]: I0319 17:03:32.920369 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrzht\" (UniqueName: \"kubernetes.io/projected/2491f737-beec-4148-b143-1c83527b477a-kube-api-access-nrzht\") pod \"nova-metadata-0\" (UID: \"2491f737-beec-4148-b143-1c83527b477a\") " pod="openstack/nova-metadata-0" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.021816 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrzht\" (UniqueName: \"kubernetes.io/projected/2491f737-beec-4148-b143-1c83527b477a-kube-api-access-nrzht\") pod \"nova-metadata-0\" (UID: \"2491f737-beec-4148-b143-1c83527b477a\") " pod="openstack/nova-metadata-0" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.021917 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2491f737-beec-4148-b143-1c83527b477a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2491f737-beec-4148-b143-1c83527b477a\") " pod="openstack/nova-metadata-0" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.022019 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2491f737-beec-4148-b143-1c83527b477a-config-data\") pod \"nova-metadata-0\" (UID: \"2491f737-beec-4148-b143-1c83527b477a\") " pod="openstack/nova-metadata-0" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.022070 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2491f737-beec-4148-b143-1c83527b477a-logs\") pod \"nova-metadata-0\" (UID: \"2491f737-beec-4148-b143-1c83527b477a\") " pod="openstack/nova-metadata-0" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.022096 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2491f737-beec-4148-b143-1c83527b477a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2491f737-beec-4148-b143-1c83527b477a\") " pod="openstack/nova-metadata-0" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.023513 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2491f737-beec-4148-b143-1c83527b477a-logs\") pod \"nova-metadata-0\" (UID: \"2491f737-beec-4148-b143-1c83527b477a\") " pod="openstack/nova-metadata-0" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.027799 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2491f737-beec-4148-b143-1c83527b477a-config-data\") pod \"nova-metadata-0\" (UID: \"2491f737-beec-4148-b143-1c83527b477a\") " pod="openstack/nova-metadata-0" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.030104 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2491f737-beec-4148-b143-1c83527b477a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2491f737-beec-4148-b143-1c83527b477a\") " pod="openstack/nova-metadata-0" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 
17:03:33.032491 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2491f737-beec-4148-b143-1c83527b477a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2491f737-beec-4148-b143-1c83527b477a\") " pod="openstack/nova-metadata-0" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.043913 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrzht\" (UniqueName: \"kubernetes.io/projected/2491f737-beec-4148-b143-1c83527b477a-kube-api-access-nrzht\") pod \"nova-metadata-0\" (UID: \"2491f737-beec-4148-b143-1c83527b477a\") " pod="openstack/nova-metadata-0" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.181201 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.329115 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.431887 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4755aa2d-fc74-4d06-a4cc-9a185df5068c-internal-tls-certs\") pod \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\" (UID: \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\") " Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.431959 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4755aa2d-fc74-4d06-a4cc-9a185df5068c-combined-ca-bundle\") pod \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\" (UID: \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\") " Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.432013 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4755aa2d-fc74-4d06-a4cc-9a185df5068c-config-data\") pod 
\"4755aa2d-fc74-4d06-a4cc-9a185df5068c\" (UID: \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\") " Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.432214 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4755aa2d-fc74-4d06-a4cc-9a185df5068c-logs\") pod \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\" (UID: \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\") " Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.432301 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xb58\" (UniqueName: \"kubernetes.io/projected/4755aa2d-fc74-4d06-a4cc-9a185df5068c-kube-api-access-7xb58\") pod \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\" (UID: \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\") " Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.432334 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4755aa2d-fc74-4d06-a4cc-9a185df5068c-public-tls-certs\") pod \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\" (UID: \"4755aa2d-fc74-4d06-a4cc-9a185df5068c\") " Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.434023 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4755aa2d-fc74-4d06-a4cc-9a185df5068c-logs" (OuterVolumeSpecName: "logs") pod "4755aa2d-fc74-4d06-a4cc-9a185df5068c" (UID: "4755aa2d-fc74-4d06-a4cc-9a185df5068c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.440961 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4755aa2d-fc74-4d06-a4cc-9a185df5068c-kube-api-access-7xb58" (OuterVolumeSpecName: "kube-api-access-7xb58") pod "4755aa2d-fc74-4d06-a4cc-9a185df5068c" (UID: "4755aa2d-fc74-4d06-a4cc-9a185df5068c"). InnerVolumeSpecName "kube-api-access-7xb58". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.475185 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4755aa2d-fc74-4d06-a4cc-9a185df5068c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4755aa2d-fc74-4d06-a4cc-9a185df5068c" (UID: "4755aa2d-fc74-4d06-a4cc-9a185df5068c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.484746 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4755aa2d-fc74-4d06-a4cc-9a185df5068c-config-data" (OuterVolumeSpecName: "config-data") pod "4755aa2d-fc74-4d06-a4cc-9a185df5068c" (UID: "4755aa2d-fc74-4d06-a4cc-9a185df5068c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.535948 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4755aa2d-fc74-4d06-a4cc-9a185df5068c-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.535978 4918 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4755aa2d-fc74-4d06-a4cc-9a185df5068c-logs\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.535991 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xb58\" (UniqueName: \"kubernetes.io/projected/4755aa2d-fc74-4d06-a4cc-9a185df5068c-kube-api-access-7xb58\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.536001 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4755aa2d-fc74-4d06-a4cc-9a185df5068c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 
17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.539852 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4755aa2d-fc74-4d06-a4cc-9a185df5068c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4755aa2d-fc74-4d06-a4cc-9a185df5068c" (UID: "4755aa2d-fc74-4d06-a4cc-9a185df5068c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.547690 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4755aa2d-fc74-4d06-a4cc-9a185df5068c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4755aa2d-fc74-4d06-a4cc-9a185df5068c" (UID: "4755aa2d-fc74-4d06-a4cc-9a185df5068c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.637887 4918 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4755aa2d-fc74-4d06-a4cc-9a185df5068c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.637911 4918 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4755aa2d-fc74-4d06-a4cc-9a185df5068c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.704613 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 17:03:33 crc kubenswrapper[4918]: W0319 17:03:33.705551 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2491f737_beec_4148_b143_1c83527b477a.slice/crio-ba58b2e1f49d3d1ed265f663b6f4655c06b4c4151d65b69e5bd43bad10336881 WatchSource:0}: Error finding container ba58b2e1f49d3d1ed265f663b6f4655c06b4c4151d65b69e5bd43bad10336881: Status 
404 returned error can't find the container with id ba58b2e1f49d3d1ed265f663b6f4655c06b4c4151d65b69e5bd43bad10336881 Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.755461 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4755aa2d-fc74-4d06-a4cc-9a185df5068c","Type":"ContainerDied","Data":"26469251e5d0c033ab9cc2388811d13575817a7cb4c60ed4d70689670d5a9063"} Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.755570 4918 scope.go:117] "RemoveContainer" containerID="3808d87ac79a77d49e39b5b5281130f71ef37c9240f472174ad713b9edb96034" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.755696 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.762271 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2491f737-beec-4148-b143-1c83527b477a","Type":"ContainerStarted","Data":"ba58b2e1f49d3d1ed265f663b6f4655c06b4c4151d65b69e5bd43bad10336881"} Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.796565 4918 scope.go:117] "RemoveContainer" containerID="811b55828b1016901fa27a953a00d10f187168686182e4842c96e9065a56807e" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.824625 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.842443 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.871619 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 19 17:03:33 crc kubenswrapper[4918]: E0319 17:03:33.872257 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4755aa2d-fc74-4d06-a4cc-9a185df5068c" containerName="nova-api-log" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.872279 4918 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4755aa2d-fc74-4d06-a4cc-9a185df5068c" containerName="nova-api-log" Mar 19 17:03:33 crc kubenswrapper[4918]: E0319 17:03:33.872301 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4755aa2d-fc74-4d06-a4cc-9a185df5068c" containerName="nova-api-api" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.872309 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="4755aa2d-fc74-4d06-a4cc-9a185df5068c" containerName="nova-api-api" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.872684 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="4755aa2d-fc74-4d06-a4cc-9a185df5068c" containerName="nova-api-api" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.872743 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="4755aa2d-fc74-4d06-a4cc-9a185df5068c" containerName="nova-api-log" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.874217 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.878006 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.878166 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.878268 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.888681 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.945725 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c14a1ce8-e827-4652-9a21-43d9cbcbac47-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"c14a1ce8-e827-4652-9a21-43d9cbcbac47\") " pod="openstack/nova-api-0" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.945817 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zt68\" (UniqueName: \"kubernetes.io/projected/c14a1ce8-e827-4652-9a21-43d9cbcbac47-kube-api-access-9zt68\") pod \"nova-api-0\" (UID: \"c14a1ce8-e827-4652-9a21-43d9cbcbac47\") " pod="openstack/nova-api-0" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.945841 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c14a1ce8-e827-4652-9a21-43d9cbcbac47-logs\") pod \"nova-api-0\" (UID: \"c14a1ce8-e827-4652-9a21-43d9cbcbac47\") " pod="openstack/nova-api-0" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.945857 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c14a1ce8-e827-4652-9a21-43d9cbcbac47-public-tls-certs\") pod \"nova-api-0\" (UID: \"c14a1ce8-e827-4652-9a21-43d9cbcbac47\") " pod="openstack/nova-api-0" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.945880 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c14a1ce8-e827-4652-9a21-43d9cbcbac47-config-data\") pod \"nova-api-0\" (UID: \"c14a1ce8-e827-4652-9a21-43d9cbcbac47\") " pod="openstack/nova-api-0" Mar 19 17:03:33 crc kubenswrapper[4918]: I0319 17:03:33.945932 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c14a1ce8-e827-4652-9a21-43d9cbcbac47-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c14a1ce8-e827-4652-9a21-43d9cbcbac47\") " pod="openstack/nova-api-0" Mar 19 17:03:34 crc kubenswrapper[4918]: I0319 17:03:34.050779 4918 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zt68\" (UniqueName: \"kubernetes.io/projected/c14a1ce8-e827-4652-9a21-43d9cbcbac47-kube-api-access-9zt68\") pod \"nova-api-0\" (UID: \"c14a1ce8-e827-4652-9a21-43d9cbcbac47\") " pod="openstack/nova-api-0" Mar 19 17:03:34 crc kubenswrapper[4918]: I0319 17:03:34.051139 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c14a1ce8-e827-4652-9a21-43d9cbcbac47-logs\") pod \"nova-api-0\" (UID: \"c14a1ce8-e827-4652-9a21-43d9cbcbac47\") " pod="openstack/nova-api-0" Mar 19 17:03:34 crc kubenswrapper[4918]: I0319 17:03:34.051170 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c14a1ce8-e827-4652-9a21-43d9cbcbac47-public-tls-certs\") pod \"nova-api-0\" (UID: \"c14a1ce8-e827-4652-9a21-43d9cbcbac47\") " pod="openstack/nova-api-0" Mar 19 17:03:34 crc kubenswrapper[4918]: I0319 17:03:34.051201 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c14a1ce8-e827-4652-9a21-43d9cbcbac47-config-data\") pod \"nova-api-0\" (UID: \"c14a1ce8-e827-4652-9a21-43d9cbcbac47\") " pod="openstack/nova-api-0" Mar 19 17:03:34 crc kubenswrapper[4918]: I0319 17:03:34.051297 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c14a1ce8-e827-4652-9a21-43d9cbcbac47-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c14a1ce8-e827-4652-9a21-43d9cbcbac47\") " pod="openstack/nova-api-0" Mar 19 17:03:34 crc kubenswrapper[4918]: I0319 17:03:34.051449 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c14a1ce8-e827-4652-9a21-43d9cbcbac47-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"c14a1ce8-e827-4652-9a21-43d9cbcbac47\") " pod="openstack/nova-api-0" Mar 19 17:03:34 crc kubenswrapper[4918]: I0319 17:03:34.052759 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c14a1ce8-e827-4652-9a21-43d9cbcbac47-logs\") pod \"nova-api-0\" (UID: \"c14a1ce8-e827-4652-9a21-43d9cbcbac47\") " pod="openstack/nova-api-0" Mar 19 17:03:34 crc kubenswrapper[4918]: I0319 17:03:34.057513 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c14a1ce8-e827-4652-9a21-43d9cbcbac47-public-tls-certs\") pod \"nova-api-0\" (UID: \"c14a1ce8-e827-4652-9a21-43d9cbcbac47\") " pod="openstack/nova-api-0" Mar 19 17:03:34 crc kubenswrapper[4918]: I0319 17:03:34.058265 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c14a1ce8-e827-4652-9a21-43d9cbcbac47-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c14a1ce8-e827-4652-9a21-43d9cbcbac47\") " pod="openstack/nova-api-0" Mar 19 17:03:34 crc kubenswrapper[4918]: I0319 17:03:34.074929 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c14a1ce8-e827-4652-9a21-43d9cbcbac47-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c14a1ce8-e827-4652-9a21-43d9cbcbac47\") " pod="openstack/nova-api-0" Mar 19 17:03:34 crc kubenswrapper[4918]: I0319 17:03:34.075417 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zt68\" (UniqueName: \"kubernetes.io/projected/c14a1ce8-e827-4652-9a21-43d9cbcbac47-kube-api-access-9zt68\") pod \"nova-api-0\" (UID: \"c14a1ce8-e827-4652-9a21-43d9cbcbac47\") " pod="openstack/nova-api-0" Mar 19 17:03:34 crc kubenswrapper[4918]: I0319 17:03:34.080824 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c14a1ce8-e827-4652-9a21-43d9cbcbac47-config-data\") pod \"nova-api-0\" (UID: \"c14a1ce8-e827-4652-9a21-43d9cbcbac47\") " pod="openstack/nova-api-0" Mar 19 17:03:34 crc kubenswrapper[4918]: I0319 17:03:34.195435 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 17:03:34 crc kubenswrapper[4918]: I0319 17:03:34.598931 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4755aa2d-fc74-4d06-a4cc-9a185df5068c" path="/var/lib/kubelet/pods/4755aa2d-fc74-4d06-a4cc-9a185df5068c/volumes" Mar 19 17:03:34 crc kubenswrapper[4918]: I0319 17:03:34.599773 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6c864e7-f676-4c0d-894b-c31c175fccd2" path="/var/lib/kubelet/pods/b6c864e7-f676-4c0d-894b-c31c175fccd2/volumes" Mar 19 17:03:34 crc kubenswrapper[4918]: I0319 17:03:34.774421 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 17:03:34 crc kubenswrapper[4918]: I0319 17:03:34.779406 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2491f737-beec-4148-b143-1c83527b477a","Type":"ContainerStarted","Data":"4fb14ddea621f5ddb6828ea9eef024f963f34050dd34a4cd213a53c297e546d0"} Mar 19 17:03:34 crc kubenswrapper[4918]: I0319 17:03:34.779461 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2491f737-beec-4148-b143-1c83527b477a","Type":"ContainerStarted","Data":"44b2bf4855c152d99be0c510099116e28fee3a0587332f0bda98b25f78ea4d51"} Mar 19 17:03:34 crc kubenswrapper[4918]: W0319 17:03:34.780881 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc14a1ce8_e827_4652_9a21_43d9cbcbac47.slice/crio-e5d948ac3904784eed8e82089a2cd23efb42f526ef0dcc1527c6f11b964ab365 WatchSource:0}: Error finding container 
e5d948ac3904784eed8e82089a2cd23efb42f526ef0dcc1527c6f11b964ab365: Status 404 returned error can't find the container with id e5d948ac3904784eed8e82089a2cd23efb42f526ef0dcc1527c6f11b964ab365 Mar 19 17:03:34 crc kubenswrapper[4918]: I0319 17:03:34.782434 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7mzv" event={"ID":"7e685daf-cafd-47fe-8640-34a654d4bb62","Type":"ContainerStarted","Data":"275e5bdf4abf8a7d861a987ad9ee70a95d3c81f9a2f989c0de47da32f9606059"} Mar 19 17:03:34 crc kubenswrapper[4918]: I0319 17:03:34.807787 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.8077690459999998 podStartE2EDuration="2.807769046s" podCreationTimestamp="2026-03-19 17:03:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:03:34.801256659 +0000 UTC m=+1426.923455917" watchObservedRunningTime="2026-03-19 17:03:34.807769046 +0000 UTC m=+1426.929968294" Mar 19 17:03:35 crc kubenswrapper[4918]: I0319 17:03:35.795275 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c14a1ce8-e827-4652-9a21-43d9cbcbac47","Type":"ContainerStarted","Data":"08befb40fc72544b1d53128c388ca27a94deb2694dc5b949a38d4d6027a64b64"} Mar 19 17:03:35 crc kubenswrapper[4918]: I0319 17:03:35.795619 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c14a1ce8-e827-4652-9a21-43d9cbcbac47","Type":"ContainerStarted","Data":"c6720996dacf739ade24b022ed717bb04cd868a037550d2562033114a38b63e3"} Mar 19 17:03:35 crc kubenswrapper[4918]: I0319 17:03:35.795635 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c14a1ce8-e827-4652-9a21-43d9cbcbac47","Type":"ContainerStarted","Data":"e5d948ac3904784eed8e82089a2cd23efb42f526ef0dcc1527c6f11b964ab365"} Mar 19 17:03:35 crc 
kubenswrapper[4918]: I0319 17:03:35.815797 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.81577214 podStartE2EDuration="2.81577214s" podCreationTimestamp="2026-03-19 17:03:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:03:35.811908876 +0000 UTC m=+1427.934108124" watchObservedRunningTime="2026-03-19 17:03:35.81577214 +0000 UTC m=+1427.937971388" Mar 19 17:03:36 crc kubenswrapper[4918]: I0319 17:03:36.081191 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 19 17:03:36 crc kubenswrapper[4918]: I0319 17:03:36.871783 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-882pt" Mar 19 17:03:36 crc kubenswrapper[4918]: I0319 17:03:36.872084 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-882pt" Mar 19 17:03:36 crc kubenswrapper[4918]: I0319 17:03:36.939254 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-882pt" Mar 19 17:03:37 crc kubenswrapper[4918]: I0319 17:03:37.820984 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e","Type":"ContainerStarted","Data":"96b2faf2c6d969661c1d36da69640bb464f5e8b455c6dc0ad880aeb7edd9631b"} Mar 19 17:03:37 crc kubenswrapper[4918]: I0319 17:03:37.821055 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" containerName="proxy-httpd" containerID="cri-o://96b2faf2c6d969661c1d36da69640bb464f5e8b455c6dc0ad880aeb7edd9631b" gracePeriod=30 Mar 19 17:03:37 crc kubenswrapper[4918]: I0319 17:03:37.821152 4918 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" containerName="ceilometer-notification-agent" containerID="cri-o://1f95ac2f8d659d5778fc3ed0566cf1732b0a27c48ffa7197c132a9d7580f80fb" gracePeriod=30 Mar 19 17:03:37 crc kubenswrapper[4918]: I0319 17:03:37.821156 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" containerName="sg-core" containerID="cri-o://e263df42abda3a1af695f033f9759b1e6d27deeb988c1bdf164e6cc66fe785bd" gracePeriod=30 Mar 19 17:03:37 crc kubenswrapper[4918]: I0319 17:03:37.821510 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 17:03:37 crc kubenswrapper[4918]: I0319 17:03:37.823346 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" containerName="ceilometer-central-agent" containerID="cri-o://8283b0349263f00dfd0c1051342e78e13c1311b680db3a95885f5d2ffbb5c3d2" gracePeriod=30 Mar 19 17:03:37 crc kubenswrapper[4918]: I0319 17:03:37.859913 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.527355977 podStartE2EDuration="24.858587752s" podCreationTimestamp="2026-03-19 17:03:13 +0000 UTC" firstStartedPulling="2026-03-19 17:03:14.442983487 +0000 UTC m=+1406.565182735" lastFinishedPulling="2026-03-19 17:03:36.774215262 +0000 UTC m=+1428.896414510" observedRunningTime="2026-03-19 17:03:37.84935401 +0000 UTC m=+1429.971553268" watchObservedRunningTime="2026-03-19 17:03:37.858587752 +0000 UTC m=+1429.980787000" Mar 19 17:03:37 crc kubenswrapper[4918]: I0319 17:03:37.905025 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-882pt" Mar 19 17:03:38 crc kubenswrapper[4918]: I0319 
17:03:38.832381 4918 generic.go:334] "Generic (PLEG): container finished" podID="a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" containerID="96b2faf2c6d969661c1d36da69640bb464f5e8b455c6dc0ad880aeb7edd9631b" exitCode=0 Mar 19 17:03:38 crc kubenswrapper[4918]: I0319 17:03:38.832409 4918 generic.go:334] "Generic (PLEG): container finished" podID="a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" containerID="e263df42abda3a1af695f033f9759b1e6d27deeb988c1bdf164e6cc66fe785bd" exitCode=2 Mar 19 17:03:38 crc kubenswrapper[4918]: I0319 17:03:38.832417 4918 generic.go:334] "Generic (PLEG): container finished" podID="a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" containerID="8283b0349263f00dfd0c1051342e78e13c1311b680db3a95885f5d2ffbb5c3d2" exitCode=0 Mar 19 17:03:38 crc kubenswrapper[4918]: I0319 17:03:38.832466 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e","Type":"ContainerDied","Data":"96b2faf2c6d969661c1d36da69640bb464f5e8b455c6dc0ad880aeb7edd9631b"} Mar 19 17:03:38 crc kubenswrapper[4918]: I0319 17:03:38.832531 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e","Type":"ContainerDied","Data":"e263df42abda3a1af695f033f9759b1e6d27deeb988c1bdf164e6cc66fe785bd"} Mar 19 17:03:38 crc kubenswrapper[4918]: I0319 17:03:38.832550 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e","Type":"ContainerDied","Data":"8283b0349263f00dfd0c1051342e78e13c1311b680db3a95885f5d2ffbb5c3d2"} Mar 19 17:03:38 crc kubenswrapper[4918]: I0319 17:03:38.834597 4918 generic.go:334] "Generic (PLEG): container finished" podID="7e685daf-cafd-47fe-8640-34a654d4bb62" containerID="275e5bdf4abf8a7d861a987ad9ee70a95d3c81f9a2f989c0de47da32f9606059" exitCode=0 Mar 19 17:03:38 crc kubenswrapper[4918]: I0319 17:03:38.834674 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-q7mzv" event={"ID":"7e685daf-cafd-47fe-8640-34a654d4bb62","Type":"ContainerDied","Data":"275e5bdf4abf8a7d861a987ad9ee70a95d3c81f9a2f989c0de47da32f9606059"} Mar 19 17:03:39 crc kubenswrapper[4918]: I0319 17:03:39.325934 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-882pt"] Mar 19 17:03:39 crc kubenswrapper[4918]: I0319 17:03:39.737601 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:03:39 crc kubenswrapper[4918]: I0319 17:03:39.847145 4918 generic.go:334] "Generic (PLEG): container finished" podID="a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" containerID="1f95ac2f8d659d5778fc3ed0566cf1732b0a27c48ffa7197c132a9d7580f80fb" exitCode=0 Mar 19 17:03:39 crc kubenswrapper[4918]: I0319 17:03:39.847215 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e","Type":"ContainerDied","Data":"1f95ac2f8d659d5778fc3ed0566cf1732b0a27c48ffa7197c132a9d7580f80fb"} Mar 19 17:03:39 crc kubenswrapper[4918]: I0319 17:03:39.847283 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e","Type":"ContainerDied","Data":"49eacfec3c64fb5adf21222e8e3d25c3837769eeefdf8f893c363b6e49acc5a6"} Mar 19 17:03:39 crc kubenswrapper[4918]: I0319 17:03:39.847313 4918 scope.go:117] "RemoveContainer" containerID="96b2faf2c6d969661c1d36da69640bb464f5e8b455c6dc0ad880aeb7edd9631b" Mar 19 17:03:39 crc kubenswrapper[4918]: I0319 17:03:39.847398 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-882pt" podUID="2e859955-3304-49f6-a1e7-0f2d0eea96ca" containerName="registry-server" containerID="cri-o://b8cca7800568c0481ff45a44147286320c155b527f826878f72f78a8d4734516" gracePeriod=2 Mar 19 17:03:39 crc kubenswrapper[4918]: I0319 
17:03:39.847581 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:03:39 crc kubenswrapper[4918]: I0319 17:03:39.871159 4918 scope.go:117] "RemoveContainer" containerID="e263df42abda3a1af695f033f9759b1e6d27deeb988c1bdf164e6cc66fe785bd" Mar 19 17:03:39 crc kubenswrapper[4918]: I0319 17:03:39.894215 4918 scope.go:117] "RemoveContainer" containerID="1f95ac2f8d659d5778fc3ed0566cf1732b0a27c48ffa7197c132a9d7580f80fb" Mar 19 17:03:39 crc kubenswrapper[4918]: I0319 17:03:39.909264 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-config-data\") pod \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " Mar 19 17:03:39 crc kubenswrapper[4918]: I0319 17:03:39.909501 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-combined-ca-bundle\") pod \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " Mar 19 17:03:39 crc kubenswrapper[4918]: I0319 17:03:39.909686 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-scripts\") pod \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " Mar 19 17:03:39 crc kubenswrapper[4918]: I0319 17:03:39.909868 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-log-httpd\") pod \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " Mar 19 17:03:39 crc kubenswrapper[4918]: I0319 17:03:39.910024 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-ceilometer-tls-certs\") pod \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " Mar 19 17:03:39 crc kubenswrapper[4918]: I0319 17:03:39.910212 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" (UID: "a1e607ec-e059-4abf-ad48-7e0a9a39dc3e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:03:39 crc kubenswrapper[4918]: I0319 17:03:39.910351 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgrk2\" (UniqueName: \"kubernetes.io/projected/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-kube-api-access-pgrk2\") pod \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " Mar 19 17:03:39 crc kubenswrapper[4918]: I0319 17:03:39.910493 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-run-httpd\") pod \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " Mar 19 17:03:39 crc kubenswrapper[4918]: I0319 17:03:39.910614 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-sg-core-conf-yaml\") pod \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\" (UID: \"a1e607ec-e059-4abf-ad48-7e0a9a39dc3e\") " Mar 19 17:03:39 crc kubenswrapper[4918]: I0319 17:03:39.911241 4918 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 
17:03:39 crc kubenswrapper[4918]: I0319 17:03:39.912143 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" (UID: "a1e607ec-e059-4abf-ad48-7e0a9a39dc3e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:03:39 crc kubenswrapper[4918]: I0319 17:03:39.918809 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-kube-api-access-pgrk2" (OuterVolumeSpecName: "kube-api-access-pgrk2") pod "a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" (UID: "a1e607ec-e059-4abf-ad48-7e0a9a39dc3e"). InnerVolumeSpecName "kube-api-access-pgrk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:03:39 crc kubenswrapper[4918]: I0319 17:03:39.921843 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-scripts" (OuterVolumeSpecName: "scripts") pod "a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" (UID: "a1e607ec-e059-4abf-ad48-7e0a9a39dc3e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:03:39 crc kubenswrapper[4918]: I0319 17:03:39.950431 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" (UID: "a1e607ec-e059-4abf-ad48-7e0a9a39dc3e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.003715 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" (UID: "a1e607ec-e059-4abf-ad48-7e0a9a39dc3e"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.019458 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgrk2\" (UniqueName: \"kubernetes.io/projected/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-kube-api-access-pgrk2\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.019505 4918 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.019534 4918 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.019548 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.019560 4918 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.039824 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" (UID: "a1e607ec-e059-4abf-ad48-7e0a9a39dc3e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.124774 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.128117 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-config-data" (OuterVolumeSpecName: "config-data") pod "a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" (UID: "a1e607ec-e059-4abf-ad48-7e0a9a39dc3e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.160969 4918 scope.go:117] "RemoveContainer" containerID="8283b0349263f00dfd0c1051342e78e13c1311b680db3a95885f5d2ffbb5c3d2" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.226558 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.226936 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.233996 4918 scope.go:117] "RemoveContainer" containerID="96b2faf2c6d969661c1d36da69640bb464f5e8b455c6dc0ad880aeb7edd9631b" Mar 19 17:03:40 crc kubenswrapper[4918]: E0319 17:03:40.242641 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"96b2faf2c6d969661c1d36da69640bb464f5e8b455c6dc0ad880aeb7edd9631b\": container with ID starting with 96b2faf2c6d969661c1d36da69640bb464f5e8b455c6dc0ad880aeb7edd9631b not found: ID does not exist" containerID="96b2faf2c6d969661c1d36da69640bb464f5e8b455c6dc0ad880aeb7edd9631b" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.242678 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96b2faf2c6d969661c1d36da69640bb464f5e8b455c6dc0ad880aeb7edd9631b"} err="failed to get container status \"96b2faf2c6d969661c1d36da69640bb464f5e8b455c6dc0ad880aeb7edd9631b\": rpc error: code = NotFound desc = could not find container \"96b2faf2c6d969661c1d36da69640bb464f5e8b455c6dc0ad880aeb7edd9631b\": container with ID starting with 96b2faf2c6d969661c1d36da69640bb464f5e8b455c6dc0ad880aeb7edd9631b not found: ID does not exist" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.242698 4918 scope.go:117] "RemoveContainer" containerID="e263df42abda3a1af695f033f9759b1e6d27deeb988c1bdf164e6cc66fe785bd" Mar 19 17:03:40 crc kubenswrapper[4918]: E0319 17:03:40.250905 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e263df42abda3a1af695f033f9759b1e6d27deeb988c1bdf164e6cc66fe785bd\": container with ID starting with e263df42abda3a1af695f033f9759b1e6d27deeb988c1bdf164e6cc66fe785bd not found: ID does not exist" containerID="e263df42abda3a1af695f033f9759b1e6d27deeb988c1bdf164e6cc66fe785bd" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.250949 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e263df42abda3a1af695f033f9759b1e6d27deeb988c1bdf164e6cc66fe785bd"} err="failed to get container status \"e263df42abda3a1af695f033f9759b1e6d27deeb988c1bdf164e6cc66fe785bd\": rpc error: code = NotFound desc = could not find container \"e263df42abda3a1af695f033f9759b1e6d27deeb988c1bdf164e6cc66fe785bd\": container with ID 
starting with e263df42abda3a1af695f033f9759b1e6d27deeb988c1bdf164e6cc66fe785bd not found: ID does not exist" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.250972 4918 scope.go:117] "RemoveContainer" containerID="1f95ac2f8d659d5778fc3ed0566cf1732b0a27c48ffa7197c132a9d7580f80fb" Mar 19 17:03:40 crc kubenswrapper[4918]: E0319 17:03:40.252822 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f95ac2f8d659d5778fc3ed0566cf1732b0a27c48ffa7197c132a9d7580f80fb\": container with ID starting with 1f95ac2f8d659d5778fc3ed0566cf1732b0a27c48ffa7197c132a9d7580f80fb not found: ID does not exist" containerID="1f95ac2f8d659d5778fc3ed0566cf1732b0a27c48ffa7197c132a9d7580f80fb" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.252850 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f95ac2f8d659d5778fc3ed0566cf1732b0a27c48ffa7197c132a9d7580f80fb"} err="failed to get container status \"1f95ac2f8d659d5778fc3ed0566cf1732b0a27c48ffa7197c132a9d7580f80fb\": rpc error: code = NotFound desc = could not find container \"1f95ac2f8d659d5778fc3ed0566cf1732b0a27c48ffa7197c132a9d7580f80fb\": container with ID starting with 1f95ac2f8d659d5778fc3ed0566cf1732b0a27c48ffa7197c132a9d7580f80fb not found: ID does not exist" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.252867 4918 scope.go:117] "RemoveContainer" containerID="8283b0349263f00dfd0c1051342e78e13c1311b680db3a95885f5d2ffbb5c3d2" Mar 19 17:03:40 crc kubenswrapper[4918]: E0319 17:03:40.254106 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8283b0349263f00dfd0c1051342e78e13c1311b680db3a95885f5d2ffbb5c3d2\": container with ID starting with 8283b0349263f00dfd0c1051342e78e13c1311b680db3a95885f5d2ffbb5c3d2 not found: ID does not exist" containerID="8283b0349263f00dfd0c1051342e78e13c1311b680db3a95885f5d2ffbb5c3d2" Mar 19 
17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.254123 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8283b0349263f00dfd0c1051342e78e13c1311b680db3a95885f5d2ffbb5c3d2"} err="failed to get container status \"8283b0349263f00dfd0c1051342e78e13c1311b680db3a95885f5d2ffbb5c3d2\": rpc error: code = NotFound desc = could not find container \"8283b0349263f00dfd0c1051342e78e13c1311b680db3a95885f5d2ffbb5c3d2\": container with ID starting with 8283b0349263f00dfd0c1051342e78e13c1311b680db3a95885f5d2ffbb5c3d2 not found: ID does not exist" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.264291 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.279898 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:03:40 crc kubenswrapper[4918]: E0319 17:03:40.280339 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" containerName="sg-core" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.280350 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" containerName="sg-core" Mar 19 17:03:40 crc kubenswrapper[4918]: E0319 17:03:40.280389 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" containerName="ceilometer-notification-agent" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.280395 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" containerName="ceilometer-notification-agent" Mar 19 17:03:40 crc kubenswrapper[4918]: E0319 17:03:40.280402 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" containerName="proxy-httpd" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.280408 4918 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" containerName="proxy-httpd" Mar 19 17:03:40 crc kubenswrapper[4918]: E0319 17:03:40.280421 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" containerName="ceilometer-central-agent" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.280427 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" containerName="ceilometer-central-agent" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.280654 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" containerName="proxy-httpd" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.280673 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" containerName="ceilometer-central-agent" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.280686 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" containerName="ceilometer-notification-agent" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.280701 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" containerName="sg-core" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.282810 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.287336 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.287474 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.287623 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.293240 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.328097 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e695151-c2fe-4d41-9150-f10045d9dad1-log-httpd\") pod \"ceilometer-0\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " pod="openstack/ceilometer-0" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.328159 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " pod="openstack/ceilometer-0" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.328187 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-scripts\") pod \"ceilometer-0\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " pod="openstack/ceilometer-0" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.328225 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7sbl\" 
(UniqueName: \"kubernetes.io/projected/4e695151-c2fe-4d41-9150-f10045d9dad1-kube-api-access-r7sbl\") pod \"ceilometer-0\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " pod="openstack/ceilometer-0" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.328248 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " pod="openstack/ceilometer-0" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.328264 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " pod="openstack/ceilometer-0" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.328337 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-config-data\") pod \"ceilometer-0\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " pod="openstack/ceilometer-0" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.328353 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e695151-c2fe-4d41-9150-f10045d9dad1-run-httpd\") pod \"ceilometer-0\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " pod="openstack/ceilometer-0" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.430153 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-config-data\") pod \"ceilometer-0\" (UID: 
\"4e695151-c2fe-4d41-9150-f10045d9dad1\") " pod="openstack/ceilometer-0" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.430189 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e695151-c2fe-4d41-9150-f10045d9dad1-run-httpd\") pod \"ceilometer-0\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " pod="openstack/ceilometer-0" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.430234 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e695151-c2fe-4d41-9150-f10045d9dad1-log-httpd\") pod \"ceilometer-0\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " pod="openstack/ceilometer-0" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.430278 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " pod="openstack/ceilometer-0" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.430308 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-scripts\") pod \"ceilometer-0\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " pod="openstack/ceilometer-0" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.431761 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e695151-c2fe-4d41-9150-f10045d9dad1-log-httpd\") pod \"ceilometer-0\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " pod="openstack/ceilometer-0" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.431980 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4e695151-c2fe-4d41-9150-f10045d9dad1-run-httpd\") pod \"ceilometer-0\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " pod="openstack/ceilometer-0" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.432008 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7sbl\" (UniqueName: \"kubernetes.io/projected/4e695151-c2fe-4d41-9150-f10045d9dad1-kube-api-access-r7sbl\") pod \"ceilometer-0\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " pod="openstack/ceilometer-0" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.432050 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " pod="openstack/ceilometer-0" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.432072 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " pod="openstack/ceilometer-0" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.439410 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-scripts\") pod \"ceilometer-0\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " pod="openstack/ceilometer-0" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.439913 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " pod="openstack/ceilometer-0" Mar 19 17:03:40 crc 
kubenswrapper[4918]: I0319 17:03:40.440486 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-config-data\") pod \"ceilometer-0\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " pod="openstack/ceilometer-0" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.450538 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " pod="openstack/ceilometer-0" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.451230 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7sbl\" (UniqueName: \"kubernetes.io/projected/4e695151-c2fe-4d41-9150-f10045d9dad1-kube-api-access-r7sbl\") pod \"ceilometer-0\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " pod="openstack/ceilometer-0" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.456252 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " pod="openstack/ceilometer-0" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.602861 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1e607ec-e059-4abf-ad48-7e0a9a39dc3e" path="/var/lib/kubelet/pods/a1e607ec-e059-4abf-ad48-7e0a9a39dc3e/volumes" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.627474 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.629872 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-882pt" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.739216 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e859955-3304-49f6-a1e7-0f2d0eea96ca-utilities\") pod \"2e859955-3304-49f6-a1e7-0f2d0eea96ca\" (UID: \"2e859955-3304-49f6-a1e7-0f2d0eea96ca\") " Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.739380 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e859955-3304-49f6-a1e7-0f2d0eea96ca-catalog-content\") pod \"2e859955-3304-49f6-a1e7-0f2d0eea96ca\" (UID: \"2e859955-3304-49f6-a1e7-0f2d0eea96ca\") " Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.739662 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49ksz\" (UniqueName: \"kubernetes.io/projected/2e859955-3304-49f6-a1e7-0f2d0eea96ca-kube-api-access-49ksz\") pod \"2e859955-3304-49f6-a1e7-0f2d0eea96ca\" (UID: \"2e859955-3304-49f6-a1e7-0f2d0eea96ca\") " Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.740283 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e859955-3304-49f6-a1e7-0f2d0eea96ca-utilities" (OuterVolumeSpecName: "utilities") pod "2e859955-3304-49f6-a1e7-0f2d0eea96ca" (UID: "2e859955-3304-49f6-a1e7-0f2d0eea96ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.767171 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e859955-3304-49f6-a1e7-0f2d0eea96ca-kube-api-access-49ksz" (OuterVolumeSpecName: "kube-api-access-49ksz") pod "2e859955-3304-49f6-a1e7-0f2d0eea96ca" (UID: "2e859955-3304-49f6-a1e7-0f2d0eea96ca"). InnerVolumeSpecName "kube-api-access-49ksz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.767204 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e859955-3304-49f6-a1e7-0f2d0eea96ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2e859955-3304-49f6-a1e7-0f2d0eea96ca" (UID: "2e859955-3304-49f6-a1e7-0f2d0eea96ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.842323 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e859955-3304-49f6-a1e7-0f2d0eea96ca-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.842363 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e859955-3304-49f6-a1e7-0f2d0eea96ca-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.842380 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49ksz\" (UniqueName: \"kubernetes.io/projected/2e859955-3304-49f6-a1e7-0f2d0eea96ca-kube-api-access-49ksz\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.868088 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7mzv" event={"ID":"7e685daf-cafd-47fe-8640-34a654d4bb62","Type":"ContainerStarted","Data":"75a138c0efc6a2c43ebb30883b8bd687653a60750b7c2ba60caf38864c0a0dbf"} Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.875413 4918 generic.go:334] "Generic (PLEG): container finished" podID="2e859955-3304-49f6-a1e7-0f2d0eea96ca" containerID="b8cca7800568c0481ff45a44147286320c155b527f826878f72f78a8d4734516" exitCode=0 Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.875464 4918 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-882pt" event={"ID":"2e859955-3304-49f6-a1e7-0f2d0eea96ca","Type":"ContainerDied","Data":"b8cca7800568c0481ff45a44147286320c155b527f826878f72f78a8d4734516"} Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.875494 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-882pt" event={"ID":"2e859955-3304-49f6-a1e7-0f2d0eea96ca","Type":"ContainerDied","Data":"9ab2fa9556dfbe84863284aa453d9b4ba5c33b251ee0b8d1080f22ddc39f427c"} Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.875513 4918 scope.go:117] "RemoveContainer" containerID="b8cca7800568c0481ff45a44147286320c155b527f826878f72f78a8d4734516" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.876307 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-882pt" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.902965 4918 scope.go:117] "RemoveContainer" containerID="73cde1511e58b19f02b459b8de9dcb67e15d7e1793e952d87c9b7fe5c78d85aa" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.917391 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q7mzv" podStartSLOduration=3.621826731 podStartE2EDuration="10.917373815s" podCreationTimestamp="2026-03-19 17:03:30 +0000 UTC" firstStartedPulling="2026-03-19 17:03:32.732500699 +0000 UTC m=+1424.854699947" lastFinishedPulling="2026-03-19 17:03:40.028047783 +0000 UTC m=+1432.150247031" observedRunningTime="2026-03-19 17:03:40.89229752 +0000 UTC m=+1433.014496768" watchObservedRunningTime="2026-03-19 17:03:40.917373815 +0000 UTC m=+1433.039573063" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.929377 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-882pt"] Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.933264 4918 scope.go:117] "RemoveContainer" 
containerID="43869c142ac051046287a47ff92da5970545b11fc585e7c1eccae513d1c3bee1" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.939108 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q7mzv" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.939304 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q7mzv" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.941946 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-882pt"] Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.969816 4918 scope.go:117] "RemoveContainer" containerID="b8cca7800568c0481ff45a44147286320c155b527f826878f72f78a8d4734516" Mar 19 17:03:40 crc kubenswrapper[4918]: E0319 17:03:40.970427 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8cca7800568c0481ff45a44147286320c155b527f826878f72f78a8d4734516\": container with ID starting with b8cca7800568c0481ff45a44147286320c155b527f826878f72f78a8d4734516 not found: ID does not exist" containerID="b8cca7800568c0481ff45a44147286320c155b527f826878f72f78a8d4734516" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.970482 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8cca7800568c0481ff45a44147286320c155b527f826878f72f78a8d4734516"} err="failed to get container status \"b8cca7800568c0481ff45a44147286320c155b527f826878f72f78a8d4734516\": rpc error: code = NotFound desc = could not find container \"b8cca7800568c0481ff45a44147286320c155b527f826878f72f78a8d4734516\": container with ID starting with b8cca7800568c0481ff45a44147286320c155b527f826878f72f78a8d4734516 not found: ID does not exist" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.970514 4918 scope.go:117] "RemoveContainer" 
containerID="73cde1511e58b19f02b459b8de9dcb67e15d7e1793e952d87c9b7fe5c78d85aa" Mar 19 17:03:40 crc kubenswrapper[4918]: E0319 17:03:40.971197 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73cde1511e58b19f02b459b8de9dcb67e15d7e1793e952d87c9b7fe5c78d85aa\": container with ID starting with 73cde1511e58b19f02b459b8de9dcb67e15d7e1793e952d87c9b7fe5c78d85aa not found: ID does not exist" containerID="73cde1511e58b19f02b459b8de9dcb67e15d7e1793e952d87c9b7fe5c78d85aa" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.971229 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73cde1511e58b19f02b459b8de9dcb67e15d7e1793e952d87c9b7fe5c78d85aa"} err="failed to get container status \"73cde1511e58b19f02b459b8de9dcb67e15d7e1793e952d87c9b7fe5c78d85aa\": rpc error: code = NotFound desc = could not find container \"73cde1511e58b19f02b459b8de9dcb67e15d7e1793e952d87c9b7fe5c78d85aa\": container with ID starting with 73cde1511e58b19f02b459b8de9dcb67e15d7e1793e952d87c9b7fe5c78d85aa not found: ID does not exist" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.971258 4918 scope.go:117] "RemoveContainer" containerID="43869c142ac051046287a47ff92da5970545b11fc585e7c1eccae513d1c3bee1" Mar 19 17:03:40 crc kubenswrapper[4918]: E0319 17:03:40.971872 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43869c142ac051046287a47ff92da5970545b11fc585e7c1eccae513d1c3bee1\": container with ID starting with 43869c142ac051046287a47ff92da5970545b11fc585e7c1eccae513d1c3bee1 not found: ID does not exist" containerID="43869c142ac051046287a47ff92da5970545b11fc585e7c1eccae513d1c3bee1" Mar 19 17:03:40 crc kubenswrapper[4918]: I0319 17:03:40.971927 4918 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"43869c142ac051046287a47ff92da5970545b11fc585e7c1eccae513d1c3bee1"} err="failed to get container status \"43869c142ac051046287a47ff92da5970545b11fc585e7c1eccae513d1c3bee1\": rpc error: code = NotFound desc = could not find container \"43869c142ac051046287a47ff92da5970545b11fc585e7c1eccae513d1c3bee1\": container with ID starting with 43869c142ac051046287a47ff92da5970545b11fc585e7c1eccae513d1c3bee1 not found: ID does not exist" Mar 19 17:03:41 crc kubenswrapper[4918]: I0319 17:03:41.082313 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 19 17:03:41 crc kubenswrapper[4918]: I0319 17:03:41.118732 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 19 17:03:41 crc kubenswrapper[4918]: I0319 17:03:41.138841 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:03:41 crc kubenswrapper[4918]: I0319 17:03:41.888131 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e695151-c2fe-4d41-9150-f10045d9dad1","Type":"ContainerStarted","Data":"4455aeee863739c688f670726638409bdfe23d3ae4ed79b19e496e5249f4a7b4"} Mar 19 17:03:41 crc kubenswrapper[4918]: I0319 17:03:41.932594 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 19 17:03:41 crc kubenswrapper[4918]: I0319 17:03:41.994019 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-q7mzv" podUID="7e685daf-cafd-47fe-8640-34a654d4bb62" containerName="registry-server" probeResult="failure" output=< Mar 19 17:03:41 crc kubenswrapper[4918]: timeout: failed to connect service ":50051" within 1s Mar 19 17:03:41 crc kubenswrapper[4918]: > Mar 19 17:03:42 crc kubenswrapper[4918]: I0319 17:03:42.598905 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="2e859955-3304-49f6-a1e7-0f2d0eea96ca" path="/var/lib/kubelet/pods/2e859955-3304-49f6-a1e7-0f2d0eea96ca/volumes" Mar 19 17:03:42 crc kubenswrapper[4918]: I0319 17:03:42.898715 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e695151-c2fe-4d41-9150-f10045d9dad1","Type":"ContainerStarted","Data":"f4b7be49033d5a557f443f579013fb1218dfd44d172fbdc7a61c04810e4d061b"} Mar 19 17:03:43 crc kubenswrapper[4918]: I0319 17:03:43.182969 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 17:03:43 crc kubenswrapper[4918]: I0319 17:03:43.183006 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 17:03:43 crc kubenswrapper[4918]: I0319 17:03:43.915472 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e695151-c2fe-4d41-9150-f10045d9dad1","Type":"ContainerStarted","Data":"837bc1327efc27c5d4d9f074535687a91b04ea85ce9955958c2e28215a465d32"} Mar 19 17:03:44 crc kubenswrapper[4918]: I0319 17:03:44.196396 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 17:03:44 crc kubenswrapper[4918]: I0319 17:03:44.196472 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 17:03:44 crc kubenswrapper[4918]: I0319 17:03:44.198787 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2491f737-beec-4148-b143-1c83527b477a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.239:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 17:03:44 crc kubenswrapper[4918]: I0319 17:03:44.198830 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2491f737-beec-4148-b143-1c83527b477a" containerName="nova-metadata-metadata" 
probeResult="failure" output="Get \"https://10.217.0.239:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 17:03:45 crc kubenswrapper[4918]: I0319 17:03:45.208726 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c14a1ce8-e827-4652-9a21-43d9cbcbac47" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.240:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 17:03:45 crc kubenswrapper[4918]: I0319 17:03:45.208749 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c14a1ce8-e827-4652-9a21-43d9cbcbac47" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.240:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 17:03:45 crc kubenswrapper[4918]: I0319 17:03:45.941277 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e695151-c2fe-4d41-9150-f10045d9dad1","Type":"ContainerStarted","Data":"9f50579f6c2ef75f8ddaf4528701fa21415fd97ac36135bfcc0318a7649be29f"} Mar 19 17:03:47 crc kubenswrapper[4918]: I0319 17:03:47.988654 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e695151-c2fe-4d41-9150-f10045d9dad1","Type":"ContainerStarted","Data":"13802e6e1b3dd901b9b43dce2fe2c79fdcf8e191b786d9bd92e7fbcdb3a05ed6"} Mar 19 17:03:47 crc kubenswrapper[4918]: I0319 17:03:47.989178 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 17:03:48 crc kubenswrapper[4918]: I0319 17:03:48.026165 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.038054818 podStartE2EDuration="8.026142517s" podCreationTimestamp="2026-03-19 17:03:40 +0000 UTC" firstStartedPulling="2026-03-19 17:03:41.153964318 +0000 UTC m=+1433.276163566" 
lastFinishedPulling="2026-03-19 17:03:47.142052017 +0000 UTC m=+1439.264251265" observedRunningTime="2026-03-19 17:03:48.010859729 +0000 UTC m=+1440.133058987" watchObservedRunningTime="2026-03-19 17:03:48.026142517 +0000 UTC m=+1440.148341765" Mar 19 17:03:51 crc kubenswrapper[4918]: I0319 17:03:51.006283 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q7mzv" Mar 19 17:03:51 crc kubenswrapper[4918]: I0319 17:03:51.077236 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q7mzv" Mar 19 17:03:51 crc kubenswrapper[4918]: I0319 17:03:51.181994 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 17:03:51 crc kubenswrapper[4918]: I0319 17:03:51.182051 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 17:03:51 crc kubenswrapper[4918]: I0319 17:03:51.251237 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q7mzv"] Mar 19 17:03:52 crc kubenswrapper[4918]: I0319 17:03:52.036584 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q7mzv" podUID="7e685daf-cafd-47fe-8640-34a654d4bb62" containerName="registry-server" containerID="cri-o://75a138c0efc6a2c43ebb30883b8bd687653a60750b7c2ba60caf38864c0a0dbf" gracePeriod=2 Mar 19 17:03:52 crc kubenswrapper[4918]: I0319 17:03:52.195714 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 17:03:52 crc kubenswrapper[4918]: I0319 17:03:52.195781 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 17:03:52 crc kubenswrapper[4918]: I0319 17:03:52.916036 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q7mzv" Mar 19 17:03:53 crc kubenswrapper[4918]: I0319 17:03:53.015566 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e685daf-cafd-47fe-8640-34a654d4bb62-utilities\") pod \"7e685daf-cafd-47fe-8640-34a654d4bb62\" (UID: \"7e685daf-cafd-47fe-8640-34a654d4bb62\") " Mar 19 17:03:53 crc kubenswrapper[4918]: I0319 17:03:53.015863 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjhtn\" (UniqueName: \"kubernetes.io/projected/7e685daf-cafd-47fe-8640-34a654d4bb62-kube-api-access-hjhtn\") pod \"7e685daf-cafd-47fe-8640-34a654d4bb62\" (UID: \"7e685daf-cafd-47fe-8640-34a654d4bb62\") " Mar 19 17:03:53 crc kubenswrapper[4918]: I0319 17:03:53.015922 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e685daf-cafd-47fe-8640-34a654d4bb62-catalog-content\") pod \"7e685daf-cafd-47fe-8640-34a654d4bb62\" (UID: \"7e685daf-cafd-47fe-8640-34a654d4bb62\") " Mar 19 17:03:53 crc kubenswrapper[4918]: I0319 17:03:53.016590 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e685daf-cafd-47fe-8640-34a654d4bb62-utilities" (OuterVolumeSpecName: "utilities") pod "7e685daf-cafd-47fe-8640-34a654d4bb62" (UID: "7e685daf-cafd-47fe-8640-34a654d4bb62"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:03:53 crc kubenswrapper[4918]: I0319 17:03:53.017692 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e685daf-cafd-47fe-8640-34a654d4bb62-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:53 crc kubenswrapper[4918]: I0319 17:03:53.030804 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e685daf-cafd-47fe-8640-34a654d4bb62-kube-api-access-hjhtn" (OuterVolumeSpecName: "kube-api-access-hjhtn") pod "7e685daf-cafd-47fe-8640-34a654d4bb62" (UID: "7e685daf-cafd-47fe-8640-34a654d4bb62"). InnerVolumeSpecName "kube-api-access-hjhtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:03:53 crc kubenswrapper[4918]: I0319 17:03:53.055406 4918 generic.go:334] "Generic (PLEG): container finished" podID="7e685daf-cafd-47fe-8640-34a654d4bb62" containerID="75a138c0efc6a2c43ebb30883b8bd687653a60750b7c2ba60caf38864c0a0dbf" exitCode=0 Mar 19 17:03:53 crc kubenswrapper[4918]: I0319 17:03:53.056542 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7mzv" event={"ID":"7e685daf-cafd-47fe-8640-34a654d4bb62","Type":"ContainerDied","Data":"75a138c0efc6a2c43ebb30883b8bd687653a60750b7c2ba60caf38864c0a0dbf"} Mar 19 17:03:53 crc kubenswrapper[4918]: I0319 17:03:53.056628 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7mzv" event={"ID":"7e685daf-cafd-47fe-8640-34a654d4bb62","Type":"ContainerDied","Data":"1902f2e6102d5ed4085aec38e2a257b90967565d5f9ea6fa5c5a0618108777db"} Mar 19 17:03:53 crc kubenswrapper[4918]: I0319 17:03:53.056745 4918 scope.go:117] "RemoveContainer" containerID="75a138c0efc6a2c43ebb30883b8bd687653a60750b7c2ba60caf38864c0a0dbf" Mar 19 17:03:53 crc kubenswrapper[4918]: I0319 17:03:53.056985 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q7mzv" Mar 19 17:03:53 crc kubenswrapper[4918]: I0319 17:03:53.078116 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e685daf-cafd-47fe-8640-34a654d4bb62-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e685daf-cafd-47fe-8640-34a654d4bb62" (UID: "7e685daf-cafd-47fe-8640-34a654d4bb62"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:03:53 crc kubenswrapper[4918]: I0319 17:03:53.105909 4918 scope.go:117] "RemoveContainer" containerID="275e5bdf4abf8a7d861a987ad9ee70a95d3c81f9a2f989c0de47da32f9606059" Mar 19 17:03:53 crc kubenswrapper[4918]: I0319 17:03:53.119871 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjhtn\" (UniqueName: \"kubernetes.io/projected/7e685daf-cafd-47fe-8640-34a654d4bb62-kube-api-access-hjhtn\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:53 crc kubenswrapper[4918]: I0319 17:03:53.119915 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e685daf-cafd-47fe-8640-34a654d4bb62-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 17:03:53 crc kubenswrapper[4918]: I0319 17:03:53.131567 4918 scope.go:117] "RemoveContainer" containerID="3c673fb9bcd1a7fa6d7a90989dbd00542016118e90dbbc46e7a2b2e5a269ee22" Mar 19 17:03:53 crc kubenswrapper[4918]: I0319 17:03:53.180007 4918 scope.go:117] "RemoveContainer" containerID="75a138c0efc6a2c43ebb30883b8bd687653a60750b7c2ba60caf38864c0a0dbf" Mar 19 17:03:53 crc kubenswrapper[4918]: E0319 17:03:53.180741 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75a138c0efc6a2c43ebb30883b8bd687653a60750b7c2ba60caf38864c0a0dbf\": container with ID starting with 75a138c0efc6a2c43ebb30883b8bd687653a60750b7c2ba60caf38864c0a0dbf not found: ID does not exist" 
containerID="75a138c0efc6a2c43ebb30883b8bd687653a60750b7c2ba60caf38864c0a0dbf" Mar 19 17:03:53 crc kubenswrapper[4918]: I0319 17:03:53.180876 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75a138c0efc6a2c43ebb30883b8bd687653a60750b7c2ba60caf38864c0a0dbf"} err="failed to get container status \"75a138c0efc6a2c43ebb30883b8bd687653a60750b7c2ba60caf38864c0a0dbf\": rpc error: code = NotFound desc = could not find container \"75a138c0efc6a2c43ebb30883b8bd687653a60750b7c2ba60caf38864c0a0dbf\": container with ID starting with 75a138c0efc6a2c43ebb30883b8bd687653a60750b7c2ba60caf38864c0a0dbf not found: ID does not exist" Mar 19 17:03:53 crc kubenswrapper[4918]: I0319 17:03:53.180972 4918 scope.go:117] "RemoveContainer" containerID="275e5bdf4abf8a7d861a987ad9ee70a95d3c81f9a2f989c0de47da32f9606059" Mar 19 17:03:53 crc kubenswrapper[4918]: E0319 17:03:53.181562 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"275e5bdf4abf8a7d861a987ad9ee70a95d3c81f9a2f989c0de47da32f9606059\": container with ID starting with 275e5bdf4abf8a7d861a987ad9ee70a95d3c81f9a2f989c0de47da32f9606059 not found: ID does not exist" containerID="275e5bdf4abf8a7d861a987ad9ee70a95d3c81f9a2f989c0de47da32f9606059" Mar 19 17:03:53 crc kubenswrapper[4918]: I0319 17:03:53.181617 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"275e5bdf4abf8a7d861a987ad9ee70a95d3c81f9a2f989c0de47da32f9606059"} err="failed to get container status \"275e5bdf4abf8a7d861a987ad9ee70a95d3c81f9a2f989c0de47da32f9606059\": rpc error: code = NotFound desc = could not find container \"275e5bdf4abf8a7d861a987ad9ee70a95d3c81f9a2f989c0de47da32f9606059\": container with ID starting with 275e5bdf4abf8a7d861a987ad9ee70a95d3c81f9a2f989c0de47da32f9606059 not found: ID does not exist" Mar 19 17:03:53 crc kubenswrapper[4918]: I0319 17:03:53.181654 4918 scope.go:117] 
"RemoveContainer" containerID="3c673fb9bcd1a7fa6d7a90989dbd00542016118e90dbbc46e7a2b2e5a269ee22" Mar 19 17:03:53 crc kubenswrapper[4918]: E0319 17:03:53.182093 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c673fb9bcd1a7fa6d7a90989dbd00542016118e90dbbc46e7a2b2e5a269ee22\": container with ID starting with 3c673fb9bcd1a7fa6d7a90989dbd00542016118e90dbbc46e7a2b2e5a269ee22 not found: ID does not exist" containerID="3c673fb9bcd1a7fa6d7a90989dbd00542016118e90dbbc46e7a2b2e5a269ee22" Mar 19 17:03:53 crc kubenswrapper[4918]: I0319 17:03:53.182150 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c673fb9bcd1a7fa6d7a90989dbd00542016118e90dbbc46e7a2b2e5a269ee22"} err="failed to get container status \"3c673fb9bcd1a7fa6d7a90989dbd00542016118e90dbbc46e7a2b2e5a269ee22\": rpc error: code = NotFound desc = could not find container \"3c673fb9bcd1a7fa6d7a90989dbd00542016118e90dbbc46e7a2b2e5a269ee22\": container with ID starting with 3c673fb9bcd1a7fa6d7a90989dbd00542016118e90dbbc46e7a2b2e5a269ee22 not found: ID does not exist" Mar 19 17:03:53 crc kubenswrapper[4918]: I0319 17:03:53.187219 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 19 17:03:53 crc kubenswrapper[4918]: I0319 17:03:53.194417 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 19 17:03:53 crc kubenswrapper[4918]: I0319 17:03:53.228642 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 19 17:03:53 crc kubenswrapper[4918]: I0319 17:03:53.395902 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q7mzv"] Mar 19 17:03:53 crc kubenswrapper[4918]: I0319 17:03:53.407775 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-q7mzv"] Mar 19 17:03:54 crc kubenswrapper[4918]: I0319 17:03:54.082156 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 19 17:03:54 crc kubenswrapper[4918]: I0319 17:03:54.436269 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 19 17:03:54 crc kubenswrapper[4918]: I0319 17:03:54.462163 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 19 17:03:54 crc kubenswrapper[4918]: I0319 17:03:54.474918 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 19 17:03:54 crc kubenswrapper[4918]: I0319 17:03:54.600537 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e685daf-cafd-47fe-8640-34a654d4bb62" path="/var/lib/kubelet/pods/7e685daf-cafd-47fe-8640-34a654d4bb62/volumes" Mar 19 17:03:55 crc kubenswrapper[4918]: I0319 17:03:55.084777 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 19 17:04:00 crc kubenswrapper[4918]: I0319 17:04:00.152146 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565664-rrn6b"] Mar 19 17:04:00 crc kubenswrapper[4918]: E0319 17:04:00.153342 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e859955-3304-49f6-a1e7-0f2d0eea96ca" containerName="registry-server" Mar 19 17:04:00 crc kubenswrapper[4918]: I0319 17:04:00.153361 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e859955-3304-49f6-a1e7-0f2d0eea96ca" containerName="registry-server" Mar 19 17:04:00 crc kubenswrapper[4918]: E0319 17:04:00.153375 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e859955-3304-49f6-a1e7-0f2d0eea96ca" containerName="extract-utilities" Mar 19 17:04:00 crc kubenswrapper[4918]: I0319 17:04:00.153384 4918 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="2e859955-3304-49f6-a1e7-0f2d0eea96ca" containerName="extract-utilities" Mar 19 17:04:00 crc kubenswrapper[4918]: E0319 17:04:00.153408 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e685daf-cafd-47fe-8640-34a654d4bb62" containerName="extract-content" Mar 19 17:04:00 crc kubenswrapper[4918]: I0319 17:04:00.153418 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e685daf-cafd-47fe-8640-34a654d4bb62" containerName="extract-content" Mar 19 17:04:00 crc kubenswrapper[4918]: E0319 17:04:00.153445 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e685daf-cafd-47fe-8640-34a654d4bb62" containerName="registry-server" Mar 19 17:04:00 crc kubenswrapper[4918]: I0319 17:04:00.153453 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e685daf-cafd-47fe-8640-34a654d4bb62" containerName="registry-server" Mar 19 17:04:00 crc kubenswrapper[4918]: E0319 17:04:00.153470 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e685daf-cafd-47fe-8640-34a654d4bb62" containerName="extract-utilities" Mar 19 17:04:00 crc kubenswrapper[4918]: I0319 17:04:00.153478 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e685daf-cafd-47fe-8640-34a654d4bb62" containerName="extract-utilities" Mar 19 17:04:00 crc kubenswrapper[4918]: E0319 17:04:00.153491 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e859955-3304-49f6-a1e7-0f2d0eea96ca" containerName="extract-content" Mar 19 17:04:00 crc kubenswrapper[4918]: I0319 17:04:00.153499 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e859955-3304-49f6-a1e7-0f2d0eea96ca" containerName="extract-content" Mar 19 17:04:00 crc kubenswrapper[4918]: I0319 17:04:00.153795 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e685daf-cafd-47fe-8640-34a654d4bb62" containerName="registry-server" Mar 19 17:04:00 crc kubenswrapper[4918]: I0319 17:04:00.153820 4918 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="2e859955-3304-49f6-a1e7-0f2d0eea96ca" containerName="registry-server" Mar 19 17:04:00 crc kubenswrapper[4918]: I0319 17:04:00.154790 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565664-rrn6b" Mar 19 17:04:00 crc kubenswrapper[4918]: I0319 17:04:00.156610 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:04:00 crc kubenswrapper[4918]: I0319 17:04:00.156806 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:04:00 crc kubenswrapper[4918]: I0319 17:04:00.157629 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:04:00 crc kubenswrapper[4918]: I0319 17:04:00.167174 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565664-rrn6b"] Mar 19 17:04:00 crc kubenswrapper[4918]: I0319 17:04:00.283583 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcnqf\" (UniqueName: \"kubernetes.io/projected/db2a848b-d81b-467c-b100-159fc77dd610-kube-api-access-dcnqf\") pod \"auto-csr-approver-29565664-rrn6b\" (UID: \"db2a848b-d81b-467c-b100-159fc77dd610\") " pod="openshift-infra/auto-csr-approver-29565664-rrn6b" Mar 19 17:04:00 crc kubenswrapper[4918]: I0319 17:04:00.385221 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcnqf\" (UniqueName: \"kubernetes.io/projected/db2a848b-d81b-467c-b100-159fc77dd610-kube-api-access-dcnqf\") pod \"auto-csr-approver-29565664-rrn6b\" (UID: \"db2a848b-d81b-467c-b100-159fc77dd610\") " pod="openshift-infra/auto-csr-approver-29565664-rrn6b" Mar 19 17:04:00 crc kubenswrapper[4918]: I0319 17:04:00.418897 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dcnqf\" (UniqueName: \"kubernetes.io/projected/db2a848b-d81b-467c-b100-159fc77dd610-kube-api-access-dcnqf\") pod \"auto-csr-approver-29565664-rrn6b\" (UID: \"db2a848b-d81b-467c-b100-159fc77dd610\") " pod="openshift-infra/auto-csr-approver-29565664-rrn6b" Mar 19 17:04:00 crc kubenswrapper[4918]: I0319 17:04:00.484506 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565664-rrn6b" Mar 19 17:04:00 crc kubenswrapper[4918]: I0319 17:04:00.964041 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565664-rrn6b"] Mar 19 17:04:00 crc kubenswrapper[4918]: W0319 17:04:00.972701 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb2a848b_d81b_467c_b100_159fc77dd610.slice/crio-124d30bd3f254f0e8838b16ca19bc7f0b7fa9726de3a3d8db60acb4d3ba3da52 WatchSource:0}: Error finding container 124d30bd3f254f0e8838b16ca19bc7f0b7fa9726de3a3d8db60acb4d3ba3da52: Status 404 returned error can't find the container with id 124d30bd3f254f0e8838b16ca19bc7f0b7fa9726de3a3d8db60acb4d3ba3da52 Mar 19 17:04:01 crc kubenswrapper[4918]: I0319 17:04:01.331022 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565664-rrn6b" event={"ID":"db2a848b-d81b-467c-b100-159fc77dd610","Type":"ContainerStarted","Data":"124d30bd3f254f0e8838b16ca19bc7f0b7fa9726de3a3d8db60acb4d3ba3da52"} Mar 19 17:04:02 crc kubenswrapper[4918]: I0319 17:04:02.420740 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565664-rrn6b" event={"ID":"db2a848b-d81b-467c-b100-159fc77dd610","Type":"ContainerStarted","Data":"a29059ded9ff42bd4ad3193cd68efe1dcc3b1c77c2a12e1f44bb92c3748cc7e1"} Mar 19 17:04:05 crc kubenswrapper[4918]: I0319 17:04:05.471553 4918 generic.go:334] "Generic (PLEG): container finished" podID="db2a848b-d81b-467c-b100-159fc77dd610" 
containerID="a29059ded9ff42bd4ad3193cd68efe1dcc3b1c77c2a12e1f44bb92c3748cc7e1" exitCode=0 Mar 19 17:04:05 crc kubenswrapper[4918]: I0319 17:04:05.471639 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565664-rrn6b" event={"ID":"db2a848b-d81b-467c-b100-159fc77dd610","Type":"ContainerDied","Data":"a29059ded9ff42bd4ad3193cd68efe1dcc3b1c77c2a12e1f44bb92c3748cc7e1"} Mar 19 17:04:07 crc kubenswrapper[4918]: I0319 17:04:07.289157 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565664-rrn6b" Mar 19 17:04:07 crc kubenswrapper[4918]: I0319 17:04:07.460364 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcnqf\" (UniqueName: \"kubernetes.io/projected/db2a848b-d81b-467c-b100-159fc77dd610-kube-api-access-dcnqf\") pod \"db2a848b-d81b-467c-b100-159fc77dd610\" (UID: \"db2a848b-d81b-467c-b100-159fc77dd610\") " Mar 19 17:04:07 crc kubenswrapper[4918]: I0319 17:04:07.466688 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db2a848b-d81b-467c-b100-159fc77dd610-kube-api-access-dcnqf" (OuterVolumeSpecName: "kube-api-access-dcnqf") pod "db2a848b-d81b-467c-b100-159fc77dd610" (UID: "db2a848b-d81b-467c-b100-159fc77dd610"). InnerVolumeSpecName "kube-api-access-dcnqf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:04:07 crc kubenswrapper[4918]: I0319 17:04:07.508354 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565664-rrn6b" event={"ID":"db2a848b-d81b-467c-b100-159fc77dd610","Type":"ContainerDied","Data":"124d30bd3f254f0e8838b16ca19bc7f0b7fa9726de3a3d8db60acb4d3ba3da52"} Mar 19 17:04:07 crc kubenswrapper[4918]: I0319 17:04:07.508423 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="124d30bd3f254f0e8838b16ca19bc7f0b7fa9726de3a3d8db60acb4d3ba3da52" Mar 19 17:04:07 crc kubenswrapper[4918]: I0319 17:04:07.508507 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565664-rrn6b" Mar 19 17:04:07 crc kubenswrapper[4918]: I0319 17:04:07.563505 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcnqf\" (UniqueName: \"kubernetes.io/projected/db2a848b-d81b-467c-b100-159fc77dd610-kube-api-access-dcnqf\") on node \"crc\" DevicePath \"\"" Mar 19 17:04:07 crc kubenswrapper[4918]: I0319 17:04:07.566197 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565658-jggqm"] Mar 19 17:04:07 crc kubenswrapper[4918]: I0319 17:04:07.577091 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565658-jggqm"] Mar 19 17:04:08 crc kubenswrapper[4918]: I0319 17:04:08.599370 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c59e2850-c18c-4082-bb7c-22509bd97ec2" path="/var/lib/kubelet/pods/c59e2850-c18c-4082-bb7c-22509bd97ec2/volumes" Mar 19 17:04:10 crc kubenswrapper[4918]: I0319 17:04:10.654849 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 19 17:04:21 crc kubenswrapper[4918]: I0319 17:04:21.917174 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cloudkitty-db-sync-d9qg5"] Mar 19 17:04:21 crc kubenswrapper[4918]: I0319 17:04:21.926625 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-d9qg5"] Mar 19 17:04:22 crc kubenswrapper[4918]: I0319 17:04:22.046832 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-cm55x"] Mar 19 17:04:22 crc kubenswrapper[4918]: E0319 17:04:22.047301 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db2a848b-d81b-467c-b100-159fc77dd610" containerName="oc" Mar 19 17:04:22 crc kubenswrapper[4918]: I0319 17:04:22.047318 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="db2a848b-d81b-467c-b100-159fc77dd610" containerName="oc" Mar 19 17:04:22 crc kubenswrapper[4918]: I0319 17:04:22.047551 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="db2a848b-d81b-467c-b100-159fc77dd610" containerName="oc" Mar 19 17:04:22 crc kubenswrapper[4918]: I0319 17:04:22.048269 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-cm55x" Mar 19 17:04:22 crc kubenswrapper[4918]: I0319 17:04:22.051079 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 19 17:04:22 crc kubenswrapper[4918]: I0319 17:04:22.066256 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-cm55x"] Mar 19 17:04:22 crc kubenswrapper[4918]: I0319 17:04:22.188538 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8fj4\" (UniqueName: \"kubernetes.io/projected/79829597-ef66-4d6f-946f-adaa9ec3d227-kube-api-access-x8fj4\") pod \"cloudkitty-db-sync-cm55x\" (UID: \"79829597-ef66-4d6f-946f-adaa9ec3d227\") " pod="openstack/cloudkitty-db-sync-cm55x" Mar 19 17:04:22 crc kubenswrapper[4918]: I0319 17:04:22.188593 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79829597-ef66-4d6f-946f-adaa9ec3d227-config-data\") pod \"cloudkitty-db-sync-cm55x\" (UID: \"79829597-ef66-4d6f-946f-adaa9ec3d227\") " pod="openstack/cloudkitty-db-sync-cm55x" Mar 19 17:04:22 crc kubenswrapper[4918]: I0319 17:04:22.188769 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79829597-ef66-4d6f-946f-adaa9ec3d227-combined-ca-bundle\") pod \"cloudkitty-db-sync-cm55x\" (UID: \"79829597-ef66-4d6f-946f-adaa9ec3d227\") " pod="openstack/cloudkitty-db-sync-cm55x" Mar 19 17:04:22 crc kubenswrapper[4918]: I0319 17:04:22.188797 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79829597-ef66-4d6f-946f-adaa9ec3d227-scripts\") pod \"cloudkitty-db-sync-cm55x\" (UID: \"79829597-ef66-4d6f-946f-adaa9ec3d227\") " pod="openstack/cloudkitty-db-sync-cm55x" Mar 19 17:04:22 crc 
kubenswrapper[4918]: I0319 17:04:22.188825 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/79829597-ef66-4d6f-946f-adaa9ec3d227-certs\") pod \"cloudkitty-db-sync-cm55x\" (UID: \"79829597-ef66-4d6f-946f-adaa9ec3d227\") " pod="openstack/cloudkitty-db-sync-cm55x" Mar 19 17:04:22 crc kubenswrapper[4918]: I0319 17:04:22.290604 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79829597-ef66-4d6f-946f-adaa9ec3d227-combined-ca-bundle\") pod \"cloudkitty-db-sync-cm55x\" (UID: \"79829597-ef66-4d6f-946f-adaa9ec3d227\") " pod="openstack/cloudkitty-db-sync-cm55x" Mar 19 17:04:22 crc kubenswrapper[4918]: I0319 17:04:22.290651 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79829597-ef66-4d6f-946f-adaa9ec3d227-scripts\") pod \"cloudkitty-db-sync-cm55x\" (UID: \"79829597-ef66-4d6f-946f-adaa9ec3d227\") " pod="openstack/cloudkitty-db-sync-cm55x" Mar 19 17:04:22 crc kubenswrapper[4918]: I0319 17:04:22.290693 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/79829597-ef66-4d6f-946f-adaa9ec3d227-certs\") pod \"cloudkitty-db-sync-cm55x\" (UID: \"79829597-ef66-4d6f-946f-adaa9ec3d227\") " pod="openstack/cloudkitty-db-sync-cm55x" Mar 19 17:04:22 crc kubenswrapper[4918]: I0319 17:04:22.290784 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8fj4\" (UniqueName: \"kubernetes.io/projected/79829597-ef66-4d6f-946f-adaa9ec3d227-kube-api-access-x8fj4\") pod \"cloudkitty-db-sync-cm55x\" (UID: \"79829597-ef66-4d6f-946f-adaa9ec3d227\") " pod="openstack/cloudkitty-db-sync-cm55x" Mar 19 17:04:22 crc kubenswrapper[4918]: I0319 17:04:22.290811 4918 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79829597-ef66-4d6f-946f-adaa9ec3d227-config-data\") pod \"cloudkitty-db-sync-cm55x\" (UID: \"79829597-ef66-4d6f-946f-adaa9ec3d227\") " pod="openstack/cloudkitty-db-sync-cm55x" Mar 19 17:04:22 crc kubenswrapper[4918]: I0319 17:04:22.298816 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79829597-ef66-4d6f-946f-adaa9ec3d227-config-data\") pod \"cloudkitty-db-sync-cm55x\" (UID: \"79829597-ef66-4d6f-946f-adaa9ec3d227\") " pod="openstack/cloudkitty-db-sync-cm55x" Mar 19 17:04:22 crc kubenswrapper[4918]: I0319 17:04:22.302638 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79829597-ef66-4d6f-946f-adaa9ec3d227-scripts\") pod \"cloudkitty-db-sync-cm55x\" (UID: \"79829597-ef66-4d6f-946f-adaa9ec3d227\") " pod="openstack/cloudkitty-db-sync-cm55x" Mar 19 17:04:22 crc kubenswrapper[4918]: I0319 17:04:22.302788 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79829597-ef66-4d6f-946f-adaa9ec3d227-combined-ca-bundle\") pod \"cloudkitty-db-sync-cm55x\" (UID: \"79829597-ef66-4d6f-946f-adaa9ec3d227\") " pod="openstack/cloudkitty-db-sync-cm55x" Mar 19 17:04:22 crc kubenswrapper[4918]: I0319 17:04:22.312635 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8fj4\" (UniqueName: \"kubernetes.io/projected/79829597-ef66-4d6f-946f-adaa9ec3d227-kube-api-access-x8fj4\") pod \"cloudkitty-db-sync-cm55x\" (UID: \"79829597-ef66-4d6f-946f-adaa9ec3d227\") " pod="openstack/cloudkitty-db-sync-cm55x" Mar 19 17:04:22 crc kubenswrapper[4918]: I0319 17:04:22.313471 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/79829597-ef66-4d6f-946f-adaa9ec3d227-certs\") pod \"cloudkitty-db-sync-cm55x\" 
(UID: \"79829597-ef66-4d6f-946f-adaa9ec3d227\") " pod="openstack/cloudkitty-db-sync-cm55x" Mar 19 17:04:22 crc kubenswrapper[4918]: I0319 17:04:22.365587 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-cm55x" Mar 19 17:04:22 crc kubenswrapper[4918]: I0319 17:04:22.618770 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5df5afd-edbf-49fd-b9b8-35aa33fb5d25" path="/var/lib/kubelet/pods/a5df5afd-edbf-49fd-b9b8-35aa33fb5d25/volumes" Mar 19 17:04:22 crc kubenswrapper[4918]: I0319 17:04:22.983333 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-cm55x"] Mar 19 17:04:23 crc kubenswrapper[4918]: I0319 17:04:23.691765 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-cm55x" event={"ID":"79829597-ef66-4d6f-946f-adaa9ec3d227","Type":"ContainerStarted","Data":"1a4ab0f85b6dee62eeb72c6ad024a5912ff65209b3d79553a824c4a95fdbe9d4"} Mar 19 17:04:23 crc kubenswrapper[4918]: I0319 17:04:23.778188 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 17:04:24 crc kubenswrapper[4918]: I0319 17:04:24.671685 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 17:04:25 crc kubenswrapper[4918]: I0319 17:04:25.143514 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:04:25 crc kubenswrapper[4918]: I0319 17:04:25.143881 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4e695151-c2fe-4d41-9150-f10045d9dad1" containerName="ceilometer-central-agent" containerID="cri-o://f4b7be49033d5a557f443f579013fb1218dfd44d172fbdc7a61c04810e4d061b" gracePeriod=30 Mar 19 17:04:25 crc kubenswrapper[4918]: I0319 17:04:25.144032 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="4e695151-c2fe-4d41-9150-f10045d9dad1" containerName="ceilometer-notification-agent" containerID="cri-o://837bc1327efc27c5d4d9f074535687a91b04ea85ce9955958c2e28215a465d32" gracePeriod=30 Mar 19 17:04:25 crc kubenswrapper[4918]: I0319 17:04:25.144055 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4e695151-c2fe-4d41-9150-f10045d9dad1" containerName="proxy-httpd" containerID="cri-o://13802e6e1b3dd901b9b43dce2fe2c79fdcf8e191b786d9bd92e7fbcdb3a05ed6" gracePeriod=30 Mar 19 17:04:25 crc kubenswrapper[4918]: I0319 17:04:25.143996 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4e695151-c2fe-4d41-9150-f10045d9dad1" containerName="sg-core" containerID="cri-o://9f50579f6c2ef75f8ddaf4528701fa21415fd97ac36135bfcc0318a7649be29f" gracePeriod=30 Mar 19 17:04:25 crc kubenswrapper[4918]: I0319 17:04:25.741679 4918 generic.go:334] "Generic (PLEG): container finished" podID="4e695151-c2fe-4d41-9150-f10045d9dad1" containerID="13802e6e1b3dd901b9b43dce2fe2c79fdcf8e191b786d9bd92e7fbcdb3a05ed6" exitCode=0 Mar 19 17:04:25 crc kubenswrapper[4918]: I0319 17:04:25.742856 4918 generic.go:334] "Generic (PLEG): container finished" podID="4e695151-c2fe-4d41-9150-f10045d9dad1" containerID="9f50579f6c2ef75f8ddaf4528701fa21415fd97ac36135bfcc0318a7649be29f" exitCode=2 Mar 19 17:04:25 crc kubenswrapper[4918]: I0319 17:04:25.742958 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e695151-c2fe-4d41-9150-f10045d9dad1","Type":"ContainerDied","Data":"13802e6e1b3dd901b9b43dce2fe2c79fdcf8e191b786d9bd92e7fbcdb3a05ed6"} Mar 19 17:04:25 crc kubenswrapper[4918]: I0319 17:04:25.743053 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e695151-c2fe-4d41-9150-f10045d9dad1","Type":"ContainerDied","Data":"9f50579f6c2ef75f8ddaf4528701fa21415fd97ac36135bfcc0318a7649be29f"} Mar 19 17:04:25 crc 
kubenswrapper[4918]: E0319 17:04:25.933707 4918 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e695151_c2fe_4d41_9150_f10045d9dad1.slice/crio-conmon-13802e6e1b3dd901b9b43dce2fe2c79fdcf8e191b786d9bd92e7fbcdb3a05ed6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e695151_c2fe_4d41_9150_f10045d9dad1.slice/crio-13802e6e1b3dd901b9b43dce2fe2c79fdcf8e191b786d9bd92e7fbcdb3a05ed6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e695151_c2fe_4d41_9150_f10045d9dad1.slice/crio-conmon-f4b7be49033d5a557f443f579013fb1218dfd44d172fbdc7a61c04810e4d061b.scope\": RecentStats: unable to find data in memory cache]" Mar 19 17:04:26 crc kubenswrapper[4918]: I0319 17:04:26.761611 4918 generic.go:334] "Generic (PLEG): container finished" podID="4e695151-c2fe-4d41-9150-f10045d9dad1" containerID="f4b7be49033d5a557f443f579013fb1218dfd44d172fbdc7a61c04810e4d061b" exitCode=0 Mar 19 17:04:26 crc kubenswrapper[4918]: I0319 17:04:26.762328 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e695151-c2fe-4d41-9150-f10045d9dad1","Type":"ContainerDied","Data":"f4b7be49033d5a557f443f579013fb1218dfd44d172fbdc7a61c04810e4d061b"} Mar 19 17:04:28 crc kubenswrapper[4918]: I0319 17:04:28.748461 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:04:28 crc kubenswrapper[4918]: I0319 17:04:28.787140 4918 generic.go:334] "Generic (PLEG): container finished" podID="4e695151-c2fe-4d41-9150-f10045d9dad1" containerID="837bc1327efc27c5d4d9f074535687a91b04ea85ce9955958c2e28215a465d32" exitCode=0 Mar 19 17:04:28 crc kubenswrapper[4918]: I0319 17:04:28.787179 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e695151-c2fe-4d41-9150-f10045d9dad1","Type":"ContainerDied","Data":"837bc1327efc27c5d4d9f074535687a91b04ea85ce9955958c2e28215a465d32"} Mar 19 17:04:28 crc kubenswrapper[4918]: I0319 17:04:28.787201 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4e695151-c2fe-4d41-9150-f10045d9dad1","Type":"ContainerDied","Data":"4455aeee863739c688f670726638409bdfe23d3ae4ed79b19e496e5249f4a7b4"} Mar 19 17:04:28 crc kubenswrapper[4918]: I0319 17:04:28.787217 4918 scope.go:117] "RemoveContainer" containerID="13802e6e1b3dd901b9b43dce2fe2c79fdcf8e191b786d9bd92e7fbcdb3a05ed6" Mar 19 17:04:28 crc kubenswrapper[4918]: I0319 17:04:28.787342 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:04:28 crc kubenswrapper[4918]: I0319 17:04:28.823304 4918 scope.go:117] "RemoveContainer" containerID="9f50579f6c2ef75f8ddaf4528701fa21415fd97ac36135bfcc0318a7649be29f" Mar 19 17:04:28 crc kubenswrapper[4918]: I0319 17:04:28.852192 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-config-data\") pod \"4e695151-c2fe-4d41-9150-f10045d9dad1\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " Mar 19 17:04:28 crc kubenswrapper[4918]: I0319 17:04:28.852327 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7sbl\" (UniqueName: \"kubernetes.io/projected/4e695151-c2fe-4d41-9150-f10045d9dad1-kube-api-access-r7sbl\") pod \"4e695151-c2fe-4d41-9150-f10045d9dad1\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " Mar 19 17:04:28 crc kubenswrapper[4918]: I0319 17:04:28.852399 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-ceilometer-tls-certs\") pod \"4e695151-c2fe-4d41-9150-f10045d9dad1\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " Mar 19 17:04:28 crc kubenswrapper[4918]: I0319 17:04:28.852433 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-combined-ca-bundle\") pod \"4e695151-c2fe-4d41-9150-f10045d9dad1\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " Mar 19 17:04:28 crc kubenswrapper[4918]: I0319 17:04:28.852501 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-scripts\") pod \"4e695151-c2fe-4d41-9150-f10045d9dad1\" (UID: 
\"4e695151-c2fe-4d41-9150-f10045d9dad1\") " Mar 19 17:04:28 crc kubenswrapper[4918]: I0319 17:04:28.852656 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e695151-c2fe-4d41-9150-f10045d9dad1-run-httpd\") pod \"4e695151-c2fe-4d41-9150-f10045d9dad1\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " Mar 19 17:04:28 crc kubenswrapper[4918]: I0319 17:04:28.852741 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-sg-core-conf-yaml\") pod \"4e695151-c2fe-4d41-9150-f10045d9dad1\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " Mar 19 17:04:28 crc kubenswrapper[4918]: I0319 17:04:28.852772 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e695151-c2fe-4d41-9150-f10045d9dad1-log-httpd\") pod \"4e695151-c2fe-4d41-9150-f10045d9dad1\" (UID: \"4e695151-c2fe-4d41-9150-f10045d9dad1\") " Mar 19 17:04:28 crc kubenswrapper[4918]: I0319 17:04:28.854854 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e695151-c2fe-4d41-9150-f10045d9dad1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4e695151-c2fe-4d41-9150-f10045d9dad1" (UID: "4e695151-c2fe-4d41-9150-f10045d9dad1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:04:28 crc kubenswrapper[4918]: I0319 17:04:28.856021 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e695151-c2fe-4d41-9150-f10045d9dad1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4e695151-c2fe-4d41-9150-f10045d9dad1" (UID: "4e695151-c2fe-4d41-9150-f10045d9dad1"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:04:28 crc kubenswrapper[4918]: I0319 17:04:28.863462 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-scripts" (OuterVolumeSpecName: "scripts") pod "4e695151-c2fe-4d41-9150-f10045d9dad1" (UID: "4e695151-c2fe-4d41-9150-f10045d9dad1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:04:28 crc kubenswrapper[4918]: I0319 17:04:28.867040 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e695151-c2fe-4d41-9150-f10045d9dad1-kube-api-access-r7sbl" (OuterVolumeSpecName: "kube-api-access-r7sbl") pod "4e695151-c2fe-4d41-9150-f10045d9dad1" (UID: "4e695151-c2fe-4d41-9150-f10045d9dad1"). InnerVolumeSpecName "kube-api-access-r7sbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:04:28 crc kubenswrapper[4918]: I0319 17:04:28.876911 4918 scope.go:117] "RemoveContainer" containerID="837bc1327efc27c5d4d9f074535687a91b04ea85ce9955958c2e28215a465d32" Mar 19 17:04:28 crc kubenswrapper[4918]: I0319 17:04:28.887862 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="049bc86c-2172-4f37-b7b4-20e546c273e4" containerName="rabbitmq" containerID="cri-o://1590419da13d0aa0b5f985aa7ddf4cf89cfa843d2aa16978885275772527f724" gracePeriod=604795 Mar 19 17:04:28 crc kubenswrapper[4918]: I0319 17:04:28.913925 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4e695151-c2fe-4d41-9150-f10045d9dad1" (UID: "4e695151-c2fe-4d41-9150-f10045d9dad1"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:04:28 crc kubenswrapper[4918]: I0319 17:04:28.956690 4918 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e695151-c2fe-4d41-9150-f10045d9dad1-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 17:04:28 crc kubenswrapper[4918]: I0319 17:04:28.956972 4918 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 17:04:28 crc kubenswrapper[4918]: I0319 17:04:28.957054 4918 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4e695151-c2fe-4d41-9150-f10045d9dad1-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 17:04:28 crc kubenswrapper[4918]: I0319 17:04:28.957147 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7sbl\" (UniqueName: \"kubernetes.io/projected/4e695151-c2fe-4d41-9150-f10045d9dad1-kube-api-access-r7sbl\") on node \"crc\" DevicePath \"\"" Mar 19 17:04:28 crc kubenswrapper[4918]: I0319 17:04:28.957221 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.000174 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e695151-c2fe-4d41-9150-f10045d9dad1" (UID: "4e695151-c2fe-4d41-9150-f10045d9dad1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.012245 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "4e695151-c2fe-4d41-9150-f10045d9dad1" (UID: "4e695151-c2fe-4d41-9150-f10045d9dad1"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.059050 4918 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.059088 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.059942 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-config-data" (OuterVolumeSpecName: "config-data") pod "4e695151-c2fe-4d41-9150-f10045d9dad1" (UID: "4e695151-c2fe-4d41-9150-f10045d9dad1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.146227 4918 scope.go:117] "RemoveContainer" containerID="f4b7be49033d5a557f443f579013fb1218dfd44d172fbdc7a61c04810e4d061b" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.160073 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e695151-c2fe-4d41-9150-f10045d9dad1-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.161768 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.170798 4918 scope.go:117] "RemoveContainer" containerID="13802e6e1b3dd901b9b43dce2fe2c79fdcf8e191b786d9bd92e7fbcdb3a05ed6" Mar 19 17:04:29 crc kubenswrapper[4918]: E0319 17:04:29.171879 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13802e6e1b3dd901b9b43dce2fe2c79fdcf8e191b786d9bd92e7fbcdb3a05ed6\": container with ID starting with 13802e6e1b3dd901b9b43dce2fe2c79fdcf8e191b786d9bd92e7fbcdb3a05ed6 not found: ID does not exist" containerID="13802e6e1b3dd901b9b43dce2fe2c79fdcf8e191b786d9bd92e7fbcdb3a05ed6" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.171911 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13802e6e1b3dd901b9b43dce2fe2c79fdcf8e191b786d9bd92e7fbcdb3a05ed6"} err="failed to get container status \"13802e6e1b3dd901b9b43dce2fe2c79fdcf8e191b786d9bd92e7fbcdb3a05ed6\": rpc error: code = NotFound desc = could not find container \"13802e6e1b3dd901b9b43dce2fe2c79fdcf8e191b786d9bd92e7fbcdb3a05ed6\": container with ID starting with 13802e6e1b3dd901b9b43dce2fe2c79fdcf8e191b786d9bd92e7fbcdb3a05ed6 not found: ID does not exist" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.171933 4918 scope.go:117] "RemoveContainer" 
containerID="9f50579f6c2ef75f8ddaf4528701fa21415fd97ac36135bfcc0318a7649be29f" Mar 19 17:04:29 crc kubenswrapper[4918]: E0319 17:04:29.173639 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f50579f6c2ef75f8ddaf4528701fa21415fd97ac36135bfcc0318a7649be29f\": container with ID starting with 9f50579f6c2ef75f8ddaf4528701fa21415fd97ac36135bfcc0318a7649be29f not found: ID does not exist" containerID="9f50579f6c2ef75f8ddaf4528701fa21415fd97ac36135bfcc0318a7649be29f" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.173662 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f50579f6c2ef75f8ddaf4528701fa21415fd97ac36135bfcc0318a7649be29f"} err="failed to get container status \"9f50579f6c2ef75f8ddaf4528701fa21415fd97ac36135bfcc0318a7649be29f\": rpc error: code = NotFound desc = could not find container \"9f50579f6c2ef75f8ddaf4528701fa21415fd97ac36135bfcc0318a7649be29f\": container with ID starting with 9f50579f6c2ef75f8ddaf4528701fa21415fd97ac36135bfcc0318a7649be29f not found: ID does not exist" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.173678 4918 scope.go:117] "RemoveContainer" containerID="837bc1327efc27c5d4d9f074535687a91b04ea85ce9955958c2e28215a465d32" Mar 19 17:04:29 crc kubenswrapper[4918]: E0319 17:04:29.175912 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"837bc1327efc27c5d4d9f074535687a91b04ea85ce9955958c2e28215a465d32\": container with ID starting with 837bc1327efc27c5d4d9f074535687a91b04ea85ce9955958c2e28215a465d32 not found: ID does not exist" containerID="837bc1327efc27c5d4d9f074535687a91b04ea85ce9955958c2e28215a465d32" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.175936 4918 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"837bc1327efc27c5d4d9f074535687a91b04ea85ce9955958c2e28215a465d32"} err="failed to get container status \"837bc1327efc27c5d4d9f074535687a91b04ea85ce9955958c2e28215a465d32\": rpc error: code = NotFound desc = could not find container \"837bc1327efc27c5d4d9f074535687a91b04ea85ce9955958c2e28215a465d32\": container with ID starting with 837bc1327efc27c5d4d9f074535687a91b04ea85ce9955958c2e28215a465d32 not found: ID does not exist" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.175951 4918 scope.go:117] "RemoveContainer" containerID="f4b7be49033d5a557f443f579013fb1218dfd44d172fbdc7a61c04810e4d061b" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.176456 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:04:29 crc kubenswrapper[4918]: E0319 17:04:29.176491 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4b7be49033d5a557f443f579013fb1218dfd44d172fbdc7a61c04810e4d061b\": container with ID starting with f4b7be49033d5a557f443f579013fb1218dfd44d172fbdc7a61c04810e4d061b not found: ID does not exist" containerID="f4b7be49033d5a557f443f579013fb1218dfd44d172fbdc7a61c04810e4d061b" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.176508 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4b7be49033d5a557f443f579013fb1218dfd44d172fbdc7a61c04810e4d061b"} err="failed to get container status \"f4b7be49033d5a557f443f579013fb1218dfd44d172fbdc7a61c04810e4d061b\": rpc error: code = NotFound desc = could not find container \"f4b7be49033d5a557f443f579013fb1218dfd44d172fbdc7a61c04810e4d061b\": container with ID starting with f4b7be49033d5a557f443f579013fb1218dfd44d172fbdc7a61c04810e4d061b not found: ID does not exist" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.202051 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 
17:04:29 crc kubenswrapper[4918]: E0319 17:04:29.202488 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e695151-c2fe-4d41-9150-f10045d9dad1" containerName="sg-core" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.202506 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e695151-c2fe-4d41-9150-f10045d9dad1" containerName="sg-core" Mar 19 17:04:29 crc kubenswrapper[4918]: E0319 17:04:29.202543 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e695151-c2fe-4d41-9150-f10045d9dad1" containerName="proxy-httpd" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.202549 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e695151-c2fe-4d41-9150-f10045d9dad1" containerName="proxy-httpd" Mar 19 17:04:29 crc kubenswrapper[4918]: E0319 17:04:29.202575 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e695151-c2fe-4d41-9150-f10045d9dad1" containerName="ceilometer-notification-agent" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.202581 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e695151-c2fe-4d41-9150-f10045d9dad1" containerName="ceilometer-notification-agent" Mar 19 17:04:29 crc kubenswrapper[4918]: E0319 17:04:29.202598 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e695151-c2fe-4d41-9150-f10045d9dad1" containerName="ceilometer-central-agent" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.202603 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e695151-c2fe-4d41-9150-f10045d9dad1" containerName="ceilometer-central-agent" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.202782 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e695151-c2fe-4d41-9150-f10045d9dad1" containerName="sg-core" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.202801 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e695151-c2fe-4d41-9150-f10045d9dad1" containerName="proxy-httpd" 
Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.202813 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e695151-c2fe-4d41-9150-f10045d9dad1" containerName="ceilometer-central-agent" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.202825 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e695151-c2fe-4d41-9150-f10045d9dad1" containerName="ceilometer-notification-agent" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.204623 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.210163 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.210338 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.210441 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.221461 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.366195 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvbhf\" (UniqueName: \"kubernetes.io/projected/b0189fc1-60b5-4734-a4b2-aa1714795f50-kube-api-access-wvbhf\") pod \"ceilometer-0\" (UID: \"b0189fc1-60b5-4734-a4b2-aa1714795f50\") " pod="openstack/ceilometer-0" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.366266 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0189fc1-60b5-4734-a4b2-aa1714795f50-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0189fc1-60b5-4734-a4b2-aa1714795f50\") " 
pod="openstack/ceilometer-0" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.366427 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0189fc1-60b5-4734-a4b2-aa1714795f50-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b0189fc1-60b5-4734-a4b2-aa1714795f50\") " pod="openstack/ceilometer-0" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.366467 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0189fc1-60b5-4734-a4b2-aa1714795f50-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0189fc1-60b5-4734-a4b2-aa1714795f50\") " pod="openstack/ceilometer-0" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.366535 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0189fc1-60b5-4734-a4b2-aa1714795f50-scripts\") pod \"ceilometer-0\" (UID: \"b0189fc1-60b5-4734-a4b2-aa1714795f50\") " pod="openstack/ceilometer-0" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.366571 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0189fc1-60b5-4734-a4b2-aa1714795f50-run-httpd\") pod \"ceilometer-0\" (UID: \"b0189fc1-60b5-4734-a4b2-aa1714795f50\") " pod="openstack/ceilometer-0" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.366612 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0189fc1-60b5-4734-a4b2-aa1714795f50-log-httpd\") pod \"ceilometer-0\" (UID: \"b0189fc1-60b5-4734-a4b2-aa1714795f50\") " pod="openstack/ceilometer-0" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.366644 4918 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0189fc1-60b5-4734-a4b2-aa1714795f50-config-data\") pod \"ceilometer-0\" (UID: \"b0189fc1-60b5-4734-a4b2-aa1714795f50\") " pod="openstack/ceilometer-0" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.468135 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0189fc1-60b5-4734-a4b2-aa1714795f50-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b0189fc1-60b5-4734-a4b2-aa1714795f50\") " pod="openstack/ceilometer-0" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.468186 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0189fc1-60b5-4734-a4b2-aa1714795f50-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0189fc1-60b5-4734-a4b2-aa1714795f50\") " pod="openstack/ceilometer-0" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.468224 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0189fc1-60b5-4734-a4b2-aa1714795f50-scripts\") pod \"ceilometer-0\" (UID: \"b0189fc1-60b5-4734-a4b2-aa1714795f50\") " pod="openstack/ceilometer-0" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.468250 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0189fc1-60b5-4734-a4b2-aa1714795f50-run-httpd\") pod \"ceilometer-0\" (UID: \"b0189fc1-60b5-4734-a4b2-aa1714795f50\") " pod="openstack/ceilometer-0" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.468276 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0189fc1-60b5-4734-a4b2-aa1714795f50-log-httpd\") pod \"ceilometer-0\" (UID: 
\"b0189fc1-60b5-4734-a4b2-aa1714795f50\") " pod="openstack/ceilometer-0" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.468296 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0189fc1-60b5-4734-a4b2-aa1714795f50-config-data\") pod \"ceilometer-0\" (UID: \"b0189fc1-60b5-4734-a4b2-aa1714795f50\") " pod="openstack/ceilometer-0" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.468325 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvbhf\" (UniqueName: \"kubernetes.io/projected/b0189fc1-60b5-4734-a4b2-aa1714795f50-kube-api-access-wvbhf\") pod \"ceilometer-0\" (UID: \"b0189fc1-60b5-4734-a4b2-aa1714795f50\") " pod="openstack/ceilometer-0" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.468343 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0189fc1-60b5-4734-a4b2-aa1714795f50-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0189fc1-60b5-4734-a4b2-aa1714795f50\") " pod="openstack/ceilometer-0" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.469748 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0189fc1-60b5-4734-a4b2-aa1714795f50-run-httpd\") pod \"ceilometer-0\" (UID: \"b0189fc1-60b5-4734-a4b2-aa1714795f50\") " pod="openstack/ceilometer-0" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.470296 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0189fc1-60b5-4734-a4b2-aa1714795f50-log-httpd\") pod \"ceilometer-0\" (UID: \"b0189fc1-60b5-4734-a4b2-aa1714795f50\") " pod="openstack/ceilometer-0" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.475283 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/b0189fc1-60b5-4734-a4b2-aa1714795f50-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0189fc1-60b5-4734-a4b2-aa1714795f50\") " pod="openstack/ceilometer-0" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.475462 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0189fc1-60b5-4734-a4b2-aa1714795f50-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0189fc1-60b5-4734-a4b2-aa1714795f50\") " pod="openstack/ceilometer-0" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.475642 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0189fc1-60b5-4734-a4b2-aa1714795f50-config-data\") pod \"ceilometer-0\" (UID: \"b0189fc1-60b5-4734-a4b2-aa1714795f50\") " pod="openstack/ceilometer-0" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.476146 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0189fc1-60b5-4734-a4b2-aa1714795f50-scripts\") pod \"ceilometer-0\" (UID: \"b0189fc1-60b5-4734-a4b2-aa1714795f50\") " pod="openstack/ceilometer-0" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.476170 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0189fc1-60b5-4734-a4b2-aa1714795f50-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b0189fc1-60b5-4734-a4b2-aa1714795f50\") " pod="openstack/ceilometer-0" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.495078 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvbhf\" (UniqueName: \"kubernetes.io/projected/b0189fc1-60b5-4734-a4b2-aa1714795f50-kube-api-access-wvbhf\") pod \"ceilometer-0\" (UID: \"b0189fc1-60b5-4734-a4b2-aa1714795f50\") " pod="openstack/ceilometer-0" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 
17:04:29.525562 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 17:04:29 crc kubenswrapper[4918]: I0319 17:04:29.833676 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="849ee593-de3d-4343-8a63-3ca581fbbaaf" containerName="rabbitmq" containerID="cri-o://8991ef3b76ab8f08cdc76be810108c9a6ee39bdf4e6ad0cb1d96a3c67a5c362a" gracePeriod=604795 Mar 19 17:04:30 crc kubenswrapper[4918]: I0319 17:04:30.154196 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 17:04:30 crc kubenswrapper[4918]: I0319 17:04:30.601735 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e695151-c2fe-4d41-9150-f10045d9dad1" path="/var/lib/kubelet/pods/4e695151-c2fe-4d41-9150-f10045d9dad1/volumes" Mar 19 17:04:30 crc kubenswrapper[4918]: I0319 17:04:30.823279 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0189fc1-60b5-4734-a4b2-aa1714795f50","Type":"ContainerStarted","Data":"f2183465dc69541573c07b61cf0e9b7029436816de69af3d72409e777bac5a97"} Mar 19 17:04:35 crc kubenswrapper[4918]: I0319 17:04:35.914166 4918 generic.go:334] "Generic (PLEG): container finished" podID="049bc86c-2172-4f37-b7b4-20e546c273e4" containerID="1590419da13d0aa0b5f985aa7ddf4cf89cfa843d2aa16978885275772527f724" exitCode=0 Mar 19 17:04:35 crc kubenswrapper[4918]: I0319 17:04:35.914241 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"049bc86c-2172-4f37-b7b4-20e546c273e4","Type":"ContainerDied","Data":"1590419da13d0aa0b5f985aa7ddf4cf89cfa843d2aa16978885275772527f724"} Mar 19 17:04:36 crc kubenswrapper[4918]: E0319 17:04:36.228227 4918 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod849ee593_de3d_4343_8a63_3ca581fbbaaf.slice/crio-8991ef3b76ab8f08cdc76be810108c9a6ee39bdf4e6ad0cb1d96a3c67a5c362a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod849ee593_de3d_4343_8a63_3ca581fbbaaf.slice/crio-conmon-8991ef3b76ab8f08cdc76be810108c9a6ee39bdf4e6ad0cb1d96a3c67a5c362a.scope\": RecentStats: unable to find data in memory cache]" Mar 19 17:04:36 crc kubenswrapper[4918]: I0319 17:04:36.930508 4918 generic.go:334] "Generic (PLEG): container finished" podID="849ee593-de3d-4343-8a63-3ca581fbbaaf" containerID="8991ef3b76ab8f08cdc76be810108c9a6ee39bdf4e6ad0cb1d96a3c67a5c362a" exitCode=0 Mar 19 17:04:36 crc kubenswrapper[4918]: I0319 17:04:36.930567 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"849ee593-de3d-4343-8a63-3ca581fbbaaf","Type":"ContainerDied","Data":"8991ef3b76ab8f08cdc76be810108c9a6ee39bdf4e6ad0cb1d96a3c67a5c362a"} Mar 19 17:04:37 crc kubenswrapper[4918]: I0319 17:04:37.683645 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-f2th5"] Mar 19 17:04:37 crc kubenswrapper[4918]: I0319 17:04:37.685608 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" Mar 19 17:04:37 crc kubenswrapper[4918]: I0319 17:04:37.688620 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 19 17:04:37 crc kubenswrapper[4918]: I0319 17:04:37.706176 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-f2th5"] Mar 19 17:04:37 crc kubenswrapper[4918]: I0319 17:04:37.754982 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="849ee593-de3d-4343-8a63-3ca581fbbaaf" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.110:5671: connect: connection refused" Mar 19 17:04:37 crc kubenswrapper[4918]: I0319 17:04:37.760235 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-config\") pod \"dnsmasq-dns-dbb88bf8c-f2th5\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" Mar 19 17:04:37 crc kubenswrapper[4918]: I0319 17:04:37.760313 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-f2th5\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" Mar 19 17:04:37 crc kubenswrapper[4918]: I0319 17:04:37.760431 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-f2th5\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" Mar 19 17:04:37 crc kubenswrapper[4918]: I0319 17:04:37.760465 4918 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-f2th5\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" Mar 19 17:04:37 crc kubenswrapper[4918]: I0319 17:04:37.760489 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-f2th5\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" Mar 19 17:04:37 crc kubenswrapper[4918]: I0319 17:04:37.760547 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-f2th5\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" Mar 19 17:04:37 crc kubenswrapper[4918]: I0319 17:04:37.760595 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqjz4\" (UniqueName: \"kubernetes.io/projected/7486913b-4d6b-4dde-804d-3525ac608497-kube-api-access-pqjz4\") pod \"dnsmasq-dns-dbb88bf8c-f2th5\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" Mar 19 17:04:37 crc kubenswrapper[4918]: I0319 17:04:37.862238 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-f2th5\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" Mar 19 17:04:37 crc kubenswrapper[4918]: I0319 17:04:37.862319 
4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-f2th5\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" Mar 19 17:04:37 crc kubenswrapper[4918]: I0319 17:04:37.862341 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-f2th5\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" Mar 19 17:04:37 crc kubenswrapper[4918]: I0319 17:04:37.862378 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-f2th5\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" Mar 19 17:04:37 crc kubenswrapper[4918]: I0319 17:04:37.862427 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqjz4\" (UniqueName: \"kubernetes.io/projected/7486913b-4d6b-4dde-804d-3525ac608497-kube-api-access-pqjz4\") pod \"dnsmasq-dns-dbb88bf8c-f2th5\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" Mar 19 17:04:37 crc kubenswrapper[4918]: I0319 17:04:37.862491 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-config\") pod \"dnsmasq-dns-dbb88bf8c-f2th5\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" Mar 19 17:04:37 crc kubenswrapper[4918]: I0319 17:04:37.862542 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-f2th5\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" Mar 19 17:04:37 crc kubenswrapper[4918]: I0319 17:04:37.863746 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-f2th5\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" Mar 19 17:04:37 crc kubenswrapper[4918]: I0319 17:04:37.864410 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-f2th5\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" Mar 19 17:04:37 crc kubenswrapper[4918]: I0319 17:04:37.865185 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-f2th5\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" Mar 19 17:04:37 crc kubenswrapper[4918]: I0319 17:04:37.865809 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-f2th5\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" Mar 19 17:04:37 crc kubenswrapper[4918]: I0319 17:04:37.866292 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-f2th5\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" Mar 19 17:04:37 crc kubenswrapper[4918]: I0319 17:04:37.867151 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-config\") pod \"dnsmasq-dns-dbb88bf8c-f2th5\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" Mar 19 17:04:37 crc kubenswrapper[4918]: I0319 17:04:37.887786 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqjz4\" (UniqueName: \"kubernetes.io/projected/7486913b-4d6b-4dde-804d-3525ac608497-kube-api-access-pqjz4\") pod \"dnsmasq-dns-dbb88bf8c-f2th5\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" Mar 19 17:04:38 crc kubenswrapper[4918]: I0319 17:04:38.016536 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" Mar 19 17:04:38 crc kubenswrapper[4918]: I0319 17:04:38.101122 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="049bc86c-2172-4f37-b7b4-20e546c273e4" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.111:5671: connect: connection refused" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.644151 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.654287 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.731443 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/049bc86c-2172-4f37-b7b4-20e546c273e4-erlang-cookie-secret\") pod \"049bc86c-2172-4f37-b7b4-20e546c273e4\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.731827 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/049bc86c-2172-4f37-b7b4-20e546c273e4-rabbitmq-tls\") pod \"049bc86c-2172-4f37-b7b4-20e546c273e4\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.731854 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/049bc86c-2172-4f37-b7b4-20e546c273e4-config-data\") pod \"049bc86c-2172-4f37-b7b4-20e546c273e4\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.731907 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/849ee593-de3d-4343-8a63-3ca581fbbaaf-rabbitmq-confd\") pod \"849ee593-de3d-4343-8a63-3ca581fbbaaf\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.731941 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/049bc86c-2172-4f37-b7b4-20e546c273e4-rabbitmq-erlang-cookie\") pod \"049bc86c-2172-4f37-b7b4-20e546c273e4\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.731989 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/849ee593-de3d-4343-8a63-3ca581fbbaaf-pod-info\") pod \"849ee593-de3d-4343-8a63-3ca581fbbaaf\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.732012 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/849ee593-de3d-4343-8a63-3ca581fbbaaf-config-data\") pod \"849ee593-de3d-4343-8a63-3ca581fbbaaf\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.732045 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/849ee593-de3d-4343-8a63-3ca581fbbaaf-rabbitmq-erlang-cookie\") pod \"849ee593-de3d-4343-8a63-3ca581fbbaaf\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.732116 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/049bc86c-2172-4f37-b7b4-20e546c273e4-plugins-conf\") pod \"049bc86c-2172-4f37-b7b4-20e546c273e4\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.732142 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/849ee593-de3d-4343-8a63-3ca581fbbaaf-erlang-cookie-secret\") pod \"849ee593-de3d-4343-8a63-3ca581fbbaaf\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.733398 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c4d52560-ac12-4c23-8fd0-53b7abe5035a\") pod \"849ee593-de3d-4343-8a63-3ca581fbbaaf\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " 
Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.733437 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/049bc86c-2172-4f37-b7b4-20e546c273e4-rabbitmq-plugins\") pod \"049bc86c-2172-4f37-b7b4-20e546c273e4\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.737646 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e42ce485-e6ce-4799-b932-b106c6280e82\") pod \"049bc86c-2172-4f37-b7b4-20e546c273e4\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.737711 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/049bc86c-2172-4f37-b7b4-20e546c273e4-rabbitmq-confd\") pod \"049bc86c-2172-4f37-b7b4-20e546c273e4\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.737751 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/049bc86c-2172-4f37-b7b4-20e546c273e4-server-conf\") pod \"049bc86c-2172-4f37-b7b4-20e546c273e4\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.737794 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm6q2\" (UniqueName: \"kubernetes.io/projected/849ee593-de3d-4343-8a63-3ca581fbbaaf-kube-api-access-mm6q2\") pod \"849ee593-de3d-4343-8a63-3ca581fbbaaf\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.737845 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/849ee593-de3d-4343-8a63-3ca581fbbaaf-rabbitmq-plugins\") pod \"849ee593-de3d-4343-8a63-3ca581fbbaaf\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.737870 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/849ee593-de3d-4343-8a63-3ca581fbbaaf-plugins-conf\") pod \"849ee593-de3d-4343-8a63-3ca581fbbaaf\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.737914 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/849ee593-de3d-4343-8a63-3ca581fbbaaf-server-conf\") pod \"849ee593-de3d-4343-8a63-3ca581fbbaaf\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.737958 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/849ee593-de3d-4343-8a63-3ca581fbbaaf-rabbitmq-tls\") pod \"849ee593-de3d-4343-8a63-3ca581fbbaaf\" (UID: \"849ee593-de3d-4343-8a63-3ca581fbbaaf\") " Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.738042 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/049bc86c-2172-4f37-b7b4-20e546c273e4-pod-info\") pod \"049bc86c-2172-4f37-b7b4-20e546c273e4\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.738101 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs65s\" (UniqueName: \"kubernetes.io/projected/049bc86c-2172-4f37-b7b4-20e546c273e4-kube-api-access-gs65s\") pod \"049bc86c-2172-4f37-b7b4-20e546c273e4\" (UID: \"049bc86c-2172-4f37-b7b4-20e546c273e4\") " Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 
17:04:44.749736 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/849ee593-de3d-4343-8a63-3ca581fbbaaf-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "849ee593-de3d-4343-8a63-3ca581fbbaaf" (UID: "849ee593-de3d-4343-8a63-3ca581fbbaaf"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.749757 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/049bc86c-2172-4f37-b7b4-20e546c273e4-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "049bc86c-2172-4f37-b7b4-20e546c273e4" (UID: "049bc86c-2172-4f37-b7b4-20e546c273e4"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.756837 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/049bc86c-2172-4f37-b7b4-20e546c273e4-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "049bc86c-2172-4f37-b7b4-20e546c273e4" (UID: "049bc86c-2172-4f37-b7b4-20e546c273e4"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.759752 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/849ee593-de3d-4343-8a63-3ca581fbbaaf-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "849ee593-de3d-4343-8a63-3ca581fbbaaf" (UID: "849ee593-de3d-4343-8a63-3ca581fbbaaf"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.763735 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/049bc86c-2172-4f37-b7b4-20e546c273e4-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "049bc86c-2172-4f37-b7b4-20e546c273e4" (UID: "049bc86c-2172-4f37-b7b4-20e546c273e4"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.765653 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/849ee593-de3d-4343-8a63-3ca581fbbaaf-kube-api-access-mm6q2" (OuterVolumeSpecName: "kube-api-access-mm6q2") pod "849ee593-de3d-4343-8a63-3ca581fbbaaf" (UID: "849ee593-de3d-4343-8a63-3ca581fbbaaf"). InnerVolumeSpecName "kube-api-access-mm6q2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.768361 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/049bc86c-2172-4f37-b7b4-20e546c273e4-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "049bc86c-2172-4f37-b7b4-20e546c273e4" (UID: "049bc86c-2172-4f37-b7b4-20e546c273e4"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.772550 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/849ee593-de3d-4343-8a63-3ca581fbbaaf-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "849ee593-de3d-4343-8a63-3ca581fbbaaf" (UID: "849ee593-de3d-4343-8a63-3ca581fbbaaf"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.772718 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/849ee593-de3d-4343-8a63-3ca581fbbaaf-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "849ee593-de3d-4343-8a63-3ca581fbbaaf" (UID: "849ee593-de3d-4343-8a63-3ca581fbbaaf"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.780756 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/049bc86c-2172-4f37-b7b4-20e546c273e4-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "049bc86c-2172-4f37-b7b4-20e546c273e4" (UID: "049bc86c-2172-4f37-b7b4-20e546c273e4"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.786848 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/049bc86c-2172-4f37-b7b4-20e546c273e4-kube-api-access-gs65s" (OuterVolumeSpecName: "kube-api-access-gs65s") pod "049bc86c-2172-4f37-b7b4-20e546c273e4" (UID: "049bc86c-2172-4f37-b7b4-20e546c273e4"). InnerVolumeSpecName "kube-api-access-gs65s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.792621 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/049bc86c-2172-4f37-b7b4-20e546c273e4-pod-info" (OuterVolumeSpecName: "pod-info") pod "049bc86c-2172-4f37-b7b4-20e546c273e4" (UID: "049bc86c-2172-4f37-b7b4-20e546c273e4"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.792931 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/849ee593-de3d-4343-8a63-3ca581fbbaaf-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "849ee593-de3d-4343-8a63-3ca581fbbaaf" (UID: "849ee593-de3d-4343-8a63-3ca581fbbaaf"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.803369 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/849ee593-de3d-4343-8a63-3ca581fbbaaf-pod-info" (OuterVolumeSpecName: "pod-info") pod "849ee593-de3d-4343-8a63-3ca581fbbaaf" (UID: "849ee593-de3d-4343-8a63-3ca581fbbaaf"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.841533 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm6q2\" (UniqueName: \"kubernetes.io/projected/849ee593-de3d-4343-8a63-3ca581fbbaaf-kube-api-access-mm6q2\") on node \"crc\" DevicePath \"\"" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.841575 4918 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/849ee593-de3d-4343-8a63-3ca581fbbaaf-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.841587 4918 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/849ee593-de3d-4343-8a63-3ca581fbbaaf-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.841598 4918 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/849ee593-de3d-4343-8a63-3ca581fbbaaf-rabbitmq-tls\") on node \"crc\" 
DevicePath \"\"" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.841622 4918 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/049bc86c-2172-4f37-b7b4-20e546c273e4-pod-info\") on node \"crc\" DevicePath \"\"" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.841637 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs65s\" (UniqueName: \"kubernetes.io/projected/049bc86c-2172-4f37-b7b4-20e546c273e4-kube-api-access-gs65s\") on node \"crc\" DevicePath \"\"" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.841646 4918 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/049bc86c-2172-4f37-b7b4-20e546c273e4-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.841654 4918 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/049bc86c-2172-4f37-b7b4-20e546c273e4-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.841662 4918 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/049bc86c-2172-4f37-b7b4-20e546c273e4-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.841670 4918 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/849ee593-de3d-4343-8a63-3ca581fbbaaf-pod-info\") on node \"crc\" DevicePath \"\"" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.841680 4918 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/849ee593-de3d-4343-8a63-3ca581fbbaaf-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.841688 4918 
reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/049bc86c-2172-4f37-b7b4-20e546c273e4-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.841697 4918 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/849ee593-de3d-4343-8a63-3ca581fbbaaf-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.841705 4918 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/049bc86c-2172-4f37-b7b4-20e546c273e4-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.847029 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e42ce485-e6ce-4799-b932-b106c6280e82" (OuterVolumeSpecName: "persistence") pod "049bc86c-2172-4f37-b7b4-20e546c273e4" (UID: "049bc86c-2172-4f37-b7b4-20e546c273e4"). InnerVolumeSpecName "pvc-e42ce485-e6ce-4799-b932-b106c6280e82". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.864903 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c4d52560-ac12-4c23-8fd0-53b7abe5035a" (OuterVolumeSpecName: "persistence") pod "849ee593-de3d-4343-8a63-3ca581fbbaaf" (UID: "849ee593-de3d-4343-8a63-3ca581fbbaaf"). InnerVolumeSpecName "pvc-c4d52560-ac12-4c23-8fd0-53b7abe5035a". 
PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.906347 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/849ee593-de3d-4343-8a63-3ca581fbbaaf-config-data" (OuterVolumeSpecName: "config-data") pod "849ee593-de3d-4343-8a63-3ca581fbbaaf" (UID: "849ee593-de3d-4343-8a63-3ca581fbbaaf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.954172 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/849ee593-de3d-4343-8a63-3ca581fbbaaf-config-data\") on node \"crc\" DevicePath \"\""
Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.954236 4918 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c4d52560-ac12-4c23-8fd0-53b7abe5035a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c4d52560-ac12-4c23-8fd0-53b7abe5035a\") on node \"crc\" "
Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.954259 4918 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-e42ce485-e6ce-4799-b932-b106c6280e82\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e42ce485-e6ce-4799-b932-b106c6280e82\") on node \"crc\" "
Mar 19 17:04:44 crc kubenswrapper[4918]: I0319 17:04:44.968579 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/849ee593-de3d-4343-8a63-3ca581fbbaaf-server-conf" (OuterVolumeSpecName: "server-conf") pod "849ee593-de3d-4343-8a63-3ca581fbbaaf" (UID: "849ee593-de3d-4343-8a63-3ca581fbbaaf"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:44.998941 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/049bc86c-2172-4f37-b7b4-20e546c273e4-config-data" (OuterVolumeSpecName: "config-data") pod "049bc86c-2172-4f37-b7b4-20e546c273e4" (UID: "049bc86c-2172-4f37-b7b4-20e546c273e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.033804 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/049bc86c-2172-4f37-b7b4-20e546c273e4-server-conf" (OuterVolumeSpecName: "server-conf") pod "049bc86c-2172-4f37-b7b4-20e546c273e4" (UID: "049bc86c-2172-4f37-b7b4-20e546c273e4"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.060077 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"849ee593-de3d-4343-8a63-3ca581fbbaaf","Type":"ContainerDied","Data":"01912381b0f1752f47c92a24234abee24d01d24cce9470553e9a3a83ea942d22"}
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.060148 4918 scope.go:117] "RemoveContainer" containerID="8991ef3b76ab8f08cdc76be810108c9a6ee39bdf4e6ad0cb1d96a3c67a5c362a"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.060405 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.071433 4918 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/049bc86c-2172-4f37-b7b4-20e546c273e4-server-conf\") on node \"crc\" DevicePath \"\""
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.071891 4918 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/849ee593-de3d-4343-8a63-3ca581fbbaaf-server-conf\") on node \"crc\" DevicePath \"\""
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.071907 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/049bc86c-2172-4f37-b7b4-20e546c273e4-config-data\") on node \"crc\" DevicePath \"\""
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.078089 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"049bc86c-2172-4f37-b7b4-20e546c273e4","Type":"ContainerDied","Data":"fba516ba3b8734de1a75f21837a3b329a1318b670440ca5f7bff899f1e78de3c"}
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.078253 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.132959 4918 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.133119 4918 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c4d52560-ac12-4c23-8fd0-53b7abe5035a" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c4d52560-ac12-4c23-8fd0-53b7abe5035a") on node "crc"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.159059 4918 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.159189 4918 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-e42ce485-e6ce-4799-b932-b106c6280e82" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e42ce485-e6ce-4799-b932-b106c6280e82") on node "crc"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.183353 4918 reconciler_common.go:293] "Volume detached for volume \"pvc-e42ce485-e6ce-4799-b932-b106c6280e82\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e42ce485-e6ce-4799-b932-b106c6280e82\") on node \"crc\" DevicePath \"\""
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.183385 4918 reconciler_common.go:293] "Volume detached for volume \"pvc-c4d52560-ac12-4c23-8fd0-53b7abe5035a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c4d52560-ac12-4c23-8fd0-53b7abe5035a\") on node \"crc\" DevicePath \"\""
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.185300 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/849ee593-de3d-4343-8a63-3ca581fbbaaf-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "849ee593-de3d-4343-8a63-3ca581fbbaaf" (UID: "849ee593-de3d-4343-8a63-3ca581fbbaaf"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.278316 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/049bc86c-2172-4f37-b7b4-20e546c273e4-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "049bc86c-2172-4f37-b7b4-20e546c273e4" (UID: "049bc86c-2172-4f37-b7b4-20e546c273e4"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.285592 4918 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/849ee593-de3d-4343-8a63-3ca581fbbaaf-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.285627 4918 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/049bc86c-2172-4f37-b7b4-20e546c273e4-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.416958 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.441594 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.454770 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.470601 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.485891 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 19 17:04:45 crc kubenswrapper[4918]: E0319 17:04:45.486338 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="049bc86c-2172-4f37-b7b4-20e546c273e4" containerName="setup-container"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.486356 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="049bc86c-2172-4f37-b7b4-20e546c273e4" containerName="setup-container"
Mar 19 17:04:45 crc kubenswrapper[4918]: E0319 17:04:45.486387 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="049bc86c-2172-4f37-b7b4-20e546c273e4" containerName="rabbitmq"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.486394 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="049bc86c-2172-4f37-b7b4-20e546c273e4" containerName="rabbitmq"
Mar 19 17:04:45 crc kubenswrapper[4918]: E0319 17:04:45.486406 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="849ee593-de3d-4343-8a63-3ca581fbbaaf" containerName="rabbitmq"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.486411 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="849ee593-de3d-4343-8a63-3ca581fbbaaf" containerName="rabbitmq"
Mar 19 17:04:45 crc kubenswrapper[4918]: E0319 17:04:45.486421 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="849ee593-de3d-4343-8a63-3ca581fbbaaf" containerName="setup-container"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.486427 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="849ee593-de3d-4343-8a63-3ca581fbbaaf" containerName="setup-container"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.486651 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="849ee593-de3d-4343-8a63-3ca581fbbaaf" containerName="rabbitmq"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.486666 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="049bc86c-2172-4f37-b7b4-20e546c273e4" containerName="rabbitmq"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.487818 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.492787 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.494601 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.504955 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.506828 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.525996 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.526252 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.526473 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.526634 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.526738 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-xdp5h"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.527113 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.527284 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.529212 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.536581 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.541856 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.552654 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-sc5lh"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.552830 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.552958 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.561313 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.590772 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.590815 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/025d722c-5115-4aae-bebd-3942f7da690d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.590843 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.590874 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/025d722c-5115-4aae-bebd-3942f7da690d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.590892 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/025d722c-5115-4aae-bebd-3942f7da690d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.590913 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e42ce485-e6ce-4799-b932-b106c6280e82\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e42ce485-e6ce-4799-b932-b106c6280e82\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.590929 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/025d722c-5115-4aae-bebd-3942f7da690d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.590949 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/025d722c-5115-4aae-bebd-3942f7da690d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.590966 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ss6w\" (UniqueName: \"kubernetes.io/projected/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-kube-api-access-7ss6w\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.590983 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/025d722c-5115-4aae-bebd-3942f7da690d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.591002 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/025d722c-5115-4aae-bebd-3942f7da690d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.591034 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5fsz\" (UniqueName: \"kubernetes.io/projected/025d722c-5115-4aae-bebd-3942f7da690d-kube-api-access-w5fsz\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.591051 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.591080 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-config-data\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.591094 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.591129 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.591151 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.591183 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/025d722c-5115-4aae-bebd-3942f7da690d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.591198 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/025d722c-5115-4aae-bebd-3942f7da690d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.591222 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c4d52560-ac12-4c23-8fd0-53b7abe5035a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c4d52560-ac12-4c23-8fd0-53b7abe5035a\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.591245 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.591262 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.695049 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.695111 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/025d722c-5115-4aae-bebd-3942f7da690d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.695133 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/025d722c-5115-4aae-bebd-3942f7da690d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.695159 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e42ce485-e6ce-4799-b932-b106c6280e82\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e42ce485-e6ce-4799-b932-b106c6280e82\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.695180 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/025d722c-5115-4aae-bebd-3942f7da690d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.695202 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/025d722c-5115-4aae-bebd-3942f7da690d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.695220 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ss6w\" (UniqueName: \"kubernetes.io/projected/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-kube-api-access-7ss6w\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.695242 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/025d722c-5115-4aae-bebd-3942f7da690d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.695269 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/025d722c-5115-4aae-bebd-3942f7da690d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.695310 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5fsz\" (UniqueName: \"kubernetes.io/projected/025d722c-5115-4aae-bebd-3942f7da690d-kube-api-access-w5fsz\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.695336 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.695379 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-config-data\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.695402 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.695457 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.695485 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.695542 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/025d722c-5115-4aae-bebd-3942f7da690d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.695558 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/025d722c-5115-4aae-bebd-3942f7da690d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.695586 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c4d52560-ac12-4c23-8fd0-53b7abe5035a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c4d52560-ac12-4c23-8fd0-53b7abe5035a\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.695614 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.695637 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.695672 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.695694 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/025d722c-5115-4aae-bebd-3942f7da690d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.697338 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.702395 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/025d722c-5115-4aae-bebd-3942f7da690d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.704412 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/025d722c-5115-4aae-bebd-3942f7da690d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.704673 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.705203 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/025d722c-5115-4aae-bebd-3942f7da690d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.705333 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-config-data\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.705553 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/025d722c-5115-4aae-bebd-3942f7da690d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.706979 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.708753 4918 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.708796 4918 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c4d52560-ac12-4c23-8fd0-53b7abe5035a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c4d52560-ac12-4c23-8fd0-53b7abe5035a\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f0fb5bb69065b19e05ed9740b98b0aaf97ec80d366f84bfe0013a4f3714e457d/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.708811 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/025d722c-5115-4aae-bebd-3942f7da690d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.709430 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/025d722c-5115-4aae-bebd-3942f7da690d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.709866 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.716173 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.724196 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.724894 4918 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.724923 4918 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e42ce485-e6ce-4799-b932-b106c6280e82\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e42ce485-e6ce-4799-b932-b106c6280e82\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a3db0c9d310d3a68815aa4d626186578650c2baeddb5f523145e0eb7b8c277ef/globalmount\"" pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.729217 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.730189 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.737121 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/025d722c-5115-4aae-bebd-3942f7da690d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.744412 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ss6w\" (UniqueName: \"kubernetes.io/projected/5cf3eb1c-8f65-4460-8283-dcdbe5d51e50-kube-api-access-7ss6w\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.745473 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/025d722c-5115-4aae-bebd-3942f7da690d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.749734 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/025d722c-5115-4aae-bebd-3942f7da690d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.753556 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5fsz\" (UniqueName: \"kubernetes.io/projected/025d722c-5115-4aae-bebd-3942f7da690d-kube-api-access-w5fsz\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.942859 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e42ce485-e6ce-4799-b932-b106c6280e82\" (UniqueName:
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e42ce485-e6ce-4799-b932-b106c6280e82\") pod \"rabbitmq-server-0\" (UID: \"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50\") " pod="openstack/rabbitmq-server-0" Mar 19 17:04:45 crc kubenswrapper[4918]: I0319 17:04:45.983433 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c4d52560-ac12-4c23-8fd0-53b7abe5035a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c4d52560-ac12-4c23-8fd0-53b7abe5035a\") pod \"rabbitmq-cell1-server-0\" (UID: \"025d722c-5115-4aae-bebd-3942f7da690d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 17:04:46 crc kubenswrapper[4918]: I0319 17:04:46.193261 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 17:04:46 crc kubenswrapper[4918]: I0319 17:04:46.213925 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 17:04:46 crc kubenswrapper[4918]: I0319 17:04:46.600890 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="049bc86c-2172-4f37-b7b4-20e546c273e4" path="/var/lib/kubelet/pods/049bc86c-2172-4f37-b7b4-20e546c273e4/volumes" Mar 19 17:04:46 crc kubenswrapper[4918]: I0319 17:04:46.602549 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="849ee593-de3d-4343-8a63-3ca581fbbaaf" path="/var/lib/kubelet/pods/849ee593-de3d-4343-8a63-3ca581fbbaaf/volumes" Mar 19 17:04:53 crc kubenswrapper[4918]: E0319 17:04:53.657870 4918 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current-tested" Mar 19 17:04:53 crc kubenswrapper[4918]: E0319 17:04:53.658440 4918 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current-tested" Mar 19 17:04:53 crc kubenswrapper[4918]: E0319 17:04:53.658606 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current-tested,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x8fj4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:ni
l,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-cm55x_openstack(79829597-ef66-4d6f-946f-adaa9ec3d227): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 17:04:53 crc kubenswrapper[4918]: E0319 17:04:53.659798 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cloudkitty-db-sync-cm55x" podUID="79829597-ef66-4d6f-946f-adaa9ec3d227" Mar 19 17:04:53 crc kubenswrapper[4918]: I0319 17:04:53.978981 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-f2th5"] Mar 19 17:04:54 crc kubenswrapper[4918]: E0319 17:04:54.219689 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current-tested\\\"\"" pod="openstack/cloudkitty-db-sync-cm55x" podUID="79829597-ef66-4d6f-946f-adaa9ec3d227" Mar 19 17:04:54 crc kubenswrapper[4918]: I0319 17:04:54.373110 4918 scope.go:117] "RemoveContainer" containerID="8dcfdc5ec03c9e57cd90b1af5c63e724835e77f75667c64ea36e6d4de0de6025" Mar 19 17:04:54 crc kubenswrapper[4918]: I0319 17:04:54.423540 4918 scope.go:117] "RemoveContainer" 
containerID="1590419da13d0aa0b5f985aa7ddf4cf89cfa843d2aa16978885275772527f724" Mar 19 17:04:54 crc kubenswrapper[4918]: E0319 17:04:54.571622 4918 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Mar 19 17:04:54 crc kubenswrapper[4918]: E0319 17:04:54.571675 4918 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Mar 19 17:04:54 crc kubenswrapper[4918]: E0319 17:04:54.571785 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nd5h646h76h86h589h689h79h687h5h56fh67bh7bh86h88h66fh5fh598hd7h76h5d4hcch54fhcbh5cbh679h4h65h649h5bch594h595hf5q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountP
ropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wvbhf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(b0189fc1-60b5-4734-a4b2-aa1714795f50): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 17:04:54 crc kubenswrapper[4918]: I0319 17:04:54.617339 4918 scope.go:117] "RemoveContainer" containerID="3def5dec798f2b559b2334d375c417119107e526427d467805fc3b41126d7aca" Mar 19 17:04:55 crc kubenswrapper[4918]: I0319 17:04:55.200586 4918 generic.go:334] "Generic (PLEG): container finished" podID="7486913b-4d6b-4dde-804d-3525ac608497" containerID="65fa08bc2e1272564937199067d76a8b221f320c1fb84bcd2c542128f815c9b7" exitCode=0 Mar 19 17:04:55 crc kubenswrapper[4918]: I0319 17:04:55.200722 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" 
event={"ID":"7486913b-4d6b-4dde-804d-3525ac608497","Type":"ContainerDied","Data":"65fa08bc2e1272564937199067d76a8b221f320c1fb84bcd2c542128f815c9b7"} Mar 19 17:04:55 crc kubenswrapper[4918]: I0319 17:04:55.200871 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" event={"ID":"7486913b-4d6b-4dde-804d-3525ac608497","Type":"ContainerStarted","Data":"b87f7a89125188229b0311582fb536f0b51105a75e59cb7d1b09ea76b6fb7a53"} Mar 19 17:04:55 crc kubenswrapper[4918]: W0319 17:04:55.281720 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod025d722c_5115_4aae_bebd_3942f7da690d.slice/crio-4de6492e5651a3ce0153e34aa91786553ff13355df9e7d4d2c150c817a5c35de WatchSource:0}: Error finding container 4de6492e5651a3ce0153e34aa91786553ff13355df9e7d4d2c150c817a5c35de: Status 404 returned error can't find the container with id 4de6492e5651a3ce0153e34aa91786553ff13355df9e7d4d2c150c817a5c35de Mar 19 17:04:55 crc kubenswrapper[4918]: I0319 17:04:55.293583 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 17:04:55 crc kubenswrapper[4918]: I0319 17:04:55.303551 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 17:04:56 crc kubenswrapper[4918]: I0319 17:04:56.211508 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50","Type":"ContainerStarted","Data":"4f6cb7761601017be65cfbaa2f3cb17347c36ea6bd59e8d19ff2b2b0284e0936"} Mar 19 17:04:56 crc kubenswrapper[4918]: I0319 17:04:56.212809 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"025d722c-5115-4aae-bebd-3942f7da690d","Type":"ContainerStarted","Data":"4de6492e5651a3ce0153e34aa91786553ff13355df9e7d4d2c150c817a5c35de"} Mar 19 17:04:56 crc kubenswrapper[4918]: I0319 17:04:56.215573 4918 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" event={"ID":"7486913b-4d6b-4dde-804d-3525ac608497","Type":"ContainerStarted","Data":"3651a39c41b2c8c62e212aebe04b488427a81062da2ef38daeeb117fcec56bc8"} Mar 19 17:04:56 crc kubenswrapper[4918]: I0319 17:04:56.215776 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" Mar 19 17:04:56 crc kubenswrapper[4918]: I0319 17:04:56.241663 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" podStartSLOduration=19.241642686 podStartE2EDuration="19.241642686s" podCreationTimestamp="2026-03-19 17:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:04:56.240909546 +0000 UTC m=+1508.363108804" watchObservedRunningTime="2026-03-19 17:04:56.241642686 +0000 UTC m=+1508.363841934" Mar 19 17:04:57 crc kubenswrapper[4918]: I0319 17:04:57.227356 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"025d722c-5115-4aae-bebd-3942f7da690d","Type":"ContainerStarted","Data":"9076b62cf6edd0836c937d2b17c97df9da431a990feced676f64b6936e0e24c6"} Mar 19 17:04:57 crc kubenswrapper[4918]: I0319 17:04:57.229396 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0189fc1-60b5-4734-a4b2-aa1714795f50","Type":"ContainerStarted","Data":"5b8906f514e496a54cce7a029c9e981e35941a87d4a095462a541978a1982a7d"} Mar 19 17:04:57 crc kubenswrapper[4918]: I0319 17:04:57.231056 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50","Type":"ContainerStarted","Data":"8a76235a4c8dc28d930bbde142db1c543124492c040b5c741caddbd756dcf162"} Mar 19 17:04:58 crc kubenswrapper[4918]: I0319 17:04:58.212419 4918 patch_prober.go:28] 
interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:04:58 crc kubenswrapper[4918]: I0319 17:04:58.212802 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:04:58 crc kubenswrapper[4918]: I0319 17:04:58.244295 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0189fc1-60b5-4734-a4b2-aa1714795f50","Type":"ContainerStarted","Data":"e7cba4e3ca1d04f2778004330eec3f0c37c8c066d8efadf5eb854e2c51bcba6c"} Mar 19 17:05:01 crc kubenswrapper[4918]: I0319 17:05:01.271221 4918 scope.go:117] "RemoveContainer" containerID="c0dcd069a9d5237045e8b71001f7a58ce8822b0ab47acff231f1095eab3c6168" Mar 19 17:05:02 crc kubenswrapper[4918]: E0319 17:05:02.065411 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="b0189fc1-60b5-4734-a4b2-aa1714795f50" Mar 19 17:05:02 crc kubenswrapper[4918]: I0319 17:05:02.720549 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0189fc1-60b5-4734-a4b2-aa1714795f50","Type":"ContainerStarted","Data":"c409f1a1a68aee58196fc0c1bc6e4250150bdfc0c441d0fbab5d79a4fc4a2b49"} Mar 19 17:05:02 crc kubenswrapper[4918]: I0319 17:05:02.720686 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 17:05:02 crc kubenswrapper[4918]: E0319 
17:05:02.722063 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="b0189fc1-60b5-4734-a4b2-aa1714795f50" Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.018878 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.080767 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-8xw6t"] Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.081010 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" podUID="bb03378b-77b3-44c8-97eb-558868797b23" containerName="dnsmasq-dns" containerID="cri-o://1416871aa3f03c85b78ec34a93f66e562966e8e95eb12cd7a99f7c5d4afb93dc" gracePeriod=10 Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.289041 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-ktkwd"] Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.291617 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.319715 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-ktkwd"] Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.421152 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d431095e-2467-47c8-8288-ef25e31bed1e-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-ktkwd\" (UID: \"d431095e-2467-47c8-8288-ef25e31bed1e\") " pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.421229 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d431095e-2467-47c8-8288-ef25e31bed1e-dns-svc\") pod \"dnsmasq-dns-85f64749dc-ktkwd\" (UID: \"d431095e-2467-47c8-8288-ef25e31bed1e\") " pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.421249 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5knh6\" (UniqueName: \"kubernetes.io/projected/d431095e-2467-47c8-8288-ef25e31bed1e-kube-api-access-5knh6\") pod \"dnsmasq-dns-85f64749dc-ktkwd\" (UID: \"d431095e-2467-47c8-8288-ef25e31bed1e\") " pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.421271 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d431095e-2467-47c8-8288-ef25e31bed1e-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-ktkwd\" (UID: \"d431095e-2467-47c8-8288-ef25e31bed1e\") " pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.421311 4918 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d431095e-2467-47c8-8288-ef25e31bed1e-config\") pod \"dnsmasq-dns-85f64749dc-ktkwd\" (UID: \"d431095e-2467-47c8-8288-ef25e31bed1e\") " pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.421444 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d431095e-2467-47c8-8288-ef25e31bed1e-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-ktkwd\" (UID: \"d431095e-2467-47c8-8288-ef25e31bed1e\") " pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.421472 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d431095e-2467-47c8-8288-ef25e31bed1e-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-ktkwd\" (UID: \"d431095e-2467-47c8-8288-ef25e31bed1e\") " pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.523878 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d431095e-2467-47c8-8288-ef25e31bed1e-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-ktkwd\" (UID: \"d431095e-2467-47c8-8288-ef25e31bed1e\") " pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.523986 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d431095e-2467-47c8-8288-ef25e31bed1e-dns-svc\") pod \"dnsmasq-dns-85f64749dc-ktkwd\" (UID: \"d431095e-2467-47c8-8288-ef25e31bed1e\") " pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.524014 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5knh6\" (UniqueName: \"kubernetes.io/projected/d431095e-2467-47c8-8288-ef25e31bed1e-kube-api-access-5knh6\") pod \"dnsmasq-dns-85f64749dc-ktkwd\" (UID: \"d431095e-2467-47c8-8288-ef25e31bed1e\") " pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.524059 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d431095e-2467-47c8-8288-ef25e31bed1e-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-ktkwd\" (UID: \"d431095e-2467-47c8-8288-ef25e31bed1e\") " pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.524117 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d431095e-2467-47c8-8288-ef25e31bed1e-config\") pod \"dnsmasq-dns-85f64749dc-ktkwd\" (UID: \"d431095e-2467-47c8-8288-ef25e31bed1e\") " pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.524337 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d431095e-2467-47c8-8288-ef25e31bed1e-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-ktkwd\" (UID: \"d431095e-2467-47c8-8288-ef25e31bed1e\") " pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.524388 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d431095e-2467-47c8-8288-ef25e31bed1e-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-ktkwd\" (UID: \"d431095e-2467-47c8-8288-ef25e31bed1e\") " pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.524821 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d431095e-2467-47c8-8288-ef25e31bed1e-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-ktkwd\" (UID: \"d431095e-2467-47c8-8288-ef25e31bed1e\") " pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.524831 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d431095e-2467-47c8-8288-ef25e31bed1e-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-ktkwd\" (UID: \"d431095e-2467-47c8-8288-ef25e31bed1e\") " pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.525190 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d431095e-2467-47c8-8288-ef25e31bed1e-dns-svc\") pod \"dnsmasq-dns-85f64749dc-ktkwd\" (UID: \"d431095e-2467-47c8-8288-ef25e31bed1e\") " pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.525224 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d431095e-2467-47c8-8288-ef25e31bed1e-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-ktkwd\" (UID: \"d431095e-2467-47c8-8288-ef25e31bed1e\") " pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.525234 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d431095e-2467-47c8-8288-ef25e31bed1e-config\") pod \"dnsmasq-dns-85f64749dc-ktkwd\" (UID: \"d431095e-2467-47c8-8288-ef25e31bed1e\") " pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.525248 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/d431095e-2467-47c8-8288-ef25e31bed1e-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-ktkwd\" (UID: \"d431095e-2467-47c8-8288-ef25e31bed1e\") " pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.549222 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5knh6\" (UniqueName: \"kubernetes.io/projected/d431095e-2467-47c8-8288-ef25e31bed1e-kube-api-access-5knh6\") pod \"dnsmasq-dns-85f64749dc-ktkwd\" (UID: \"d431095e-2467-47c8-8288-ef25e31bed1e\") " pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.627407 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.816062 4918 generic.go:334] "Generic (PLEG): container finished" podID="bb03378b-77b3-44c8-97eb-558868797b23" containerID="1416871aa3f03c85b78ec34a93f66e562966e8e95eb12cd7a99f7c5d4afb93dc" exitCode=0 Mar 19 17:05:03 crc kubenswrapper[4918]: I0319 17:05:03.817176 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" event={"ID":"bb03378b-77b3-44c8-97eb-558868797b23","Type":"ContainerDied","Data":"1416871aa3f03c85b78ec34a93f66e562966e8e95eb12cd7a99f7c5d4afb93dc"} Mar 19 17:05:03 crc kubenswrapper[4918]: E0319 17:05:03.824761 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="b0189fc1-60b5-4734-a4b2-aa1714795f50" Mar 19 17:05:04 crc kubenswrapper[4918]: I0319 17:05:04.108917 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" Mar 19 17:05:04 crc kubenswrapper[4918]: I0319 17:05:04.242439 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-ktkwd"] Mar 19 17:05:04 crc kubenswrapper[4918]: I0319 17:05:04.253607 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-ovsdbserver-sb\") pod \"bb03378b-77b3-44c8-97eb-558868797b23\" (UID: \"bb03378b-77b3-44c8-97eb-558868797b23\") " Mar 19 17:05:04 crc kubenswrapper[4918]: I0319 17:05:04.253695 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-dns-swift-storage-0\") pod \"bb03378b-77b3-44c8-97eb-558868797b23\" (UID: \"bb03378b-77b3-44c8-97eb-558868797b23\") " Mar 19 17:05:04 crc kubenswrapper[4918]: I0319 17:05:04.253810 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-ovsdbserver-nb\") pod \"bb03378b-77b3-44c8-97eb-558868797b23\" (UID: \"bb03378b-77b3-44c8-97eb-558868797b23\") " Mar 19 17:05:04 crc kubenswrapper[4918]: I0319 17:05:04.253925 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftmdv\" (UniqueName: \"kubernetes.io/projected/bb03378b-77b3-44c8-97eb-558868797b23-kube-api-access-ftmdv\") pod \"bb03378b-77b3-44c8-97eb-558868797b23\" (UID: \"bb03378b-77b3-44c8-97eb-558868797b23\") " Mar 19 17:05:04 crc kubenswrapper[4918]: I0319 17:05:04.253983 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-config\") pod \"bb03378b-77b3-44c8-97eb-558868797b23\" (UID: 
\"bb03378b-77b3-44c8-97eb-558868797b23\") " Mar 19 17:05:04 crc kubenswrapper[4918]: I0319 17:05:04.254028 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-dns-svc\") pod \"bb03378b-77b3-44c8-97eb-558868797b23\" (UID: \"bb03378b-77b3-44c8-97eb-558868797b23\") " Mar 19 17:05:04 crc kubenswrapper[4918]: I0319 17:05:04.267450 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb03378b-77b3-44c8-97eb-558868797b23-kube-api-access-ftmdv" (OuterVolumeSpecName: "kube-api-access-ftmdv") pod "bb03378b-77b3-44c8-97eb-558868797b23" (UID: "bb03378b-77b3-44c8-97eb-558868797b23"). InnerVolumeSpecName "kube-api-access-ftmdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:05:04 crc kubenswrapper[4918]: I0319 17:05:04.356138 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bb03378b-77b3-44c8-97eb-558868797b23" (UID: "bb03378b-77b3-44c8-97eb-558868797b23"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:05:04 crc kubenswrapper[4918]: I0319 17:05:04.356745 4918 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:04 crc kubenswrapper[4918]: I0319 17:05:04.356769 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftmdv\" (UniqueName: \"kubernetes.io/projected/bb03378b-77b3-44c8-97eb-558868797b23-kube-api-access-ftmdv\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:04 crc kubenswrapper[4918]: I0319 17:05:04.372096 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-config" (OuterVolumeSpecName: "config") pod "bb03378b-77b3-44c8-97eb-558868797b23" (UID: "bb03378b-77b3-44c8-97eb-558868797b23"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:05:04 crc kubenswrapper[4918]: I0319 17:05:04.384515 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bb03378b-77b3-44c8-97eb-558868797b23" (UID: "bb03378b-77b3-44c8-97eb-558868797b23"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:05:04 crc kubenswrapper[4918]: I0319 17:05:04.388967 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bb03378b-77b3-44c8-97eb-558868797b23" (UID: "bb03378b-77b3-44c8-97eb-558868797b23"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:05:04 crc kubenswrapper[4918]: I0319 17:05:04.410032 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bb03378b-77b3-44c8-97eb-558868797b23" (UID: "bb03378b-77b3-44c8-97eb-558868797b23"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:05:04 crc kubenswrapper[4918]: I0319 17:05:04.458210 4918 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:04 crc kubenswrapper[4918]: I0319 17:05:04.458245 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-config\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:04 crc kubenswrapper[4918]: I0319 17:05:04.458256 4918 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:04 crc kubenswrapper[4918]: I0319 17:05:04.458264 4918 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb03378b-77b3-44c8-97eb-558868797b23-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:04 crc kubenswrapper[4918]: I0319 17:05:04.827130 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" event={"ID":"d431095e-2467-47c8-8288-ef25e31bed1e","Type":"ContainerStarted","Data":"207033af3203a5061663cc43d4d79935f8a3518a8f04a97412f03a438bbd3949"} Mar 19 17:05:04 crc kubenswrapper[4918]: I0319 17:05:04.832955 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" 
event={"ID":"bb03378b-77b3-44c8-97eb-558868797b23","Type":"ContainerDied","Data":"1a288324b81aa1645f5a582b075c14b40c894adf3cfec812d0fe0cd7f25170ed"} Mar 19 17:05:04 crc kubenswrapper[4918]: I0319 17:05:04.833004 4918 scope.go:117] "RemoveContainer" containerID="1416871aa3f03c85b78ec34a93f66e562966e8e95eb12cd7a99f7c5d4afb93dc" Mar 19 17:05:04 crc kubenswrapper[4918]: I0319 17:05:04.833064 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" Mar 19 17:05:04 crc kubenswrapper[4918]: I0319 17:05:04.878971 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-8xw6t"] Mar 19 17:05:04 crc kubenswrapper[4918]: I0319 17:05:04.884594 4918 scope.go:117] "RemoveContainer" containerID="a54debd2d584f5dd7a64dbe4333b64015d3eb45944209d5ab905d10610eb1ef0" Mar 19 17:05:04 crc kubenswrapper[4918]: I0319 17:05:04.892120 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-8xw6t"] Mar 19 17:05:05 crc kubenswrapper[4918]: I0319 17:05:05.741463 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 19 17:05:05 crc kubenswrapper[4918]: I0319 17:05:05.843196 4918 generic.go:334] "Generic (PLEG): container finished" podID="d431095e-2467-47c8-8288-ef25e31bed1e" containerID="9451ff94316e710d4c93e10c5762d3fcdc18ec489dbc482a23de13faddef2d52" exitCode=0 Mar 19 17:05:05 crc kubenswrapper[4918]: I0319 17:05:05.843279 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" event={"ID":"d431095e-2467-47c8-8288-ef25e31bed1e","Type":"ContainerDied","Data":"9451ff94316e710d4c93e10c5762d3fcdc18ec489dbc482a23de13faddef2d52"} Mar 19 17:05:06 crc kubenswrapper[4918]: I0319 17:05:06.597833 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb03378b-77b3-44c8-97eb-558868797b23" path="/var/lib/kubelet/pods/bb03378b-77b3-44c8-97eb-558868797b23/volumes" 
Mar 19 17:05:06 crc kubenswrapper[4918]: I0319 17:05:06.857268 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" event={"ID":"d431095e-2467-47c8-8288-ef25e31bed1e","Type":"ContainerStarted","Data":"393e0c6db82ed1378ac34248f7c48de21b3322083f5b9d7fe1d3686d9fa740fe"} Mar 19 17:05:06 crc kubenswrapper[4918]: I0319 17:05:06.858511 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" Mar 19 17:05:06 crc kubenswrapper[4918]: I0319 17:05:06.860429 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-cm55x" event={"ID":"79829597-ef66-4d6f-946f-adaa9ec3d227","Type":"ContainerStarted","Data":"94a1f4ca00cd7a79f0f012ec819ea8ce40dee0795880662a9c6f9f681e236859"} Mar 19 17:05:06 crc kubenswrapper[4918]: I0319 17:05:06.893329 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" podStartSLOduration=3.893305415 podStartE2EDuration="3.893305415s" podCreationTimestamp="2026-03-19 17:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:05:06.883910658 +0000 UTC m=+1519.006109896" watchObservedRunningTime="2026-03-19 17:05:06.893305415 +0000 UTC m=+1519.015504663" Mar 19 17:05:06 crc kubenswrapper[4918]: I0319 17:05:06.907362 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-cm55x" podStartSLOduration=2.167464146 podStartE2EDuration="44.907340998s" podCreationTimestamp="2026-03-19 17:04:22 +0000 UTC" firstStartedPulling="2026-03-19 17:04:22.998614107 +0000 UTC m=+1475.120813355" lastFinishedPulling="2026-03-19 17:05:05.738490959 +0000 UTC m=+1517.860690207" observedRunningTime="2026-03-19 17:05:06.906801584 +0000 UTC m=+1519.029000832" watchObservedRunningTime="2026-03-19 17:05:06.907340998 +0000 UTC m=+1519.029540246" Mar 
19 17:05:08 crc kubenswrapper[4918]: I0319 17:05:08.773852 4918 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5fd9b586ff-8xw6t" podUID="bb03378b-77b3-44c8-97eb-558868797b23" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.232:5353: i/o timeout" Mar 19 17:05:08 crc kubenswrapper[4918]: I0319 17:05:08.884099 4918 generic.go:334] "Generic (PLEG): container finished" podID="79829597-ef66-4d6f-946f-adaa9ec3d227" containerID="94a1f4ca00cd7a79f0f012ec819ea8ce40dee0795880662a9c6f9f681e236859" exitCode=0 Mar 19 17:05:08 crc kubenswrapper[4918]: I0319 17:05:08.884136 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-cm55x" event={"ID":"79829597-ef66-4d6f-946f-adaa9ec3d227","Type":"ContainerDied","Data":"94a1f4ca00cd7a79f0f012ec819ea8ce40dee0795880662a9c6f9f681e236859"} Mar 19 17:05:10 crc kubenswrapper[4918]: I0319 17:05:10.744098 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-cm55x" Mar 19 17:05:10 crc kubenswrapper[4918]: I0319 17:05:10.885996 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/79829597-ef66-4d6f-946f-adaa9ec3d227-certs\") pod \"79829597-ef66-4d6f-946f-adaa9ec3d227\" (UID: \"79829597-ef66-4d6f-946f-adaa9ec3d227\") " Mar 19 17:05:10 crc kubenswrapper[4918]: I0319 17:05:10.886040 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79829597-ef66-4d6f-946f-adaa9ec3d227-combined-ca-bundle\") pod \"79829597-ef66-4d6f-946f-adaa9ec3d227\" (UID: \"79829597-ef66-4d6f-946f-adaa9ec3d227\") " Mar 19 17:05:10 crc kubenswrapper[4918]: I0319 17:05:10.886176 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8fj4\" (UniqueName: 
\"kubernetes.io/projected/79829597-ef66-4d6f-946f-adaa9ec3d227-kube-api-access-x8fj4\") pod \"79829597-ef66-4d6f-946f-adaa9ec3d227\" (UID: \"79829597-ef66-4d6f-946f-adaa9ec3d227\") " Mar 19 17:05:10 crc kubenswrapper[4918]: I0319 17:05:10.886260 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79829597-ef66-4d6f-946f-adaa9ec3d227-scripts\") pod \"79829597-ef66-4d6f-946f-adaa9ec3d227\" (UID: \"79829597-ef66-4d6f-946f-adaa9ec3d227\") " Mar 19 17:05:10 crc kubenswrapper[4918]: I0319 17:05:10.886285 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79829597-ef66-4d6f-946f-adaa9ec3d227-config-data\") pod \"79829597-ef66-4d6f-946f-adaa9ec3d227\" (UID: \"79829597-ef66-4d6f-946f-adaa9ec3d227\") " Mar 19 17:05:10 crc kubenswrapper[4918]: I0319 17:05:10.891564 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79829597-ef66-4d6f-946f-adaa9ec3d227-scripts" (OuterVolumeSpecName: "scripts") pod "79829597-ef66-4d6f-946f-adaa9ec3d227" (UID: "79829597-ef66-4d6f-946f-adaa9ec3d227"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:05:10 crc kubenswrapper[4918]: I0319 17:05:10.892796 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79829597-ef66-4d6f-946f-adaa9ec3d227-kube-api-access-x8fj4" (OuterVolumeSpecName: "kube-api-access-x8fj4") pod "79829597-ef66-4d6f-946f-adaa9ec3d227" (UID: "79829597-ef66-4d6f-946f-adaa9ec3d227"). InnerVolumeSpecName "kube-api-access-x8fj4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:05:10 crc kubenswrapper[4918]: I0319 17:05:10.902469 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-cm55x" event={"ID":"79829597-ef66-4d6f-946f-adaa9ec3d227","Type":"ContainerDied","Data":"1a4ab0f85b6dee62eeb72c6ad024a5912ff65209b3d79553a824c4a95fdbe9d4"} Mar 19 17:05:10 crc kubenswrapper[4918]: I0319 17:05:10.902530 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a4ab0f85b6dee62eeb72c6ad024a5912ff65209b3d79553a824c4a95fdbe9d4" Mar 19 17:05:10 crc kubenswrapper[4918]: I0319 17:05:10.902594 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-cm55x" Mar 19 17:05:10 crc kubenswrapper[4918]: I0319 17:05:10.902839 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79829597-ef66-4d6f-946f-adaa9ec3d227-certs" (OuterVolumeSpecName: "certs") pod "79829597-ef66-4d6f-946f-adaa9ec3d227" (UID: "79829597-ef66-4d6f-946f-adaa9ec3d227"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:05:10 crc kubenswrapper[4918]: I0319 17:05:10.925634 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79829597-ef66-4d6f-946f-adaa9ec3d227-config-data" (OuterVolumeSpecName: "config-data") pod "79829597-ef66-4d6f-946f-adaa9ec3d227" (UID: "79829597-ef66-4d6f-946f-adaa9ec3d227"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:05:10 crc kubenswrapper[4918]: I0319 17:05:10.935412 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79829597-ef66-4d6f-946f-adaa9ec3d227-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79829597-ef66-4d6f-946f-adaa9ec3d227" (UID: "79829597-ef66-4d6f-946f-adaa9ec3d227"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:05:10 crc kubenswrapper[4918]: I0319 17:05:10.989183 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79829597-ef66-4d6f-946f-adaa9ec3d227-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:10 crc kubenswrapper[4918]: I0319 17:05:10.989217 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79829597-ef66-4d6f-946f-adaa9ec3d227-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:10 crc kubenswrapper[4918]: I0319 17:05:10.989228 4918 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/79829597-ef66-4d6f-946f-adaa9ec3d227-certs\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:10 crc kubenswrapper[4918]: I0319 17:05:10.989240 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79829597-ef66-4d6f-946f-adaa9ec3d227-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:10 crc kubenswrapper[4918]: I0319 17:05:10.989255 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8fj4\" (UniqueName: \"kubernetes.io/projected/79829597-ef66-4d6f-946f-adaa9ec3d227-kube-api-access-x8fj4\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:11 crc kubenswrapper[4918]: I0319 17:05:11.848378 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-x77xk"] Mar 19 17:05:11 crc kubenswrapper[4918]: I0319 17:05:11.860600 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-x77xk"] Mar 19 17:05:11 crc kubenswrapper[4918]: I0319 17:05:11.941237 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-864ls"] Mar 19 17:05:11 crc kubenswrapper[4918]: E0319 17:05:11.941893 4918 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="bb03378b-77b3-44c8-97eb-558868797b23" containerName="init" Mar 19 17:05:11 crc kubenswrapper[4918]: I0319 17:05:11.941916 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb03378b-77b3-44c8-97eb-558868797b23" containerName="init" Mar 19 17:05:11 crc kubenswrapper[4918]: E0319 17:05:11.941949 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb03378b-77b3-44c8-97eb-558868797b23" containerName="dnsmasq-dns" Mar 19 17:05:11 crc kubenswrapper[4918]: I0319 17:05:11.941956 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb03378b-77b3-44c8-97eb-558868797b23" containerName="dnsmasq-dns" Mar 19 17:05:11 crc kubenswrapper[4918]: E0319 17:05:11.941968 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79829597-ef66-4d6f-946f-adaa9ec3d227" containerName="cloudkitty-db-sync" Mar 19 17:05:11 crc kubenswrapper[4918]: I0319 17:05:11.941975 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="79829597-ef66-4d6f-946f-adaa9ec3d227" containerName="cloudkitty-db-sync" Mar 19 17:05:11 crc kubenswrapper[4918]: I0319 17:05:11.942175 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb03378b-77b3-44c8-97eb-558868797b23" containerName="dnsmasq-dns" Mar 19 17:05:11 crc kubenswrapper[4918]: I0319 17:05:11.942195 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="79829597-ef66-4d6f-946f-adaa9ec3d227" containerName="cloudkitty-db-sync" Mar 19 17:05:11 crc kubenswrapper[4918]: I0319 17:05:11.942951 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-864ls" Mar 19 17:05:11 crc kubenswrapper[4918]: I0319 17:05:11.944854 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 19 17:05:11 crc kubenswrapper[4918]: I0319 17:05:11.973054 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-864ls"] Mar 19 17:05:12 crc kubenswrapper[4918]: I0319 17:05:12.131272 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6642e7-ede8-4dd3-8248-648f77d558b3-combined-ca-bundle\") pod \"cloudkitty-storageinit-864ls\" (UID: \"fe6642e7-ede8-4dd3-8248-648f77d558b3\") " pod="openstack/cloudkitty-storageinit-864ls" Mar 19 17:05:12 crc kubenswrapper[4918]: I0319 17:05:12.131347 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbp59\" (UniqueName: \"kubernetes.io/projected/fe6642e7-ede8-4dd3-8248-648f77d558b3-kube-api-access-hbp59\") pod \"cloudkitty-storageinit-864ls\" (UID: \"fe6642e7-ede8-4dd3-8248-648f77d558b3\") " pod="openstack/cloudkitty-storageinit-864ls" Mar 19 17:05:12 crc kubenswrapper[4918]: I0319 17:05:12.131466 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6642e7-ede8-4dd3-8248-648f77d558b3-config-data\") pod \"cloudkitty-storageinit-864ls\" (UID: \"fe6642e7-ede8-4dd3-8248-648f77d558b3\") " pod="openstack/cloudkitty-storageinit-864ls" Mar 19 17:05:12 crc kubenswrapper[4918]: I0319 17:05:12.131490 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6642e7-ede8-4dd3-8248-648f77d558b3-scripts\") pod \"cloudkitty-storageinit-864ls\" (UID: \"fe6642e7-ede8-4dd3-8248-648f77d558b3\") " 
pod="openstack/cloudkitty-storageinit-864ls" Mar 19 17:05:12 crc kubenswrapper[4918]: I0319 17:05:12.131543 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fe6642e7-ede8-4dd3-8248-648f77d558b3-certs\") pod \"cloudkitty-storageinit-864ls\" (UID: \"fe6642e7-ede8-4dd3-8248-648f77d558b3\") " pod="openstack/cloudkitty-storageinit-864ls" Mar 19 17:05:12 crc kubenswrapper[4918]: I0319 17:05:12.234002 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6642e7-ede8-4dd3-8248-648f77d558b3-config-data\") pod \"cloudkitty-storageinit-864ls\" (UID: \"fe6642e7-ede8-4dd3-8248-648f77d558b3\") " pod="openstack/cloudkitty-storageinit-864ls" Mar 19 17:05:12 crc kubenswrapper[4918]: I0319 17:05:12.234058 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6642e7-ede8-4dd3-8248-648f77d558b3-scripts\") pod \"cloudkitty-storageinit-864ls\" (UID: \"fe6642e7-ede8-4dd3-8248-648f77d558b3\") " pod="openstack/cloudkitty-storageinit-864ls" Mar 19 17:05:12 crc kubenswrapper[4918]: I0319 17:05:12.234110 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fe6642e7-ede8-4dd3-8248-648f77d558b3-certs\") pod \"cloudkitty-storageinit-864ls\" (UID: \"fe6642e7-ede8-4dd3-8248-648f77d558b3\") " pod="openstack/cloudkitty-storageinit-864ls" Mar 19 17:05:12 crc kubenswrapper[4918]: I0319 17:05:12.234263 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6642e7-ede8-4dd3-8248-648f77d558b3-combined-ca-bundle\") pod \"cloudkitty-storageinit-864ls\" (UID: \"fe6642e7-ede8-4dd3-8248-648f77d558b3\") " pod="openstack/cloudkitty-storageinit-864ls" Mar 19 17:05:12 crc kubenswrapper[4918]: I0319 
17:05:12.234310 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbp59\" (UniqueName: \"kubernetes.io/projected/fe6642e7-ede8-4dd3-8248-648f77d558b3-kube-api-access-hbp59\") pod \"cloudkitty-storageinit-864ls\" (UID: \"fe6642e7-ede8-4dd3-8248-648f77d558b3\") " pod="openstack/cloudkitty-storageinit-864ls" Mar 19 17:05:12 crc kubenswrapper[4918]: I0319 17:05:12.242267 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6642e7-ede8-4dd3-8248-648f77d558b3-config-data\") pod \"cloudkitty-storageinit-864ls\" (UID: \"fe6642e7-ede8-4dd3-8248-648f77d558b3\") " pod="openstack/cloudkitty-storageinit-864ls" Mar 19 17:05:12 crc kubenswrapper[4918]: I0319 17:05:12.248303 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6642e7-ede8-4dd3-8248-648f77d558b3-combined-ca-bundle\") pod \"cloudkitty-storageinit-864ls\" (UID: \"fe6642e7-ede8-4dd3-8248-648f77d558b3\") " pod="openstack/cloudkitty-storageinit-864ls" Mar 19 17:05:12 crc kubenswrapper[4918]: I0319 17:05:12.249635 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6642e7-ede8-4dd3-8248-648f77d558b3-scripts\") pod \"cloudkitty-storageinit-864ls\" (UID: \"fe6642e7-ede8-4dd3-8248-648f77d558b3\") " pod="openstack/cloudkitty-storageinit-864ls" Mar 19 17:05:12 crc kubenswrapper[4918]: I0319 17:05:12.279058 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbp59\" (UniqueName: \"kubernetes.io/projected/fe6642e7-ede8-4dd3-8248-648f77d558b3-kube-api-access-hbp59\") pod \"cloudkitty-storageinit-864ls\" (UID: \"fe6642e7-ede8-4dd3-8248-648f77d558b3\") " pod="openstack/cloudkitty-storageinit-864ls" Mar 19 17:05:12 crc kubenswrapper[4918]: I0319 17:05:12.282958 4918 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fe6642e7-ede8-4dd3-8248-648f77d558b3-certs\") pod \"cloudkitty-storageinit-864ls\" (UID: \"fe6642e7-ede8-4dd3-8248-648f77d558b3\") " pod="openstack/cloudkitty-storageinit-864ls" Mar 19 17:05:12 crc kubenswrapper[4918]: I0319 17:05:12.572350 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-864ls" Mar 19 17:05:12 crc kubenswrapper[4918]: I0319 17:05:12.598961 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84e41394-829e-4305-a4cf-0e35a37839a7" path="/var/lib/kubelet/pods/84e41394-829e-4305-a4cf-0e35a37839a7/volumes" Mar 19 17:05:13 crc kubenswrapper[4918]: I0319 17:05:13.080562 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-864ls"] Mar 19 17:05:13 crc kubenswrapper[4918]: I0319 17:05:13.630640 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85f64749dc-ktkwd" Mar 19 17:05:13 crc kubenswrapper[4918]: I0319 17:05:13.711988 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-f2th5"] Mar 19 17:05:13 crc kubenswrapper[4918]: I0319 17:05:13.712229 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" podUID="7486913b-4d6b-4dde-804d-3525ac608497" containerName="dnsmasq-dns" containerID="cri-o://3651a39c41b2c8c62e212aebe04b488427a81062da2ef38daeeb117fcec56bc8" gracePeriod=10 Mar 19 17:05:13 crc kubenswrapper[4918]: I0319 17:05:13.945743 4918 generic.go:334] "Generic (PLEG): container finished" podID="7486913b-4d6b-4dde-804d-3525ac608497" containerID="3651a39c41b2c8c62e212aebe04b488427a81062da2ef38daeeb117fcec56bc8" exitCode=0 Mar 19 17:05:13 crc kubenswrapper[4918]: I0319 17:05:13.946110 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" 
event={"ID":"7486913b-4d6b-4dde-804d-3525ac608497","Type":"ContainerDied","Data":"3651a39c41b2c8c62e212aebe04b488427a81062da2ef38daeeb117fcec56bc8"} Mar 19 17:05:13 crc kubenswrapper[4918]: I0319 17:05:13.948272 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-864ls" event={"ID":"fe6642e7-ede8-4dd3-8248-648f77d558b3","Type":"ContainerStarted","Data":"27a6cf600dc0f973764c2f57988e9993a1b4528b4d219b9a33889a3f9b07afc3"} Mar 19 17:05:13 crc kubenswrapper[4918]: I0319 17:05:13.948300 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-864ls" event={"ID":"fe6642e7-ede8-4dd3-8248-648f77d558b3","Type":"ContainerStarted","Data":"990cba976ad40ed31c63033e47dbc0e927c5b9d77c604615599675ab7982800c"} Mar 19 17:05:13 crc kubenswrapper[4918]: I0319 17:05:13.980566 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-864ls" podStartSLOduration=2.980546308 podStartE2EDuration="2.980546308s" podCreationTimestamp="2026-03-19 17:05:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:05:13.967729878 +0000 UTC m=+1526.089929136" watchObservedRunningTime="2026-03-19 17:05:13.980546308 +0000 UTC m=+1526.102745566" Mar 19 17:05:14 crc kubenswrapper[4918]: I0319 17:05:14.712259 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" Mar 19 17:05:14 crc kubenswrapper[4918]: I0319 17:05:14.891298 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-ovsdbserver-sb\") pod \"7486913b-4d6b-4dde-804d-3525ac608497\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " Mar 19 17:05:14 crc kubenswrapper[4918]: I0319 17:05:14.891443 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-dns-swift-storage-0\") pod \"7486913b-4d6b-4dde-804d-3525ac608497\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " Mar 19 17:05:14 crc kubenswrapper[4918]: I0319 17:05:14.891500 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqjz4\" (UniqueName: \"kubernetes.io/projected/7486913b-4d6b-4dde-804d-3525ac608497-kube-api-access-pqjz4\") pod \"7486913b-4d6b-4dde-804d-3525ac608497\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " Mar 19 17:05:14 crc kubenswrapper[4918]: I0319 17:05:14.891549 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-ovsdbserver-nb\") pod \"7486913b-4d6b-4dde-804d-3525ac608497\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " Mar 19 17:05:14 crc kubenswrapper[4918]: I0319 17:05:14.891637 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-config\") pod \"7486913b-4d6b-4dde-804d-3525ac608497\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " Mar 19 17:05:14 crc kubenswrapper[4918]: I0319 17:05:14.891714 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-dns-svc\") pod \"7486913b-4d6b-4dde-804d-3525ac608497\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " Mar 19 17:05:14 crc kubenswrapper[4918]: I0319 17:05:14.891768 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-openstack-edpm-ipam\") pod \"7486913b-4d6b-4dde-804d-3525ac608497\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " Mar 19 17:05:14 crc kubenswrapper[4918]: I0319 17:05:14.938734 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7486913b-4d6b-4dde-804d-3525ac608497-kube-api-access-pqjz4" (OuterVolumeSpecName: "kube-api-access-pqjz4") pod "7486913b-4d6b-4dde-804d-3525ac608497" (UID: "7486913b-4d6b-4dde-804d-3525ac608497"). InnerVolumeSpecName "kube-api-access-pqjz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:05:14 crc kubenswrapper[4918]: I0319 17:05:14.994793 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqjz4\" (UniqueName: \"kubernetes.io/projected/7486913b-4d6b-4dde-804d-3525ac608497-kube-api-access-pqjz4\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:15 crc kubenswrapper[4918]: I0319 17:05:14.997932 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7486913b-4d6b-4dde-804d-3525ac608497" (UID: "7486913b-4d6b-4dde-804d-3525ac608497"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:05:15 crc kubenswrapper[4918]: I0319 17:05:15.023647 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" Mar 19 17:05:15 crc kubenswrapper[4918]: I0319 17:05:15.023956 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-f2th5" event={"ID":"7486913b-4d6b-4dde-804d-3525ac608497","Type":"ContainerDied","Data":"b87f7a89125188229b0311582fb536f0b51105a75e59cb7d1b09ea76b6fb7a53"} Mar 19 17:05:15 crc kubenswrapper[4918]: I0319 17:05:15.024030 4918 scope.go:117] "RemoveContainer" containerID="3651a39c41b2c8c62e212aebe04b488427a81062da2ef38daeeb117fcec56bc8" Mar 19 17:05:15 crc kubenswrapper[4918]: I0319 17:05:15.055692 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-config" (OuterVolumeSpecName: "config") pod "7486913b-4d6b-4dde-804d-3525ac608497" (UID: "7486913b-4d6b-4dde-804d-3525ac608497"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:05:15 crc kubenswrapper[4918]: I0319 17:05:15.060944 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7486913b-4d6b-4dde-804d-3525ac608497" (UID: "7486913b-4d6b-4dde-804d-3525ac608497"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:05:15 crc kubenswrapper[4918]: I0319 17:05:15.068047 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7486913b-4d6b-4dde-804d-3525ac608497" (UID: "7486913b-4d6b-4dde-804d-3525ac608497"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:05:15 crc kubenswrapper[4918]: E0319 17:05:15.081279 4918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-dns-swift-storage-0 podName:7486913b-4d6b-4dde-804d-3525ac608497 nodeName:}" failed. No retries permitted until 2026-03-19 17:05:15.581248165 +0000 UTC m=+1527.703447413 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "dns-swift-storage-0" (UniqueName: "kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-dns-swift-storage-0") pod "7486913b-4d6b-4dde-804d-3525ac608497" (UID: "7486913b-4d6b-4dde-804d-3525ac608497") : error deleting /var/lib/kubelet/pods/7486913b-4d6b-4dde-804d-3525ac608497/volume-subpaths: remove /var/lib/kubelet/pods/7486913b-4d6b-4dde-804d-3525ac608497/volume-subpaths: no such file or directory Mar 19 17:05:15 crc kubenswrapper[4918]: I0319 17:05:15.081450 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "7486913b-4d6b-4dde-804d-3525ac608497" (UID: "7486913b-4d6b-4dde-804d-3525ac608497"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:05:15 crc kubenswrapper[4918]: I0319 17:05:15.097205 4918 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:15 crc kubenswrapper[4918]: I0319 17:05:15.097242 4918 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:15 crc kubenswrapper[4918]: I0319 17:05:15.097256 4918 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:15 crc kubenswrapper[4918]: I0319 17:05:15.097267 4918 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:15 crc kubenswrapper[4918]: I0319 17:05:15.097277 4918 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-config\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:15 crc kubenswrapper[4918]: I0319 17:05:15.125459 4918 scope.go:117] "RemoveContainer" containerID="65fa08bc2e1272564937199067d76a8b221f320c1fb84bcd2c542128f815c9b7" Mar 19 17:05:15 crc kubenswrapper[4918]: I0319 17:05:15.608098 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-dns-swift-storage-0\") pod \"7486913b-4d6b-4dde-804d-3525ac608497\" (UID: \"7486913b-4d6b-4dde-804d-3525ac608497\") " Mar 19 17:05:15 crc kubenswrapper[4918]: I0319 17:05:15.608709 4918 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7486913b-4d6b-4dde-804d-3525ac608497" (UID: "7486913b-4d6b-4dde-804d-3525ac608497"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:05:15 crc kubenswrapper[4918]: I0319 17:05:15.655589 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-f2th5"] Mar 19 17:05:15 crc kubenswrapper[4918]: I0319 17:05:15.675567 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-f2th5"] Mar 19 17:05:15 crc kubenswrapper[4918]: I0319 17:05:15.710962 4918 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7486913b-4d6b-4dde-804d-3525ac608497-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:16 crc kubenswrapper[4918]: I0319 17:05:16.036088 4918 generic.go:334] "Generic (PLEG): container finished" podID="fe6642e7-ede8-4dd3-8248-648f77d558b3" containerID="27a6cf600dc0f973764c2f57988e9993a1b4528b4d219b9a33889a3f9b07afc3" exitCode=0 Mar 19 17:05:16 crc kubenswrapper[4918]: I0319 17:05:16.036130 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-864ls" event={"ID":"fe6642e7-ede8-4dd3-8248-648f77d558b3","Type":"ContainerDied","Data":"27a6cf600dc0f973764c2f57988e9993a1b4528b4d219b9a33889a3f9b07afc3"} Mar 19 17:05:16 crc kubenswrapper[4918]: I0319 17:05:16.597403 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7486913b-4d6b-4dde-804d-3525ac608497" path="/var/lib/kubelet/pods/7486913b-4d6b-4dde-804d-3525ac608497/volumes" Mar 19 17:05:17 crc kubenswrapper[4918]: I0319 17:05:17.896246 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-864ls" Mar 19 17:05:18 crc kubenswrapper[4918]: I0319 17:05:18.083968 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-864ls" event={"ID":"fe6642e7-ede8-4dd3-8248-648f77d558b3","Type":"ContainerDied","Data":"990cba976ad40ed31c63033e47dbc0e927c5b9d77c604615599675ab7982800c"} Mar 19 17:05:18 crc kubenswrapper[4918]: I0319 17:05:18.084048 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="990cba976ad40ed31c63033e47dbc0e927c5b9d77c604615599675ab7982800c" Mar 19 17:05:18 crc kubenswrapper[4918]: I0319 17:05:18.084120 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-864ls" Mar 19 17:05:18 crc kubenswrapper[4918]: I0319 17:05:18.085636 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6642e7-ede8-4dd3-8248-648f77d558b3-config-data\") pod \"fe6642e7-ede8-4dd3-8248-648f77d558b3\" (UID: \"fe6642e7-ede8-4dd3-8248-648f77d558b3\") " Mar 19 17:05:18 crc kubenswrapper[4918]: I0319 17:05:18.085748 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6642e7-ede8-4dd3-8248-648f77d558b3-scripts\") pod \"fe6642e7-ede8-4dd3-8248-648f77d558b3\" (UID: \"fe6642e7-ede8-4dd3-8248-648f77d558b3\") " Mar 19 17:05:18 crc kubenswrapper[4918]: I0319 17:05:18.085813 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6642e7-ede8-4dd3-8248-648f77d558b3-combined-ca-bundle\") pod \"fe6642e7-ede8-4dd3-8248-648f77d558b3\" (UID: \"fe6642e7-ede8-4dd3-8248-648f77d558b3\") " Mar 19 17:05:18 crc kubenswrapper[4918]: I0319 17:05:18.085884 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-hbp59\" (UniqueName: \"kubernetes.io/projected/fe6642e7-ede8-4dd3-8248-648f77d558b3-kube-api-access-hbp59\") pod \"fe6642e7-ede8-4dd3-8248-648f77d558b3\" (UID: \"fe6642e7-ede8-4dd3-8248-648f77d558b3\") " Mar 19 17:05:18 crc kubenswrapper[4918]: I0319 17:05:18.086009 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fe6642e7-ede8-4dd3-8248-648f77d558b3-certs\") pod \"fe6642e7-ede8-4dd3-8248-648f77d558b3\" (UID: \"fe6642e7-ede8-4dd3-8248-648f77d558b3\") " Mar 19 17:05:18 crc kubenswrapper[4918]: I0319 17:05:18.114604 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6642e7-ede8-4dd3-8248-648f77d558b3-scripts" (OuterVolumeSpecName: "scripts") pod "fe6642e7-ede8-4dd3-8248-648f77d558b3" (UID: "fe6642e7-ede8-4dd3-8248-648f77d558b3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:05:18 crc kubenswrapper[4918]: I0319 17:05:18.127961 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe6642e7-ede8-4dd3-8248-648f77d558b3-kube-api-access-hbp59" (OuterVolumeSpecName: "kube-api-access-hbp59") pod "fe6642e7-ede8-4dd3-8248-648f77d558b3" (UID: "fe6642e7-ede8-4dd3-8248-648f77d558b3"). InnerVolumeSpecName "kube-api-access-hbp59". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:05:18 crc kubenswrapper[4918]: I0319 17:05:18.129394 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe6642e7-ede8-4dd3-8248-648f77d558b3-certs" (OuterVolumeSpecName: "certs") pod "fe6642e7-ede8-4dd3-8248-648f77d558b3" (UID: "fe6642e7-ede8-4dd3-8248-648f77d558b3"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:05:18 crc kubenswrapper[4918]: I0319 17:05:18.180545 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 17:05:18 crc kubenswrapper[4918]: I0319 17:05:18.180816 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="9bb4650a-c965-49da-b13a-9e998a165c45" containerName="cloudkitty-proc" containerID="cri-o://fd476c48bb042b4ff2ca594afe44e17b85b66ad715939f92688ab8a3bc41e97b" gracePeriod=30 Mar 19 17:05:18 crc kubenswrapper[4918]: I0319 17:05:18.186854 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6642e7-ede8-4dd3-8248-648f77d558b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe6642e7-ede8-4dd3-8248-648f77d558b3" (UID: "fe6642e7-ede8-4dd3-8248-648f77d558b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:05:18 crc kubenswrapper[4918]: I0319 17:05:18.188412 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6642e7-ede8-4dd3-8248-648f77d558b3-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:18 crc kubenswrapper[4918]: I0319 17:05:18.188439 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6642e7-ede8-4dd3-8248-648f77d558b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:18 crc kubenswrapper[4918]: I0319 17:05:18.188454 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbp59\" (UniqueName: \"kubernetes.io/projected/fe6642e7-ede8-4dd3-8248-648f77d558b3-kube-api-access-hbp59\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:18 crc kubenswrapper[4918]: I0319 17:05:18.188735 4918 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: 
\"kubernetes.io/projected/fe6642e7-ede8-4dd3-8248-648f77d558b3-certs\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:18 crc kubenswrapper[4918]: I0319 17:05:18.215803 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 17:05:18 crc kubenswrapper[4918]: I0319 17:05:18.216827 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f" containerName="cloudkitty-api-log" containerID="cri-o://aaeca65a1801fb6477b370ed8514a87bf222d85460413a9055d3fa78c502c3d2" gracePeriod=30 Mar 19 17:05:18 crc kubenswrapper[4918]: I0319 17:05:18.216980 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6642e7-ede8-4dd3-8248-648f77d558b3-config-data" (OuterVolumeSpecName: "config-data") pod "fe6642e7-ede8-4dd3-8248-648f77d558b3" (UID: "fe6642e7-ede8-4dd3-8248-648f77d558b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:05:18 crc kubenswrapper[4918]: I0319 17:05:18.217145 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f" containerName="cloudkitty-api" containerID="cri-o://3fe92d05181a2da3a3dab7268975c01c3cfb08b09628954b39e89f2e0631ba2a" gracePeriod=30 Mar 19 17:05:18 crc kubenswrapper[4918]: I0319 17:05:18.290749 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6642e7-ede8-4dd3-8248-648f77d558b3-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:18 crc kubenswrapper[4918]: I0319 17:05:18.647083 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 19 17:05:19 crc kubenswrapper[4918]: I0319 17:05:19.095925 4918 generic.go:334] "Generic (PLEG): container finished" podID="b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f" 
containerID="aaeca65a1801fb6477b370ed8514a87bf222d85460413a9055d3fa78c502c3d2" exitCode=143 Mar 19 17:05:19 crc kubenswrapper[4918]: I0319 17:05:19.095991 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f","Type":"ContainerDied","Data":"aaeca65a1801fb6477b370ed8514a87bf222d85460413a9055d3fa78c502c3d2"} Mar 19 17:05:19 crc kubenswrapper[4918]: I0319 17:05:19.097692 4918 generic.go:334] "Generic (PLEG): container finished" podID="9bb4650a-c965-49da-b13a-9e998a165c45" containerID="fd476c48bb042b4ff2ca594afe44e17b85b66ad715939f92688ab8a3bc41e97b" exitCode=0 Mar 19 17:05:19 crc kubenswrapper[4918]: I0319 17:05:19.097730 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"9bb4650a-c965-49da-b13a-9e998a165c45","Type":"ContainerDied","Data":"fd476c48bb042b4ff2ca594afe44e17b85b66ad715939f92688ab8a3bc41e97b"} Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.183193 4918 generic.go:334] "Generic (PLEG): container finished" podID="b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f" containerID="3fe92d05181a2da3a3dab7268975c01c3cfb08b09628954b39e89f2e0631ba2a" exitCode=0 Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.183357 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f","Type":"ContainerDied","Data":"3fe92d05181a2da3a3dab7268975c01c3cfb08b09628954b39e89f2e0631ba2a"} Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.200141 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0189fc1-60b5-4734-a4b2-aa1714795f50","Type":"ContainerStarted","Data":"608d321a5d34fc17ef711080a091c90a72ab7d6d3d53445270c369d47b223558"} Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.222761 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.20973684 
podStartE2EDuration="51.222744358s" podCreationTimestamp="2026-03-19 17:04:29 +0000 UTC" firstStartedPulling="2026-03-19 17:04:30.167338087 +0000 UTC m=+1482.289537335" lastFinishedPulling="2026-03-19 17:05:19.180345595 +0000 UTC m=+1531.302544853" observedRunningTime="2026-03-19 17:05:20.221387641 +0000 UTC m=+1532.343586879" watchObservedRunningTime="2026-03-19 17:05:20.222744358 +0000 UTC m=+1532.344943606" Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.464338 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.537208 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-config-data-custom\") pod \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.537302 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-certs\") pod \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.537394 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-config-data\") pod \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.537565 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-combined-ca-bundle\") pod \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " Mar 19 
17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.537638 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-scripts\") pod \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.537669 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-public-tls-certs\") pod \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.537699 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-internal-tls-certs\") pod \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.537728 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-logs\") pod \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.537768 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9jjh\" (UniqueName: \"kubernetes.io/projected/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-kube-api-access-x9jjh\") pod \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\" (UID: \"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f\") " Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.543959 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-logs" (OuterVolumeSpecName: "logs") pod 
"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f" (UID: "b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.544184 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-kube-api-access-x9jjh" (OuterVolumeSpecName: "kube-api-access-x9jjh") pod "b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f" (UID: "b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f"). InnerVolumeSpecName "kube-api-access-x9jjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.551926 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-certs" (OuterVolumeSpecName: "certs") pod "b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f" (UID: "b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.551982 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-scripts" (OuterVolumeSpecName: "scripts") pod "b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f" (UID: "b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.575978 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-config-data" (OuterVolumeSpecName: "config-data") pod "b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f" (UID: "b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.577983 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f" (UID: "b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.594695 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f" (UID: "b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.640238 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.640273 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.640284 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.640292 4918 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-logs\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:20 crc 
kubenswrapper[4918]: I0319 17:05:20.640301 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9jjh\" (UniqueName: \"kubernetes.io/projected/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-kube-api-access-x9jjh\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.640310 4918 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.640317 4918 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-certs\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.648080 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f" (UID: "b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.667514 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f" (UID: "b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.742189 4918 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.742477 4918 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.889323 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.945910 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/9bb4650a-c965-49da-b13a-9e998a165c45-certs\") pod \"9bb4650a-c965-49da-b13a-9e998a165c45\" (UID: \"9bb4650a-c965-49da-b13a-9e998a165c45\") " Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.945973 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bb4650a-c965-49da-b13a-9e998a165c45-combined-ca-bundle\") pod \"9bb4650a-c965-49da-b13a-9e998a165c45\" (UID: \"9bb4650a-c965-49da-b13a-9e998a165c45\") " Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.946058 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dc5x\" (UniqueName: \"kubernetes.io/projected/9bb4650a-c965-49da-b13a-9e998a165c45-kube-api-access-2dc5x\") pod \"9bb4650a-c965-49da-b13a-9e998a165c45\" (UID: \"9bb4650a-c965-49da-b13a-9e998a165c45\") " Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.946109 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/9bb4650a-c965-49da-b13a-9e998a165c45-scripts\") pod \"9bb4650a-c965-49da-b13a-9e998a165c45\" (UID: \"9bb4650a-c965-49da-b13a-9e998a165c45\") " Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.946187 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9bb4650a-c965-49da-b13a-9e998a165c45-config-data-custom\") pod \"9bb4650a-c965-49da-b13a-9e998a165c45\" (UID: \"9bb4650a-c965-49da-b13a-9e998a165c45\") " Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.946239 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bb4650a-c965-49da-b13a-9e998a165c45-config-data\") pod \"9bb4650a-c965-49da-b13a-9e998a165c45\" (UID: \"9bb4650a-c965-49da-b13a-9e998a165c45\") " Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.949851 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bb4650a-c965-49da-b13a-9e998a165c45-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9bb4650a-c965-49da-b13a-9e998a165c45" (UID: "9bb4650a-c965-49da-b13a-9e998a165c45"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.952540 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bb4650a-c965-49da-b13a-9e998a165c45-scripts" (OuterVolumeSpecName: "scripts") pod "9bb4650a-c965-49da-b13a-9e998a165c45" (UID: "9bb4650a-c965-49da-b13a-9e998a165c45"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.955955 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bb4650a-c965-49da-b13a-9e998a165c45-certs" (OuterVolumeSpecName: "certs") pod "9bb4650a-c965-49da-b13a-9e998a165c45" (UID: "9bb4650a-c965-49da-b13a-9e998a165c45"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.969113 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bb4650a-c965-49da-b13a-9e998a165c45-kube-api-access-2dc5x" (OuterVolumeSpecName: "kube-api-access-2dc5x") pod "9bb4650a-c965-49da-b13a-9e998a165c45" (UID: "9bb4650a-c965-49da-b13a-9e998a165c45"). InnerVolumeSpecName "kube-api-access-2dc5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.983455 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bb4650a-c965-49da-b13a-9e998a165c45-config-data" (OuterVolumeSpecName: "config-data") pod "9bb4650a-c965-49da-b13a-9e998a165c45" (UID: "9bb4650a-c965-49da-b13a-9e998a165c45"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:05:20 crc kubenswrapper[4918]: I0319 17:05:20.985365 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bb4650a-c965-49da-b13a-9e998a165c45-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9bb4650a-c965-49da-b13a-9e998a165c45" (UID: "9bb4650a-c965-49da-b13a-9e998a165c45"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.052464 4918 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/9bb4650a-c965-49da-b13a-9e998a165c45-certs\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.052538 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bb4650a-c965-49da-b13a-9e998a165c45-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.052556 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dc5x\" (UniqueName: \"kubernetes.io/projected/9bb4650a-c965-49da-b13a-9e998a165c45-kube-api-access-2dc5x\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.052568 4918 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bb4650a-c965-49da-b13a-9e998a165c45-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.052579 4918 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9bb4650a-c965-49da-b13a-9e998a165c45-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.052589 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bb4650a-c965-49da-b13a-9e998a165c45-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.217271 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f","Type":"ContainerDied","Data":"e5e4aff2621423e71599e294893a770435002224f4a12d3fa4f54e0a577eb764"} Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 
17:05:21.217332 4918 scope.go:117] "RemoveContainer" containerID="3fe92d05181a2da3a3dab7268975c01c3cfb08b09628954b39e89f2e0631ba2a" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.217490 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.221751 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.221964 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"9bb4650a-c965-49da-b13a-9e998a165c45","Type":"ContainerDied","Data":"5c11d4e8950988eb8aa750281f37e0c83d87888f716aaae835ce79d511697c64"} Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.272987 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.287824 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.308588 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.322106 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.335008 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 17:05:21 crc kubenswrapper[4918]: E0319 17:05:21.335491 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7486913b-4d6b-4dde-804d-3525ac608497" containerName="dnsmasq-dns" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.335510 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="7486913b-4d6b-4dde-804d-3525ac608497" containerName="dnsmasq-dns" Mar 19 17:05:21 crc kubenswrapper[4918]: E0319 
17:05:21.335538 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7486913b-4d6b-4dde-804d-3525ac608497" containerName="init" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.335547 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="7486913b-4d6b-4dde-804d-3525ac608497" containerName="init" Mar 19 17:05:21 crc kubenswrapper[4918]: E0319 17:05:21.335566 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f" containerName="cloudkitty-api" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.335575 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f" containerName="cloudkitty-api" Mar 19 17:05:21 crc kubenswrapper[4918]: E0319 17:05:21.335589 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f" containerName="cloudkitty-api-log" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.335595 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f" containerName="cloudkitty-api-log" Mar 19 17:05:21 crc kubenswrapper[4918]: E0319 17:05:21.335611 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe6642e7-ede8-4dd3-8248-648f77d558b3" containerName="cloudkitty-storageinit" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.335617 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe6642e7-ede8-4dd3-8248-648f77d558b3" containerName="cloudkitty-storageinit" Mar 19 17:05:21 crc kubenswrapper[4918]: E0319 17:05:21.335637 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bb4650a-c965-49da-b13a-9e998a165c45" containerName="cloudkitty-proc" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.335642 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bb4650a-c965-49da-b13a-9e998a165c45" containerName="cloudkitty-proc" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.335859 
4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe6642e7-ede8-4dd3-8248-648f77d558b3" containerName="cloudkitty-storageinit" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.335878 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f" containerName="cloudkitty-api" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.335892 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f" containerName="cloudkitty-api-log" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.335905 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bb4650a-c965-49da-b13a-9e998a165c45" containerName="cloudkitty-proc" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.335921 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="7486913b-4d6b-4dde-804d-3525ac608497" containerName="dnsmasq-dns" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.336968 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.339050 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.339250 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.339396 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-kxbs6" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.343327 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.344231 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.344246 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.345206 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.389932 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.417700 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.429737 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.432381 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.440574 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.464542 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52bf6e94-a8b5-406a-a69e-39b883fa847d-scripts\") pod \"cloudkitty-api-0\" (UID: \"52bf6e94-a8b5-406a-a69e-39b883fa847d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.464628 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/52bf6e94-a8b5-406a-a69e-39b883fa847d-certs\") pod \"cloudkitty-api-0\" (UID: \"52bf6e94-a8b5-406a-a69e-39b883fa847d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.464749 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52bf6e94-a8b5-406a-a69e-39b883fa847d-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"52bf6e94-a8b5-406a-a69e-39b883fa847d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.464790 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bf6e94-a8b5-406a-a69e-39b883fa847d-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"52bf6e94-a8b5-406a-a69e-39b883fa847d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.464866 4918 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52bf6e94-a8b5-406a-a69e-39b883fa847d-logs\") pod \"cloudkitty-api-0\" (UID: \"52bf6e94-a8b5-406a-a69e-39b883fa847d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.464894 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52bf6e94-a8b5-406a-a69e-39b883fa847d-config-data\") pod \"cloudkitty-api-0\" (UID: \"52bf6e94-a8b5-406a-a69e-39b883fa847d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.464990 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52bf6e94-a8b5-406a-a69e-39b883fa847d-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"52bf6e94-a8b5-406a-a69e-39b883fa847d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.465051 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-557sp\" (UniqueName: \"kubernetes.io/projected/52bf6e94-a8b5-406a-a69e-39b883fa847d-kube-api-access-557sp\") pod \"cloudkitty-api-0\" (UID: \"52bf6e94-a8b5-406a-a69e-39b883fa847d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.465183 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bf6e94-a8b5-406a-a69e-39b883fa847d-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"52bf6e94-a8b5-406a-a69e-39b883fa847d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.525672 4918 scope.go:117] "RemoveContainer" containerID="aaeca65a1801fb6477b370ed8514a87bf222d85460413a9055d3fa78c502c3d2" Mar 
19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.567835 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52bf6e94-a8b5-406a-a69e-39b883fa847d-logs\") pod \"cloudkitty-api-0\" (UID: \"52bf6e94-a8b5-406a-a69e-39b883fa847d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.567882 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbxh9\" (UniqueName: \"kubernetes.io/projected/bdb2b1c6-814d-466f-a51f-07b6440ac7ea-kube-api-access-mbxh9\") pod \"cloudkitty-proc-0\" (UID: \"bdb2b1c6-814d-466f-a51f-07b6440ac7ea\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.567918 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52bf6e94-a8b5-406a-a69e-39b883fa847d-config-data\") pod \"cloudkitty-api-0\" (UID: \"52bf6e94-a8b5-406a-a69e-39b883fa847d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.567975 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdb2b1c6-814d-466f-a51f-07b6440ac7ea-config-data\") pod \"cloudkitty-proc-0\" (UID: \"bdb2b1c6-814d-466f-a51f-07b6440ac7ea\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.567999 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/bdb2b1c6-814d-466f-a51f-07b6440ac7ea-certs\") pod \"cloudkitty-proc-0\" (UID: \"bdb2b1c6-814d-466f-a51f-07b6440ac7ea\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.568075 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52bf6e94-a8b5-406a-a69e-39b883fa847d-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"52bf6e94-a8b5-406a-a69e-39b883fa847d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.568105 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-557sp\" (UniqueName: \"kubernetes.io/projected/52bf6e94-a8b5-406a-a69e-39b883fa847d-kube-api-access-557sp\") pod \"cloudkitty-api-0\" (UID: \"52bf6e94-a8b5-406a-a69e-39b883fa847d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.568133 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bf6e94-a8b5-406a-a69e-39b883fa847d-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"52bf6e94-a8b5-406a-a69e-39b883fa847d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.568198 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52bf6e94-a8b5-406a-a69e-39b883fa847d-scripts\") pod \"cloudkitty-api-0\" (UID: \"52bf6e94-a8b5-406a-a69e-39b883fa847d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.568223 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdb2b1c6-814d-466f-a51f-07b6440ac7ea-scripts\") pod \"cloudkitty-proc-0\" (UID: \"bdb2b1c6-814d-466f-a51f-07b6440ac7ea\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.568238 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/52bf6e94-a8b5-406a-a69e-39b883fa847d-certs\") pod \"cloudkitty-api-0\" (UID: \"52bf6e94-a8b5-406a-a69e-39b883fa847d\") 
" pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.568265 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb2b1c6-814d-466f-a51f-07b6440ac7ea-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"bdb2b1c6-814d-466f-a51f-07b6440ac7ea\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.568305 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bdb2b1c6-814d-466f-a51f-07b6440ac7ea-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"bdb2b1c6-814d-466f-a51f-07b6440ac7ea\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.568323 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52bf6e94-a8b5-406a-a69e-39b883fa847d-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"52bf6e94-a8b5-406a-a69e-39b883fa847d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.568337 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bf6e94-a8b5-406a-a69e-39b883fa847d-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"52bf6e94-a8b5-406a-a69e-39b883fa847d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.568420 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52bf6e94-a8b5-406a-a69e-39b883fa847d-logs\") pod \"cloudkitty-api-0\" (UID: \"52bf6e94-a8b5-406a-a69e-39b883fa847d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.572904 4918 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52bf6e94-a8b5-406a-a69e-39b883fa847d-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"52bf6e94-a8b5-406a-a69e-39b883fa847d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.573160 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bf6e94-a8b5-406a-a69e-39b883fa847d-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"52bf6e94-a8b5-406a-a69e-39b883fa847d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.573354 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52bf6e94-a8b5-406a-a69e-39b883fa847d-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"52bf6e94-a8b5-406a-a69e-39b883fa847d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.573693 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bf6e94-a8b5-406a-a69e-39b883fa847d-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"52bf6e94-a8b5-406a-a69e-39b883fa847d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.573741 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52bf6e94-a8b5-406a-a69e-39b883fa847d-config-data\") pod \"cloudkitty-api-0\" (UID: \"52bf6e94-a8b5-406a-a69e-39b883fa847d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.574565 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52bf6e94-a8b5-406a-a69e-39b883fa847d-scripts\") pod \"cloudkitty-api-0\" (UID: 
\"52bf6e94-a8b5-406a-a69e-39b883fa847d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.575448 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/52bf6e94-a8b5-406a-a69e-39b883fa847d-certs\") pod \"cloudkitty-api-0\" (UID: \"52bf6e94-a8b5-406a-a69e-39b883fa847d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.591395 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-557sp\" (UniqueName: \"kubernetes.io/projected/52bf6e94-a8b5-406a-a69e-39b883fa847d-kube-api-access-557sp\") pod \"cloudkitty-api-0\" (UID: \"52bf6e94-a8b5-406a-a69e-39b883fa847d\") " pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.633023 4918 scope.go:117] "RemoveContainer" containerID="fd476c48bb042b4ff2ca594afe44e17b85b66ad715939f92688ab8a3bc41e97b" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.661143 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.670045 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdb2b1c6-814d-466f-a51f-07b6440ac7ea-scripts\") pod \"cloudkitty-proc-0\" (UID: \"bdb2b1c6-814d-466f-a51f-07b6440ac7ea\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.670103 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb2b1c6-814d-466f-a51f-07b6440ac7ea-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"bdb2b1c6-814d-466f-a51f-07b6440ac7ea\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.670204 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bdb2b1c6-814d-466f-a51f-07b6440ac7ea-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"bdb2b1c6-814d-466f-a51f-07b6440ac7ea\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.670288 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbxh9\" (UniqueName: \"kubernetes.io/projected/bdb2b1c6-814d-466f-a51f-07b6440ac7ea-kube-api-access-mbxh9\") pod \"cloudkitty-proc-0\" (UID: \"bdb2b1c6-814d-466f-a51f-07b6440ac7ea\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.670328 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdb2b1c6-814d-466f-a51f-07b6440ac7ea-config-data\") pod \"cloudkitty-proc-0\" (UID: \"bdb2b1c6-814d-466f-a51f-07b6440ac7ea\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.670358 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/bdb2b1c6-814d-466f-a51f-07b6440ac7ea-certs\") pod \"cloudkitty-proc-0\" (UID: \"bdb2b1c6-814d-466f-a51f-07b6440ac7ea\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.673098 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb2b1c6-814d-466f-a51f-07b6440ac7ea-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"bdb2b1c6-814d-466f-a51f-07b6440ac7ea\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.674181 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bdb2b1c6-814d-466f-a51f-07b6440ac7ea-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"bdb2b1c6-814d-466f-a51f-07b6440ac7ea\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.674464 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdb2b1c6-814d-466f-a51f-07b6440ac7ea-config-data\") pod \"cloudkitty-proc-0\" (UID: \"bdb2b1c6-814d-466f-a51f-07b6440ac7ea\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.674940 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdb2b1c6-814d-466f-a51f-07b6440ac7ea-scripts\") pod \"cloudkitty-proc-0\" (UID: \"bdb2b1c6-814d-466f-a51f-07b6440ac7ea\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.677201 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/bdb2b1c6-814d-466f-a51f-07b6440ac7ea-certs\") pod \"cloudkitty-proc-0\" (UID: \"bdb2b1c6-814d-466f-a51f-07b6440ac7ea\") " 
pod="openstack/cloudkitty-proc-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.687653 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbxh9\" (UniqueName: \"kubernetes.io/projected/bdb2b1c6-814d-466f-a51f-07b6440ac7ea-kube-api-access-mbxh9\") pod \"cloudkitty-proc-0\" (UID: \"bdb2b1c6-814d-466f-a51f-07b6440ac7ea\") " pod="openstack/cloudkitty-proc-0" Mar 19 17:05:21 crc kubenswrapper[4918]: I0319 17:05:21.826144 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Mar 19 17:05:22 crc kubenswrapper[4918]: I0319 17:05:22.207190 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Mar 19 17:05:22 crc kubenswrapper[4918]: W0319 17:05:22.210058 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52bf6e94_a8b5_406a_a69e_39b883fa847d.slice/crio-62f0a64a52ccff391ec8d0c718d94b668fb0445f1472c28d3c9ff38358c2bc84 WatchSource:0}: Error finding container 62f0a64a52ccff391ec8d0c718d94b668fb0445f1472c28d3c9ff38358c2bc84: Status 404 returned error can't find the container with id 62f0a64a52ccff391ec8d0c718d94b668fb0445f1472c28d3c9ff38358c2bc84 Mar 19 17:05:22 crc kubenswrapper[4918]: I0319 17:05:22.257280 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"52bf6e94-a8b5-406a-a69e-39b883fa847d","Type":"ContainerStarted","Data":"62f0a64a52ccff391ec8d0c718d94b668fb0445f1472c28d3c9ff38358c2bc84"} Mar 19 17:05:22 crc kubenswrapper[4918]: W0319 17:05:22.578088 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdb2b1c6_814d_466f_a51f_07b6440ac7ea.slice/crio-744c76d8171ac8bda268b8c8c8ea3d7d00f01e5874f4483b1c183fb81b93aca7 WatchSource:0}: Error finding container 744c76d8171ac8bda268b8c8c8ea3d7d00f01e5874f4483b1c183fb81b93aca7: Status 
404 returned error can't find the container with id 744c76d8171ac8bda268b8c8c8ea3d7d00f01e5874f4483b1c183fb81b93aca7 Mar 19 17:05:22 crc kubenswrapper[4918]: I0319 17:05:22.605777 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bb4650a-c965-49da-b13a-9e998a165c45" path="/var/lib/kubelet/pods/9bb4650a-c965-49da-b13a-9e998a165c45/volumes" Mar 19 17:05:22 crc kubenswrapper[4918]: I0319 17:05:22.606798 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f" path="/var/lib/kubelet/pods/b04dda7a-c3b1-4e81-8f2e-95884a0fcc4f/volumes" Mar 19 17:05:22 crc kubenswrapper[4918]: I0319 17:05:22.607578 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Mar 19 17:05:23 crc kubenswrapper[4918]: I0319 17:05:23.274937 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"bdb2b1c6-814d-466f-a51f-07b6440ac7ea","Type":"ContainerStarted","Data":"744c76d8171ac8bda268b8c8c8ea3d7d00f01e5874f4483b1c183fb81b93aca7"} Mar 19 17:05:23 crc kubenswrapper[4918]: I0319 17:05:23.277616 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"52bf6e94-a8b5-406a-a69e-39b883fa847d","Type":"ContainerStarted","Data":"7645e99eaaa61ce083337f8dda66805e4e36ea1b0c9817fe1acea6672f587790"} Mar 19 17:05:23 crc kubenswrapper[4918]: I0319 17:05:23.277723 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"52bf6e94-a8b5-406a-a69e-39b883fa847d","Type":"ContainerStarted","Data":"43c3cd2b53dee5280791c6c99a7bde96250b1c0f6e607d1c18be69fc6456f235"} Mar 19 17:05:23 crc kubenswrapper[4918]: I0319 17:05:23.277960 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Mar 19 17:05:23 crc kubenswrapper[4918]: I0319 17:05:23.318470 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cloudkitty-api-0" podStartSLOduration=2.31845131 podStartE2EDuration="2.31845131s" podCreationTimestamp="2026-03-19 17:05:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:05:23.307150251 +0000 UTC m=+1535.429349499" watchObservedRunningTime="2026-03-19 17:05:23.31845131 +0000 UTC m=+1535.440650558" Mar 19 17:05:25 crc kubenswrapper[4918]: I0319 17:05:25.997513 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg"] Mar 19 17:05:25 crc kubenswrapper[4918]: I0319 17:05:25.999316 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg" Mar 19 17:05:26 crc kubenswrapper[4918]: I0319 17:05:26.001218 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 17:05:26 crc kubenswrapper[4918]: I0319 17:05:26.001314 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4jldg" Mar 19 17:05:26 crc kubenswrapper[4918]: I0319 17:05:26.001981 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 17:05:26 crc kubenswrapper[4918]: I0319 17:05:26.002806 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 17:05:26 crc kubenswrapper[4918]: I0319 17:05:26.044600 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg"] Mar 19 17:05:26 crc kubenswrapper[4918]: I0319 17:05:26.082248 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/0a3d3e76-de48-44f2-a34e-021196a21f5b-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg\" (UID: \"0a3d3e76-de48-44f2-a34e-021196a21f5b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg" Mar 19 17:05:26 crc kubenswrapper[4918]: I0319 17:05:26.082491 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a3d3e76-de48-44f2-a34e-021196a21f5b-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg\" (UID: \"0a3d3e76-de48-44f2-a34e-021196a21f5b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg" Mar 19 17:05:26 crc kubenswrapper[4918]: I0319 17:05:26.082832 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a3d3e76-de48-44f2-a34e-021196a21f5b-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg\" (UID: \"0a3d3e76-de48-44f2-a34e-021196a21f5b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg" Mar 19 17:05:26 crc kubenswrapper[4918]: I0319 17:05:26.083091 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk2x8\" (UniqueName: \"kubernetes.io/projected/0a3d3e76-de48-44f2-a34e-021196a21f5b-kube-api-access-hk2x8\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg\" (UID: \"0a3d3e76-de48-44f2-a34e-021196a21f5b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg" Mar 19 17:05:26 crc kubenswrapper[4918]: I0319 17:05:26.185418 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a3d3e76-de48-44f2-a34e-021196a21f5b-repo-setup-combined-ca-bundle\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg\" (UID: \"0a3d3e76-de48-44f2-a34e-021196a21f5b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg" Mar 19 17:05:26 crc kubenswrapper[4918]: I0319 17:05:26.185511 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a3d3e76-de48-44f2-a34e-021196a21f5b-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg\" (UID: \"0a3d3e76-de48-44f2-a34e-021196a21f5b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg" Mar 19 17:05:26 crc kubenswrapper[4918]: I0319 17:05:26.185670 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk2x8\" (UniqueName: \"kubernetes.io/projected/0a3d3e76-de48-44f2-a34e-021196a21f5b-kube-api-access-hk2x8\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg\" (UID: \"0a3d3e76-de48-44f2-a34e-021196a21f5b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg" Mar 19 17:05:26 crc kubenswrapper[4918]: I0319 17:05:26.186838 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a3d3e76-de48-44f2-a34e-021196a21f5b-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg\" (UID: \"0a3d3e76-de48-44f2-a34e-021196a21f5b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg" Mar 19 17:05:26 crc kubenswrapper[4918]: I0319 17:05:26.198759 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a3d3e76-de48-44f2-a34e-021196a21f5b-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg\" (UID: \"0a3d3e76-de48-44f2-a34e-021196a21f5b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg" Mar 19 17:05:26 crc 
kubenswrapper[4918]: I0319 17:05:26.202030 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a3d3e76-de48-44f2-a34e-021196a21f5b-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg\" (UID: \"0a3d3e76-de48-44f2-a34e-021196a21f5b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg" Mar 19 17:05:26 crc kubenswrapper[4918]: I0319 17:05:26.202212 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a3d3e76-de48-44f2-a34e-021196a21f5b-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg\" (UID: \"0a3d3e76-de48-44f2-a34e-021196a21f5b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg" Mar 19 17:05:26 crc kubenswrapper[4918]: I0319 17:05:26.211331 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk2x8\" (UniqueName: \"kubernetes.io/projected/0a3d3e76-de48-44f2-a34e-021196a21f5b-kube-api-access-hk2x8\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg\" (UID: \"0a3d3e76-de48-44f2-a34e-021196a21f5b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg" Mar 19 17:05:26 crc kubenswrapper[4918]: I0319 17:05:26.320378 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg" Mar 19 17:05:27 crc kubenswrapper[4918]: I0319 17:05:27.320435 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"bdb2b1c6-814d-466f-a51f-07b6440ac7ea","Type":"ContainerStarted","Data":"fcdf8f94fa1399c3a89efb1a40dba4cd9d7b2f9b6ec1277c727cc4b49428b306"} Mar 19 17:05:27 crc kubenswrapper[4918]: I0319 17:05:27.337839 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.896793637 podStartE2EDuration="6.337813062s" podCreationTimestamp="2026-03-19 17:05:21 +0000 UTC" firstStartedPulling="2026-03-19 17:05:22.579775492 +0000 UTC m=+1534.701974750" lastFinishedPulling="2026-03-19 17:05:26.020765556 +0000 UTC m=+1538.142994175" observedRunningTime="2026-03-19 17:05:27.335359635 +0000 UTC m=+1539.457558923" watchObservedRunningTime="2026-03-19 17:05:27.337813062 +0000 UTC m=+1539.460012330" Mar 19 17:05:28 crc kubenswrapper[4918]: I0319 17:05:28.211872 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:05:28 crc kubenswrapper[4918]: I0319 17:05:28.212274 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:05:28 crc kubenswrapper[4918]: I0319 17:05:28.260918 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg"] Mar 19 17:05:28 crc kubenswrapper[4918]: I0319 
17:05:28.363226 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg" event={"ID":"0a3d3e76-de48-44f2-a34e-021196a21f5b","Type":"ContainerStarted","Data":"96e4622e917ce7dad025d8d5fbab04e4fd805bbefebaef1daf1ee4e373aad50c"} Mar 19 17:05:29 crc kubenswrapper[4918]: I0319 17:05:29.375017 4918 generic.go:334] "Generic (PLEG): container finished" podID="5cf3eb1c-8f65-4460-8283-dcdbe5d51e50" containerID="8a76235a4c8dc28d930bbde142db1c543124492c040b5c741caddbd756dcf162" exitCode=0 Mar 19 17:05:29 crc kubenswrapper[4918]: I0319 17:05:29.375100 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50","Type":"ContainerDied","Data":"8a76235a4c8dc28d930bbde142db1c543124492c040b5c741caddbd756dcf162"} Mar 19 17:05:29 crc kubenswrapper[4918]: I0319 17:05:29.377752 4918 generic.go:334] "Generic (PLEG): container finished" podID="025d722c-5115-4aae-bebd-3942f7da690d" containerID="9076b62cf6edd0836c937d2b17c97df9da431a990feced676f64b6936e0e24c6" exitCode=0 Mar 19 17:05:29 crc kubenswrapper[4918]: I0319 17:05:29.377851 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"025d722c-5115-4aae-bebd-3942f7da690d","Type":"ContainerDied","Data":"9076b62cf6edd0836c937d2b17c97df9da431a990feced676f64b6936e0e24c6"} Mar 19 17:05:30 crc kubenswrapper[4918]: I0319 17:05:30.409149 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5cf3eb1c-8f65-4460-8283-dcdbe5d51e50","Type":"ContainerStarted","Data":"96bee637bb3068545d14ac5a387d0af435ae6b035bdd40f364bcc56b063c9ade"} Mar 19 17:05:30 crc kubenswrapper[4918]: I0319 17:05:30.411344 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"025d722c-5115-4aae-bebd-3942f7da690d","Type":"ContainerStarted","Data":"0219ef96f8aadec7604ee955b63a766b222530fdc811a3608ffcb46117d38ee5"} Mar 19 17:05:30 crc kubenswrapper[4918]: I0319 17:05:30.412090 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 19 17:05:30 crc kubenswrapper[4918]: I0319 17:05:30.412662 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 19 17:05:30 crc kubenswrapper[4918]: I0319 17:05:30.442979 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=45.442960051 podStartE2EDuration="45.442960051s" podCreationTimestamp="2026-03-19 17:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:05:30.441044359 +0000 UTC m=+1542.563243607" watchObservedRunningTime="2026-03-19 17:05:30.442960051 +0000 UTC m=+1542.565159299" Mar 19 17:05:30 crc kubenswrapper[4918]: I0319 17:05:30.481412 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=45.481395471 podStartE2EDuration="45.481395471s" podCreationTimestamp="2026-03-19 17:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:05:30.477046993 +0000 UTC m=+1542.599246251" watchObservedRunningTime="2026-03-19 17:05:30.481395471 +0000 UTC m=+1542.603594719" Mar 19 17:05:40 crc kubenswrapper[4918]: I0319 17:05:40.535371 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg" event={"ID":"0a3d3e76-de48-44f2-a34e-021196a21f5b","Type":"ContainerStarted","Data":"95e65d5f4eebe7c951deb8dd97bf62538ea0f5070d7572aedf59c04a7a907d6b"} Mar 19 17:05:40 crc kubenswrapper[4918]: 
I0319 17:05:40.568802 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg" podStartSLOduration=4.394677177 podStartE2EDuration="15.568774876s" podCreationTimestamp="2026-03-19 17:05:25 +0000 UTC" firstStartedPulling="2026-03-19 17:05:28.270601582 +0000 UTC m=+1540.392800830" lastFinishedPulling="2026-03-19 17:05:39.444699281 +0000 UTC m=+1551.566898529" observedRunningTime="2026-03-19 17:05:40.551911555 +0000 UTC m=+1552.674110803" watchObservedRunningTime="2026-03-19 17:05:40.568774876 +0000 UTC m=+1552.690974154" Mar 19 17:05:46 crc kubenswrapper[4918]: I0319 17:05:46.196936 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 19 17:05:46 crc kubenswrapper[4918]: I0319 17:05:46.218737 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 19 17:05:50 crc kubenswrapper[4918]: I0319 17:05:50.633821 4918 generic.go:334] "Generic (PLEG): container finished" podID="0a3d3e76-de48-44f2-a34e-021196a21f5b" containerID="95e65d5f4eebe7c951deb8dd97bf62538ea0f5070d7572aedf59c04a7a907d6b" exitCode=0 Mar 19 17:05:50 crc kubenswrapper[4918]: I0319 17:05:50.633913 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg" event={"ID":"0a3d3e76-de48-44f2-a34e-021196a21f5b","Type":"ContainerDied","Data":"95e65d5f4eebe7c951deb8dd97bf62538ea0f5070d7572aedf59c04a7a907d6b"} Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.277100 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg" Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.451018 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk2x8\" (UniqueName: \"kubernetes.io/projected/0a3d3e76-de48-44f2-a34e-021196a21f5b-kube-api-access-hk2x8\") pod \"0a3d3e76-de48-44f2-a34e-021196a21f5b\" (UID: \"0a3d3e76-de48-44f2-a34e-021196a21f5b\") " Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.451448 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a3d3e76-de48-44f2-a34e-021196a21f5b-repo-setup-combined-ca-bundle\") pod \"0a3d3e76-de48-44f2-a34e-021196a21f5b\" (UID: \"0a3d3e76-de48-44f2-a34e-021196a21f5b\") " Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.451508 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a3d3e76-de48-44f2-a34e-021196a21f5b-ssh-key-openstack-edpm-ipam\") pod \"0a3d3e76-de48-44f2-a34e-021196a21f5b\" (UID: \"0a3d3e76-de48-44f2-a34e-021196a21f5b\") " Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.451711 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a3d3e76-de48-44f2-a34e-021196a21f5b-inventory\") pod \"0a3d3e76-de48-44f2-a34e-021196a21f5b\" (UID: \"0a3d3e76-de48-44f2-a34e-021196a21f5b\") " Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.457427 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a3d3e76-de48-44f2-a34e-021196a21f5b-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "0a3d3e76-de48-44f2-a34e-021196a21f5b" (UID: "0a3d3e76-de48-44f2-a34e-021196a21f5b"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.457905 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a3d3e76-de48-44f2-a34e-021196a21f5b-kube-api-access-hk2x8" (OuterVolumeSpecName: "kube-api-access-hk2x8") pod "0a3d3e76-de48-44f2-a34e-021196a21f5b" (UID: "0a3d3e76-de48-44f2-a34e-021196a21f5b"). InnerVolumeSpecName "kube-api-access-hk2x8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.497553 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a3d3e76-de48-44f2-a34e-021196a21f5b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0a3d3e76-de48-44f2-a34e-021196a21f5b" (UID: "0a3d3e76-de48-44f2-a34e-021196a21f5b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.502698 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a3d3e76-de48-44f2-a34e-021196a21f5b-inventory" (OuterVolumeSpecName: "inventory") pod "0a3d3e76-de48-44f2-a34e-021196a21f5b" (UID: "0a3d3e76-de48-44f2-a34e-021196a21f5b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.553731 4918 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a3d3e76-de48-44f2-a34e-021196a21f5b-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.553789 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hk2x8\" (UniqueName: \"kubernetes.io/projected/0a3d3e76-de48-44f2-a34e-021196a21f5b-kube-api-access-hk2x8\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.553800 4918 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a3d3e76-de48-44f2-a34e-021196a21f5b-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.553809 4918 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a3d3e76-de48-44f2-a34e-021196a21f5b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.653666 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg" event={"ID":"0a3d3e76-de48-44f2-a34e-021196a21f5b","Type":"ContainerDied","Data":"96e4622e917ce7dad025d8d5fbab04e4fd805bbefebaef1daf1ee4e373aad50c"} Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.653717 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96e4622e917ce7dad025d8d5fbab04e4fd805bbefebaef1daf1ee4e373aad50c" Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.653750 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg" Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.747539 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-bkjnx"] Mar 19 17:05:52 crc kubenswrapper[4918]: E0319 17:05:52.748103 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a3d3e76-de48-44f2-a34e-021196a21f5b" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.748126 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a3d3e76-de48-44f2-a34e-021196a21f5b" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.748403 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a3d3e76-de48-44f2-a34e-021196a21f5b" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.749338 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bkjnx" Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.751295 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.751502 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4jldg" Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.752078 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.752795 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.762499 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-bkjnx"] Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.860340 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p978\" (UniqueName: \"kubernetes.io/projected/d97e8bcc-4a1e-45e6-88ef-3013c02d37a7-kube-api-access-7p978\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-bkjnx\" (UID: \"d97e8bcc-4a1e-45e6-88ef-3013c02d37a7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bkjnx" Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.860386 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d97e8bcc-4a1e-45e6-88ef-3013c02d37a7-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-bkjnx\" (UID: \"d97e8bcc-4a1e-45e6-88ef-3013c02d37a7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bkjnx" Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.860475 4918 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d97e8bcc-4a1e-45e6-88ef-3013c02d37a7-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-bkjnx\" (UID: \"d97e8bcc-4a1e-45e6-88ef-3013c02d37a7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bkjnx" Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.963635 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p978\" (UniqueName: \"kubernetes.io/projected/d97e8bcc-4a1e-45e6-88ef-3013c02d37a7-kube-api-access-7p978\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-bkjnx\" (UID: \"d97e8bcc-4a1e-45e6-88ef-3013c02d37a7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bkjnx" Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.963927 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d97e8bcc-4a1e-45e6-88ef-3013c02d37a7-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-bkjnx\" (UID: \"d97e8bcc-4a1e-45e6-88ef-3013c02d37a7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bkjnx" Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.964138 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d97e8bcc-4a1e-45e6-88ef-3013c02d37a7-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-bkjnx\" (UID: \"d97e8bcc-4a1e-45e6-88ef-3013c02d37a7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bkjnx" Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.968512 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d97e8bcc-4a1e-45e6-88ef-3013c02d37a7-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-bkjnx\" (UID: \"d97e8bcc-4a1e-45e6-88ef-3013c02d37a7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bkjnx" Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.968703 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d97e8bcc-4a1e-45e6-88ef-3013c02d37a7-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-bkjnx\" (UID: \"d97e8bcc-4a1e-45e6-88ef-3013c02d37a7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bkjnx" Mar 19 17:05:52 crc kubenswrapper[4918]: I0319 17:05:52.980204 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p978\" (UniqueName: \"kubernetes.io/projected/d97e8bcc-4a1e-45e6-88ef-3013c02d37a7-kube-api-access-7p978\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-bkjnx\" (UID: \"d97e8bcc-4a1e-45e6-88ef-3013c02d37a7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bkjnx" Mar 19 17:05:53 crc kubenswrapper[4918]: I0319 17:05:53.105060 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bkjnx" Mar 19 17:05:53 crc kubenswrapper[4918]: I0319 17:05:53.664763 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-bkjnx"] Mar 19 17:05:54 crc kubenswrapper[4918]: I0319 17:05:54.674701 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bkjnx" event={"ID":"d97e8bcc-4a1e-45e6-88ef-3013c02d37a7","Type":"ContainerStarted","Data":"c50b5a8579a0e704323e2ba10692a52c526c924b03911e3190b8d66553a69a56"} Mar 19 17:05:54 crc kubenswrapper[4918]: I0319 17:05:54.675084 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bkjnx" event={"ID":"d97e8bcc-4a1e-45e6-88ef-3013c02d37a7","Type":"ContainerStarted","Data":"2806d766613cdb5e8bebba08976f3736b56247cc068d73287a3d10ce7273acc1"} Mar 19 17:05:54 crc kubenswrapper[4918]: I0319 17:05:54.695842 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bkjnx" podStartSLOduration=2.225990494 podStartE2EDuration="2.695818168s" podCreationTimestamp="2026-03-19 17:05:52 +0000 UTC" firstStartedPulling="2026-03-19 17:05:53.663627003 +0000 UTC m=+1565.785826251" lastFinishedPulling="2026-03-19 17:05:54.133454677 +0000 UTC m=+1566.255653925" observedRunningTime="2026-03-19 17:05:54.690506603 +0000 UTC m=+1566.812705851" watchObservedRunningTime="2026-03-19 17:05:54.695818168 +0000 UTC m=+1566.818017426" Mar 19 17:05:57 crc kubenswrapper[4918]: I0319 17:05:57.707123 4918 generic.go:334] "Generic (PLEG): container finished" podID="d97e8bcc-4a1e-45e6-88ef-3013c02d37a7" containerID="c50b5a8579a0e704323e2ba10692a52c526c924b03911e3190b8d66553a69a56" exitCode=0 Mar 19 17:05:57 crc kubenswrapper[4918]: I0319 17:05:57.708000 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bkjnx" event={"ID":"d97e8bcc-4a1e-45e6-88ef-3013c02d37a7","Type":"ContainerDied","Data":"c50b5a8579a0e704323e2ba10692a52c526c924b03911e3190b8d66553a69a56"} Mar 19 17:05:58 crc kubenswrapper[4918]: I0319 17:05:58.211833 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:05:58 crc kubenswrapper[4918]: I0319 17:05:58.212203 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:05:58 crc kubenswrapper[4918]: I0319 17:05:58.212255 4918 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 17:05:58 crc kubenswrapper[4918]: I0319 17:05:58.213104 4918 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440"} pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 17:05:58 crc kubenswrapper[4918]: I0319 17:05:58.213166 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" containerID="cri-o://de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440" gracePeriod=600 Mar 
19 17:05:58 crc kubenswrapper[4918]: E0319 17:05:58.350662 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:05:58 crc kubenswrapper[4918]: I0319 17:05:58.703964 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0" Mar 19 17:05:58 crc kubenswrapper[4918]: I0319 17:05:58.755029 4918 generic.go:334] "Generic (PLEG): container finished" podID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerID="de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440" exitCode=0 Mar 19 17:05:58 crc kubenswrapper[4918]: I0319 17:05:58.755447 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerDied","Data":"de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440"} Mar 19 17:05:58 crc kubenswrapper[4918]: I0319 17:05:58.755482 4918 scope.go:117] "RemoveContainer" containerID="c06f60b9e3990852ac6fc7b59da3fe3cda8e2a2ae81b8e586f6da8fc956569f8" Mar 19 17:05:58 crc kubenswrapper[4918]: I0319 17:05:58.756145 4918 scope.go:117] "RemoveContainer" containerID="de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440" Mar 19 17:05:58 crc kubenswrapper[4918]: E0319 17:05:58.756387 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:05:59 crc kubenswrapper[4918]: I0319 17:05:59.568790 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bkjnx" Mar 19 17:05:59 crc kubenswrapper[4918]: I0319 17:05:59.705838 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d97e8bcc-4a1e-45e6-88ef-3013c02d37a7-ssh-key-openstack-edpm-ipam\") pod \"d97e8bcc-4a1e-45e6-88ef-3013c02d37a7\" (UID: \"d97e8bcc-4a1e-45e6-88ef-3013c02d37a7\") " Mar 19 17:05:59 crc kubenswrapper[4918]: I0319 17:05:59.705996 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d97e8bcc-4a1e-45e6-88ef-3013c02d37a7-inventory\") pod \"d97e8bcc-4a1e-45e6-88ef-3013c02d37a7\" (UID: \"d97e8bcc-4a1e-45e6-88ef-3013c02d37a7\") " Mar 19 17:05:59 crc kubenswrapper[4918]: I0319 17:05:59.706123 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7p978\" (UniqueName: \"kubernetes.io/projected/d97e8bcc-4a1e-45e6-88ef-3013c02d37a7-kube-api-access-7p978\") pod \"d97e8bcc-4a1e-45e6-88ef-3013c02d37a7\" (UID: \"d97e8bcc-4a1e-45e6-88ef-3013c02d37a7\") " Mar 19 17:05:59 crc kubenswrapper[4918]: I0319 17:05:59.711371 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d97e8bcc-4a1e-45e6-88ef-3013c02d37a7-kube-api-access-7p978" (OuterVolumeSpecName: "kube-api-access-7p978") pod "d97e8bcc-4a1e-45e6-88ef-3013c02d37a7" (UID: "d97e8bcc-4a1e-45e6-88ef-3013c02d37a7"). InnerVolumeSpecName "kube-api-access-7p978". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:05:59 crc kubenswrapper[4918]: I0319 17:05:59.743922 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d97e8bcc-4a1e-45e6-88ef-3013c02d37a7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d97e8bcc-4a1e-45e6-88ef-3013c02d37a7" (UID: "d97e8bcc-4a1e-45e6-88ef-3013c02d37a7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:05:59 crc kubenswrapper[4918]: I0319 17:05:59.751856 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d97e8bcc-4a1e-45e6-88ef-3013c02d37a7-inventory" (OuterVolumeSpecName: "inventory") pod "d97e8bcc-4a1e-45e6-88ef-3013c02d37a7" (UID: "d97e8bcc-4a1e-45e6-88ef-3013c02d37a7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:05:59 crc kubenswrapper[4918]: I0319 17:05:59.788735 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bkjnx" event={"ID":"d97e8bcc-4a1e-45e6-88ef-3013c02d37a7","Type":"ContainerDied","Data":"2806d766613cdb5e8bebba08976f3736b56247cc068d73287a3d10ce7273acc1"} Mar 19 17:05:59 crc kubenswrapper[4918]: I0319 17:05:59.788775 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2806d766613cdb5e8bebba08976f3736b56247cc068d73287a3d10ce7273acc1" Mar 19 17:05:59 crc kubenswrapper[4918]: I0319 17:05:59.788849 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bkjnx" Mar 19 17:05:59 crc kubenswrapper[4918]: I0319 17:05:59.809410 4918 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d97e8bcc-4a1e-45e6-88ef-3013c02d37a7-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:59 crc kubenswrapper[4918]: I0319 17:05:59.809436 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7p978\" (UniqueName: \"kubernetes.io/projected/d97e8bcc-4a1e-45e6-88ef-3013c02d37a7-kube-api-access-7p978\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:59 crc kubenswrapper[4918]: I0319 17:05:59.809470 4918 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d97e8bcc-4a1e-45e6-88ef-3013c02d37a7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 17:05:59 crc kubenswrapper[4918]: I0319 17:05:59.847930 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm"] Mar 19 17:05:59 crc kubenswrapper[4918]: E0319 17:05:59.848353 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d97e8bcc-4a1e-45e6-88ef-3013c02d37a7" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 19 17:05:59 crc kubenswrapper[4918]: I0319 17:05:59.848370 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="d97e8bcc-4a1e-45e6-88ef-3013c02d37a7" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 19 17:05:59 crc kubenswrapper[4918]: I0319 17:05:59.848588 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="d97e8bcc-4a1e-45e6-88ef-3013c02d37a7" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 19 17:05:59 crc kubenswrapper[4918]: I0319 17:05:59.849403 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm" Mar 19 17:05:59 crc kubenswrapper[4918]: I0319 17:05:59.850958 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 17:05:59 crc kubenswrapper[4918]: I0319 17:05:59.851727 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 17:05:59 crc kubenswrapper[4918]: I0319 17:05:59.851764 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 17:05:59 crc kubenswrapper[4918]: I0319 17:05:59.851924 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4jldg" Mar 19 17:05:59 crc kubenswrapper[4918]: I0319 17:05:59.864357 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm"] Mar 19 17:05:59 crc kubenswrapper[4918]: I0319 17:05:59.910839 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63b801eb-41a1-4d19-933b-e098bedd9e93-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm\" (UID: \"63b801eb-41a1-4d19-933b-e098bedd9e93\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm" Mar 19 17:05:59 crc kubenswrapper[4918]: I0319 17:05:59.911010 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63b801eb-41a1-4d19-933b-e098bedd9e93-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm\" (UID: \"63b801eb-41a1-4d19-933b-e098bedd9e93\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm" Mar 19 17:05:59 crc kubenswrapper[4918]: 
I0319 17:05:59.911038 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63b801eb-41a1-4d19-933b-e098bedd9e93-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm\" (UID: \"63b801eb-41a1-4d19-933b-e098bedd9e93\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm" Mar 19 17:05:59 crc kubenswrapper[4918]: I0319 17:05:59.911173 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crc4b\" (UniqueName: \"kubernetes.io/projected/63b801eb-41a1-4d19-933b-e098bedd9e93-kube-api-access-crc4b\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm\" (UID: \"63b801eb-41a1-4d19-933b-e098bedd9e93\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm" Mar 19 17:06:00 crc kubenswrapper[4918]: I0319 17:06:00.013060 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63b801eb-41a1-4d19-933b-e098bedd9e93-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm\" (UID: \"63b801eb-41a1-4d19-933b-e098bedd9e93\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm" Mar 19 17:06:00 crc kubenswrapper[4918]: I0319 17:06:00.013109 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63b801eb-41a1-4d19-933b-e098bedd9e93-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm\" (UID: \"63b801eb-41a1-4d19-933b-e098bedd9e93\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm" Mar 19 17:06:00 crc kubenswrapper[4918]: I0319 17:06:00.013154 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crc4b\" (UniqueName: 
\"kubernetes.io/projected/63b801eb-41a1-4d19-933b-e098bedd9e93-kube-api-access-crc4b\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm\" (UID: \"63b801eb-41a1-4d19-933b-e098bedd9e93\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm" Mar 19 17:06:00 crc kubenswrapper[4918]: I0319 17:06:00.013235 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63b801eb-41a1-4d19-933b-e098bedd9e93-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm\" (UID: \"63b801eb-41a1-4d19-933b-e098bedd9e93\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm" Mar 19 17:06:00 crc kubenswrapper[4918]: I0319 17:06:00.018463 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63b801eb-41a1-4d19-933b-e098bedd9e93-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm\" (UID: \"63b801eb-41a1-4d19-933b-e098bedd9e93\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm" Mar 19 17:06:00 crc kubenswrapper[4918]: I0319 17:06:00.022166 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63b801eb-41a1-4d19-933b-e098bedd9e93-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm\" (UID: \"63b801eb-41a1-4d19-933b-e098bedd9e93\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm" Mar 19 17:06:00 crc kubenswrapper[4918]: I0319 17:06:00.022747 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63b801eb-41a1-4d19-933b-e098bedd9e93-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm\" (UID: \"63b801eb-41a1-4d19-933b-e098bedd9e93\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm" Mar 19 17:06:00 crc kubenswrapper[4918]: I0319 17:06:00.035615 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crc4b\" (UniqueName: \"kubernetes.io/projected/63b801eb-41a1-4d19-933b-e098bedd9e93-kube-api-access-crc4b\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm\" (UID: \"63b801eb-41a1-4d19-933b-e098bedd9e93\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm" Mar 19 17:06:00 crc kubenswrapper[4918]: I0319 17:06:00.129900 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565666-rjj5f"] Mar 19 17:06:00 crc kubenswrapper[4918]: I0319 17:06:00.131724 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565666-rjj5f" Mar 19 17:06:00 crc kubenswrapper[4918]: I0319 17:06:00.134808 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:06:00 crc kubenswrapper[4918]: I0319 17:06:00.135064 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:06:00 crc kubenswrapper[4918]: I0319 17:06:00.135204 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:06:00 crc kubenswrapper[4918]: I0319 17:06:00.142287 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565666-rjj5f"] Mar 19 17:06:00 crc kubenswrapper[4918]: I0319 17:06:00.178534 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm" Mar 19 17:06:00 crc kubenswrapper[4918]: I0319 17:06:00.216301 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6nlz\" (UniqueName: \"kubernetes.io/projected/e8654396-60ef-431c-94f0-6db3b3f225d5-kube-api-access-w6nlz\") pod \"auto-csr-approver-29565666-rjj5f\" (UID: \"e8654396-60ef-431c-94f0-6db3b3f225d5\") " pod="openshift-infra/auto-csr-approver-29565666-rjj5f" Mar 19 17:06:00 crc kubenswrapper[4918]: I0319 17:06:00.318283 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6nlz\" (UniqueName: \"kubernetes.io/projected/e8654396-60ef-431c-94f0-6db3b3f225d5-kube-api-access-w6nlz\") pod \"auto-csr-approver-29565666-rjj5f\" (UID: \"e8654396-60ef-431c-94f0-6db3b3f225d5\") " pod="openshift-infra/auto-csr-approver-29565666-rjj5f" Mar 19 17:06:00 crc kubenswrapper[4918]: I0319 17:06:00.339298 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6nlz\" (UniqueName: \"kubernetes.io/projected/e8654396-60ef-431c-94f0-6db3b3f225d5-kube-api-access-w6nlz\") pod \"auto-csr-approver-29565666-rjj5f\" (UID: \"e8654396-60ef-431c-94f0-6db3b3f225d5\") " pod="openshift-infra/auto-csr-approver-29565666-rjj5f" Mar 19 17:06:00 crc kubenswrapper[4918]: I0319 17:06:00.457074 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565666-rjj5f" Mar 19 17:06:00 crc kubenswrapper[4918]: I0319 17:06:00.827126 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm"] Mar 19 17:06:00 crc kubenswrapper[4918]: I0319 17:06:00.839653 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm" event={"ID":"63b801eb-41a1-4d19-933b-e098bedd9e93","Type":"ContainerStarted","Data":"59481f8a21b973cfee1adb145bcb23d7da5869311a96afb727317193dfeef259"} Mar 19 17:06:00 crc kubenswrapper[4918]: W0319 17:06:00.943669 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8654396_60ef_431c_94f0_6db3b3f225d5.slice/crio-a5bc05ffe27a377fcc116f5f557f3f45e2eaf678307c8c3a9eee7ed88d112ed9 WatchSource:0}: Error finding container a5bc05ffe27a377fcc116f5f557f3f45e2eaf678307c8c3a9eee7ed88d112ed9: Status 404 returned error can't find the container with id a5bc05ffe27a377fcc116f5f557f3f45e2eaf678307c8c3a9eee7ed88d112ed9 Mar 19 17:06:00 crc kubenswrapper[4918]: I0319 17:06:00.944113 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565666-rjj5f"] Mar 19 17:06:01 crc kubenswrapper[4918]: I0319 17:06:01.470568 4918 scope.go:117] "RemoveContainer" containerID="35b2acf61ba384f78d1234655cb72c1add3bcb1e328781ff1605f3ca7df67942" Mar 19 17:06:01 crc kubenswrapper[4918]: I0319 17:06:01.513353 4918 scope.go:117] "RemoveContainer" containerID="cefc5198a2e2c2da3134183bed4bf7df475e8c9f3b5156937dec3fce1c866751" Mar 19 17:06:01 crc kubenswrapper[4918]: I0319 17:06:01.545254 4918 scope.go:117] "RemoveContainer" containerID="e4a582f7ff62312f706d66c15c740eed5e7d7415357bc31780ce8208585348e8" Mar 19 17:06:01 crc kubenswrapper[4918]: I0319 17:06:01.593825 4918 scope.go:117] "RemoveContainer" 
containerID="1b523cb6cbf897ec1d265fffd48b68c6452297fc90343e2da6d37a9c393593a8" Mar 19 17:06:01 crc kubenswrapper[4918]: I0319 17:06:01.663256 4918 scope.go:117] "RemoveContainer" containerID="515624c2881c3bc1eb5ed0ac7af05a88fee84107fbab75f48617d5c7c8a9b307" Mar 19 17:06:01 crc kubenswrapper[4918]: I0319 17:06:01.724277 4918 scope.go:117] "RemoveContainer" containerID="04294a5366c846d1d2758c9922edf070292f663c14dd49015287db01f707b159" Mar 19 17:06:01 crc kubenswrapper[4918]: I0319 17:06:01.879577 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565666-rjj5f" event={"ID":"e8654396-60ef-431c-94f0-6db3b3f225d5","Type":"ContainerStarted","Data":"a5bc05ffe27a377fcc116f5f557f3f45e2eaf678307c8c3a9eee7ed88d112ed9"} Mar 19 17:06:02 crc kubenswrapper[4918]: I0319 17:06:02.904043 4918 generic.go:334] "Generic (PLEG): container finished" podID="e8654396-60ef-431c-94f0-6db3b3f225d5" containerID="77b896d7a09e0888c6a3c0619b9e66213942a8c3225f1ab9c8aad24c2775bc99" exitCode=0 Mar 19 17:06:02 crc kubenswrapper[4918]: I0319 17:06:02.904292 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565666-rjj5f" event={"ID":"e8654396-60ef-431c-94f0-6db3b3f225d5","Type":"ContainerDied","Data":"77b896d7a09e0888c6a3c0619b9e66213942a8c3225f1ab9c8aad24c2775bc99"} Mar 19 17:06:02 crc kubenswrapper[4918]: I0319 17:06:02.906207 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm" event={"ID":"63b801eb-41a1-4d19-933b-e098bedd9e93","Type":"ContainerStarted","Data":"ee2ab37f5c21d14f105cad6e0ecf8628c1bfedf1b4e9a2407b733b036e4f01eb"} Mar 19 17:06:02 crc kubenswrapper[4918]: I0319 17:06:02.934990 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm" podStartSLOduration=2.132690637 podStartE2EDuration="3.934971837s" podCreationTimestamp="2026-03-19 17:05:59 +0000 
UTC" firstStartedPulling="2026-03-19 17:06:00.798276112 +0000 UTC m=+1572.920475360" lastFinishedPulling="2026-03-19 17:06:02.600557312 +0000 UTC m=+1574.722756560" observedRunningTime="2026-03-19 17:06:02.932761837 +0000 UTC m=+1575.054961085" watchObservedRunningTime="2026-03-19 17:06:02.934971837 +0000 UTC m=+1575.057171085" Mar 19 17:06:04 crc kubenswrapper[4918]: I0319 17:06:04.458978 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565666-rjj5f" Mar 19 17:06:04 crc kubenswrapper[4918]: I0319 17:06:04.511251 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6nlz\" (UniqueName: \"kubernetes.io/projected/e8654396-60ef-431c-94f0-6db3b3f225d5-kube-api-access-w6nlz\") pod \"e8654396-60ef-431c-94f0-6db3b3f225d5\" (UID: \"e8654396-60ef-431c-94f0-6db3b3f225d5\") " Mar 19 17:06:04 crc kubenswrapper[4918]: I0319 17:06:04.518684 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8654396-60ef-431c-94f0-6db3b3f225d5-kube-api-access-w6nlz" (OuterVolumeSpecName: "kube-api-access-w6nlz") pod "e8654396-60ef-431c-94f0-6db3b3f225d5" (UID: "e8654396-60ef-431c-94f0-6db3b3f225d5"). InnerVolumeSpecName "kube-api-access-w6nlz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:06:04 crc kubenswrapper[4918]: I0319 17:06:04.613987 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6nlz\" (UniqueName: \"kubernetes.io/projected/e8654396-60ef-431c-94f0-6db3b3f225d5-kube-api-access-w6nlz\") on node \"crc\" DevicePath \"\"" Mar 19 17:06:04 crc kubenswrapper[4918]: I0319 17:06:04.926901 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565666-rjj5f" event={"ID":"e8654396-60ef-431c-94f0-6db3b3f225d5","Type":"ContainerDied","Data":"a5bc05ffe27a377fcc116f5f557f3f45e2eaf678307c8c3a9eee7ed88d112ed9"} Mar 19 17:06:04 crc kubenswrapper[4918]: I0319 17:06:04.927361 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5bc05ffe27a377fcc116f5f557f3f45e2eaf678307c8c3a9eee7ed88d112ed9" Mar 19 17:06:04 crc kubenswrapper[4918]: I0319 17:06:04.927028 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565666-rjj5f" Mar 19 17:06:05 crc kubenswrapper[4918]: I0319 17:06:05.541784 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565660-72kgl"] Mar 19 17:06:05 crc kubenswrapper[4918]: I0319 17:06:05.550852 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565660-72kgl"] Mar 19 17:06:06 crc kubenswrapper[4918]: I0319 17:06:06.613980 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09b359fb-3f74-447c-a755-338a558fc429" path="/var/lib/kubelet/pods/09b359fb-3f74-447c-a755-338a558fc429/volumes" Mar 19 17:06:12 crc kubenswrapper[4918]: I0319 17:06:12.586683 4918 scope.go:117] "RemoveContainer" containerID="de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440" Mar 19 17:06:12 crc kubenswrapper[4918]: E0319 17:06:12.587609 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:06:24 crc kubenswrapper[4918]: I0319 17:06:24.587654 4918 scope.go:117] "RemoveContainer" containerID="de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440" Mar 19 17:06:24 crc kubenswrapper[4918]: E0319 17:06:24.588312 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:06:38 crc kubenswrapper[4918]: I0319 17:06:38.593791 4918 scope.go:117] "RemoveContainer" containerID="de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440" Mar 19 17:06:38 crc kubenswrapper[4918]: E0319 17:06:38.594599 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:06:51 crc kubenswrapper[4918]: I0319 17:06:51.587132 4918 scope.go:117] "RemoveContainer" containerID="de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440" Mar 19 17:06:51 crc kubenswrapper[4918]: E0319 17:06:51.588077 4918 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:07:02 crc kubenswrapper[4918]: I0319 17:07:02.062495 4918 scope.go:117] "RemoveContainer" containerID="9c8e8272a1d763e99e059b1a8b10a04c8e73a5d42a2d6c1c37fd09bbc454dfb3" Mar 19 17:07:02 crc kubenswrapper[4918]: I0319 17:07:02.102398 4918 scope.go:117] "RemoveContainer" containerID="a8dc59d0b0f4ac0b4166c0beef03bebc11a9a94e95da637e2eac609f2af93328" Mar 19 17:07:02 crc kubenswrapper[4918]: I0319 17:07:02.137667 4918 scope.go:117] "RemoveContainer" containerID="34b99b4915e9314909894d08437b2da10e33044f4dae5091a867bf7c7638986a" Mar 19 17:07:02 crc kubenswrapper[4918]: I0319 17:07:02.193662 4918 scope.go:117] "RemoveContainer" containerID="9f32ad649e34ae83f5351e036be452e31fbf2e0ad374a247afd59c5c63a436a0" Mar 19 17:07:02 crc kubenswrapper[4918]: I0319 17:07:02.265220 4918 scope.go:117] "RemoveContainer" containerID="29246a219f079b04c662a83e96f11fef94512295380083535a11df21ee713c1f" Mar 19 17:07:06 crc kubenswrapper[4918]: I0319 17:07:06.586575 4918 scope.go:117] "RemoveContainer" containerID="de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440" Mar 19 17:07:06 crc kubenswrapper[4918]: E0319 17:07:06.587386 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:07:17 crc kubenswrapper[4918]: I0319 
17:07:17.586469 4918 scope.go:117] "RemoveContainer" containerID="de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440" Mar 19 17:07:17 crc kubenswrapper[4918]: E0319 17:07:17.587186 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:07:30 crc kubenswrapper[4918]: I0319 17:07:30.586295 4918 scope.go:117] "RemoveContainer" containerID="de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440" Mar 19 17:07:30 crc kubenswrapper[4918]: E0319 17:07:30.587188 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:07:41 crc kubenswrapper[4918]: I0319 17:07:41.586596 4918 scope.go:117] "RemoveContainer" containerID="de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440" Mar 19 17:07:41 crc kubenswrapper[4918]: E0319 17:07:41.587241 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:07:56 crc 
kubenswrapper[4918]: I0319 17:07:56.586474 4918 scope.go:117] "RemoveContainer" containerID="de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440" Mar 19 17:07:56 crc kubenswrapper[4918]: E0319 17:07:56.587362 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:08:00 crc kubenswrapper[4918]: I0319 17:08:00.154500 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565668-4vsh5"] Mar 19 17:08:00 crc kubenswrapper[4918]: E0319 17:08:00.155191 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8654396-60ef-431c-94f0-6db3b3f225d5" containerName="oc" Mar 19 17:08:00 crc kubenswrapper[4918]: I0319 17:08:00.155203 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8654396-60ef-431c-94f0-6db3b3f225d5" containerName="oc" Mar 19 17:08:00 crc kubenswrapper[4918]: I0319 17:08:00.155411 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8654396-60ef-431c-94f0-6db3b3f225d5" containerName="oc" Mar 19 17:08:00 crc kubenswrapper[4918]: I0319 17:08:00.156132 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565668-4vsh5" Mar 19 17:08:00 crc kubenswrapper[4918]: I0319 17:08:00.159394 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:08:00 crc kubenswrapper[4918]: I0319 17:08:00.169010 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:08:00 crc kubenswrapper[4918]: I0319 17:08:00.177246 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:08:00 crc kubenswrapper[4918]: I0319 17:08:00.189835 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565668-4vsh5"] Mar 19 17:08:00 crc kubenswrapper[4918]: I0319 17:08:00.357095 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvffx\" (UniqueName: \"kubernetes.io/projected/1975654a-8c6b-4bbd-a044-451ca4fa9412-kube-api-access-jvffx\") pod \"auto-csr-approver-29565668-4vsh5\" (UID: \"1975654a-8c6b-4bbd-a044-451ca4fa9412\") " pod="openshift-infra/auto-csr-approver-29565668-4vsh5" Mar 19 17:08:00 crc kubenswrapper[4918]: I0319 17:08:00.458593 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvffx\" (UniqueName: \"kubernetes.io/projected/1975654a-8c6b-4bbd-a044-451ca4fa9412-kube-api-access-jvffx\") pod \"auto-csr-approver-29565668-4vsh5\" (UID: \"1975654a-8c6b-4bbd-a044-451ca4fa9412\") " pod="openshift-infra/auto-csr-approver-29565668-4vsh5" Mar 19 17:08:00 crc kubenswrapper[4918]: I0319 17:08:00.506179 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvffx\" (UniqueName: \"kubernetes.io/projected/1975654a-8c6b-4bbd-a044-451ca4fa9412-kube-api-access-jvffx\") pod \"auto-csr-approver-29565668-4vsh5\" (UID: \"1975654a-8c6b-4bbd-a044-451ca4fa9412\") " 
pod="openshift-infra/auto-csr-approver-29565668-4vsh5" Mar 19 17:08:00 crc kubenswrapper[4918]: I0319 17:08:00.776288 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565668-4vsh5" Mar 19 17:08:01 crc kubenswrapper[4918]: I0319 17:08:01.481765 4918 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 17:08:01 crc kubenswrapper[4918]: I0319 17:08:01.485828 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565668-4vsh5"] Mar 19 17:08:02 crc kubenswrapper[4918]: I0319 17:08:02.112654 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565668-4vsh5" event={"ID":"1975654a-8c6b-4bbd-a044-451ca4fa9412","Type":"ContainerStarted","Data":"833580bce36387295b1d75e47f815326bbcb8e2065fb8be9ebab3e082f8cb5c1"} Mar 19 17:08:03 crc kubenswrapper[4918]: I0319 17:08:03.122587 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565668-4vsh5" event={"ID":"1975654a-8c6b-4bbd-a044-451ca4fa9412","Type":"ContainerStarted","Data":"2a81b9f96e4ae5a11ec1acd858dedd355d5aa7e037998aff8c498cbb31ab0917"} Mar 19 17:08:03 crc kubenswrapper[4918]: I0319 17:08:03.142351 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565668-4vsh5" podStartSLOduration=1.983572223 podStartE2EDuration="3.142328841s" podCreationTimestamp="2026-03-19 17:08:00 +0000 UTC" firstStartedPulling="2026-03-19 17:08:01.481508262 +0000 UTC m=+1693.603707510" lastFinishedPulling="2026-03-19 17:08:02.64026488 +0000 UTC m=+1694.762464128" observedRunningTime="2026-03-19 17:08:03.136964104 +0000 UTC m=+1695.259163352" watchObservedRunningTime="2026-03-19 17:08:03.142328841 +0000 UTC m=+1695.264528089" Mar 19 17:08:04 crc kubenswrapper[4918]: I0319 17:08:04.139564 4918 generic.go:334] "Generic (PLEG): container finished" 
podID="1975654a-8c6b-4bbd-a044-451ca4fa9412" containerID="2a81b9f96e4ae5a11ec1acd858dedd355d5aa7e037998aff8c498cbb31ab0917" exitCode=0 Mar 19 17:08:04 crc kubenswrapper[4918]: I0319 17:08:04.139611 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565668-4vsh5" event={"ID":"1975654a-8c6b-4bbd-a044-451ca4fa9412","Type":"ContainerDied","Data":"2a81b9f96e4ae5a11ec1acd858dedd355d5aa7e037998aff8c498cbb31ab0917"} Mar 19 17:08:04 crc kubenswrapper[4918]: I0319 17:08:04.398114 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mk2wv"] Mar 19 17:08:04 crc kubenswrapper[4918]: I0319 17:08:04.400894 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mk2wv" Mar 19 17:08:04 crc kubenswrapper[4918]: I0319 17:08:04.409178 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mk2wv"] Mar 19 17:08:04 crc kubenswrapper[4918]: I0319 17:08:04.540177 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57bf540f-7464-4e4d-a1c6-5bca37c44763-catalog-content\") pod \"redhat-operators-mk2wv\" (UID: \"57bf540f-7464-4e4d-a1c6-5bca37c44763\") " pod="openshift-marketplace/redhat-operators-mk2wv" Mar 19 17:08:04 crc kubenswrapper[4918]: I0319 17:08:04.540655 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57bf540f-7464-4e4d-a1c6-5bca37c44763-utilities\") pod \"redhat-operators-mk2wv\" (UID: \"57bf540f-7464-4e4d-a1c6-5bca37c44763\") " pod="openshift-marketplace/redhat-operators-mk2wv" Mar 19 17:08:04 crc kubenswrapper[4918]: I0319 17:08:04.540700 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24prj\" (UniqueName: 
\"kubernetes.io/projected/57bf540f-7464-4e4d-a1c6-5bca37c44763-kube-api-access-24prj\") pod \"redhat-operators-mk2wv\" (UID: \"57bf540f-7464-4e4d-a1c6-5bca37c44763\") " pod="openshift-marketplace/redhat-operators-mk2wv" Mar 19 17:08:04 crc kubenswrapper[4918]: I0319 17:08:04.643275 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57bf540f-7464-4e4d-a1c6-5bca37c44763-utilities\") pod \"redhat-operators-mk2wv\" (UID: \"57bf540f-7464-4e4d-a1c6-5bca37c44763\") " pod="openshift-marketplace/redhat-operators-mk2wv" Mar 19 17:08:04 crc kubenswrapper[4918]: I0319 17:08:04.643365 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24prj\" (UniqueName: \"kubernetes.io/projected/57bf540f-7464-4e4d-a1c6-5bca37c44763-kube-api-access-24prj\") pod \"redhat-operators-mk2wv\" (UID: \"57bf540f-7464-4e4d-a1c6-5bca37c44763\") " pod="openshift-marketplace/redhat-operators-mk2wv" Mar 19 17:08:04 crc kubenswrapper[4918]: I0319 17:08:04.643455 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57bf540f-7464-4e4d-a1c6-5bca37c44763-catalog-content\") pod \"redhat-operators-mk2wv\" (UID: \"57bf540f-7464-4e4d-a1c6-5bca37c44763\") " pod="openshift-marketplace/redhat-operators-mk2wv" Mar 19 17:08:04 crc kubenswrapper[4918]: I0319 17:08:04.643805 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57bf540f-7464-4e4d-a1c6-5bca37c44763-utilities\") pod \"redhat-operators-mk2wv\" (UID: \"57bf540f-7464-4e4d-a1c6-5bca37c44763\") " pod="openshift-marketplace/redhat-operators-mk2wv" Mar 19 17:08:04 crc kubenswrapper[4918]: I0319 17:08:04.643935 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57bf540f-7464-4e4d-a1c6-5bca37c44763-catalog-content\") pod \"redhat-operators-mk2wv\" (UID: \"57bf540f-7464-4e4d-a1c6-5bca37c44763\") " pod="openshift-marketplace/redhat-operators-mk2wv" Mar 19 17:08:04 crc kubenswrapper[4918]: I0319 17:08:04.663018 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24prj\" (UniqueName: \"kubernetes.io/projected/57bf540f-7464-4e4d-a1c6-5bca37c44763-kube-api-access-24prj\") pod \"redhat-operators-mk2wv\" (UID: \"57bf540f-7464-4e4d-a1c6-5bca37c44763\") " pod="openshift-marketplace/redhat-operators-mk2wv" Mar 19 17:08:04 crc kubenswrapper[4918]: I0319 17:08:04.730716 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mk2wv" Mar 19 17:08:05 crc kubenswrapper[4918]: W0319 17:08:05.240699 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57bf540f_7464_4e4d_a1c6_5bca37c44763.slice/crio-7dbdf7798b88cc857a615b0d2325fb5f9dd45f871a254159e139310d5a7e18f2 WatchSource:0}: Error finding container 7dbdf7798b88cc857a615b0d2325fb5f9dd45f871a254159e139310d5a7e18f2: Status 404 returned error can't find the container with id 7dbdf7798b88cc857a615b0d2325fb5f9dd45f871a254159e139310d5a7e18f2 Mar 19 17:08:05 crc kubenswrapper[4918]: I0319 17:08:05.255147 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mk2wv"] Mar 19 17:08:06 crc kubenswrapper[4918]: I0319 17:08:06.026414 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565668-4vsh5" Mar 19 17:08:06 crc kubenswrapper[4918]: I0319 17:08:06.078583 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvffx\" (UniqueName: \"kubernetes.io/projected/1975654a-8c6b-4bbd-a044-451ca4fa9412-kube-api-access-jvffx\") pod \"1975654a-8c6b-4bbd-a044-451ca4fa9412\" (UID: \"1975654a-8c6b-4bbd-a044-451ca4fa9412\") " Mar 19 17:08:06 crc kubenswrapper[4918]: I0319 17:08:06.101732 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1975654a-8c6b-4bbd-a044-451ca4fa9412-kube-api-access-jvffx" (OuterVolumeSpecName: "kube-api-access-jvffx") pod "1975654a-8c6b-4bbd-a044-451ca4fa9412" (UID: "1975654a-8c6b-4bbd-a044-451ca4fa9412"). InnerVolumeSpecName "kube-api-access-jvffx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:08:06 crc kubenswrapper[4918]: I0319 17:08:06.161592 4918 generic.go:334] "Generic (PLEG): container finished" podID="57bf540f-7464-4e4d-a1c6-5bca37c44763" containerID="1ccfe63ede3ed1d59fcb8873455ac7477206df64a11d5dbde96b3bd218e190e7" exitCode=0 Mar 19 17:08:06 crc kubenswrapper[4918]: I0319 17:08:06.161674 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mk2wv" event={"ID":"57bf540f-7464-4e4d-a1c6-5bca37c44763","Type":"ContainerDied","Data":"1ccfe63ede3ed1d59fcb8873455ac7477206df64a11d5dbde96b3bd218e190e7"} Mar 19 17:08:06 crc kubenswrapper[4918]: I0319 17:08:06.161701 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mk2wv" event={"ID":"57bf540f-7464-4e4d-a1c6-5bca37c44763","Type":"ContainerStarted","Data":"7dbdf7798b88cc857a615b0d2325fb5f9dd45f871a254159e139310d5a7e18f2"} Mar 19 17:08:06 crc kubenswrapper[4918]: I0319 17:08:06.164243 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565668-4vsh5" 
event={"ID":"1975654a-8c6b-4bbd-a044-451ca4fa9412","Type":"ContainerDied","Data":"833580bce36387295b1d75e47f815326bbcb8e2065fb8be9ebab3e082f8cb5c1"} Mar 19 17:08:06 crc kubenswrapper[4918]: I0319 17:08:06.164278 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="833580bce36387295b1d75e47f815326bbcb8e2065fb8be9ebab3e082f8cb5c1" Mar 19 17:08:06 crc kubenswrapper[4918]: I0319 17:08:06.164368 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565668-4vsh5" Mar 19 17:08:06 crc kubenswrapper[4918]: I0319 17:08:06.182770 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvffx\" (UniqueName: \"kubernetes.io/projected/1975654a-8c6b-4bbd-a044-451ca4fa9412-kube-api-access-jvffx\") on node \"crc\" DevicePath \"\"" Mar 19 17:08:06 crc kubenswrapper[4918]: I0319 17:08:06.214301 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565662-dsxdv"] Mar 19 17:08:06 crc kubenswrapper[4918]: I0319 17:08:06.229727 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565662-dsxdv"] Mar 19 17:08:06 crc kubenswrapper[4918]: I0319 17:08:06.597800 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5564c9bc-c959-4612-a27b-3b3272ca0bf3" path="/var/lib/kubelet/pods/5564c9bc-c959-4612-a27b-3b3272ca0bf3/volumes" Mar 19 17:08:08 crc kubenswrapper[4918]: I0319 17:08:08.183914 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mk2wv" event={"ID":"57bf540f-7464-4e4d-a1c6-5bca37c44763","Type":"ContainerStarted","Data":"fdda958edbb203d392969f355d3c6164c05b82a80538d85ad5754b237bfcb9b0"} Mar 19 17:08:11 crc kubenswrapper[4918]: I0319 17:08:11.587150 4918 scope.go:117] "RemoveContainer" containerID="de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440" Mar 19 17:08:11 crc kubenswrapper[4918]: E0319 
17:08:11.587955 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:08:14 crc kubenswrapper[4918]: I0319 17:08:14.240935 4918 generic.go:334] "Generic (PLEG): container finished" podID="57bf540f-7464-4e4d-a1c6-5bca37c44763" containerID="fdda958edbb203d392969f355d3c6164c05b82a80538d85ad5754b237bfcb9b0" exitCode=0 Mar 19 17:08:14 crc kubenswrapper[4918]: I0319 17:08:14.241024 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mk2wv" event={"ID":"57bf540f-7464-4e4d-a1c6-5bca37c44763","Type":"ContainerDied","Data":"fdda958edbb203d392969f355d3c6164c05b82a80538d85ad5754b237bfcb9b0"} Mar 19 17:08:14 crc kubenswrapper[4918]: E0319 17:08:14.459496 4918 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad: reading manifest sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 502 Bad Gateway" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad" Mar 19 17:08:14 crc kubenswrapper[4918]: E0319 17:08:14.459776 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad,Command:[/bin/opm],Args:[serve /extracted-catalog/catalog 
--cache-dir=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:GOMEMLIMIT,Value:30MiB,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{31457280 0} {} 30Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-24prj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe 
-addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-mk2wv_openshift-marketplace(57bf540f-7464-4e4d-a1c6-5bca37c44763): ErrImagePull: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad: reading manifest sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 502 Bad Gateway" logger="UnhandledError" Mar 19 17:08:14 crc kubenswrapper[4918]: E0319 17:08:14.461022 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad: reading manifest sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 502 Bad Gateway\"" pod="openshift-marketplace/redhat-operators-mk2wv" podUID="57bf540f-7464-4e4d-a1c6-5bca37c44763" Mar 19 17:08:26 crc kubenswrapper[4918]: I0319 17:08:26.588394 4918 scope.go:117] "RemoveContainer" containerID="de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440" Mar 19 17:08:26 crc kubenswrapper[4918]: E0319 17:08:26.589769 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" 
podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:08:27 crc kubenswrapper[4918]: I0319 17:08:27.367643 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mk2wv" event={"ID":"57bf540f-7464-4e4d-a1c6-5bca37c44763","Type":"ContainerStarted","Data":"f162b2f6cb7ac9189d874d4994cc98ef0ca1f6d5496caafe340fbdae68637a0f"} Mar 19 17:08:27 crc kubenswrapper[4918]: I0319 17:08:27.389551 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mk2wv" podStartSLOduration=2.544816385 podStartE2EDuration="23.38951166s" podCreationTimestamp="2026-03-19 17:08:04 +0000 UTC" firstStartedPulling="2026-03-19 17:08:06.163867411 +0000 UTC m=+1698.286066659" lastFinishedPulling="2026-03-19 17:08:27.008562686 +0000 UTC m=+1719.130761934" observedRunningTime="2026-03-19 17:08:27.388214815 +0000 UTC m=+1719.510414063" watchObservedRunningTime="2026-03-19 17:08:27.38951166 +0000 UTC m=+1719.511710908" Mar 19 17:08:34 crc kubenswrapper[4918]: I0319 17:08:34.731153 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mk2wv" Mar 19 17:08:34 crc kubenswrapper[4918]: I0319 17:08:34.731746 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mk2wv" Mar 19 17:08:35 crc kubenswrapper[4918]: I0319 17:08:35.798593 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mk2wv" podUID="57bf540f-7464-4e4d-a1c6-5bca37c44763" containerName="registry-server" probeResult="failure" output=< Mar 19 17:08:35 crc kubenswrapper[4918]: timeout: failed to connect service ":50051" within 1s Mar 19 17:08:35 crc kubenswrapper[4918]: > Mar 19 17:08:38 crc kubenswrapper[4918]: I0319 17:08:38.596241 4918 scope.go:117] "RemoveContainer" containerID="de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440" Mar 19 17:08:38 crc 
kubenswrapper[4918]: E0319 17:08:38.596844 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:08:45 crc kubenswrapper[4918]: I0319 17:08:45.784113 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mk2wv" podUID="57bf540f-7464-4e4d-a1c6-5bca37c44763" containerName="registry-server" probeResult="failure" output=< Mar 19 17:08:45 crc kubenswrapper[4918]: timeout: failed to connect service ":50051" within 1s Mar 19 17:08:45 crc kubenswrapper[4918]: > Mar 19 17:08:50 crc kubenswrapper[4918]: I0319 17:08:50.586514 4918 scope.go:117] "RemoveContainer" containerID="de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440" Mar 19 17:08:50 crc kubenswrapper[4918]: E0319 17:08:50.587360 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:08:54 crc kubenswrapper[4918]: I0319 17:08:54.796788 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mk2wv" Mar 19 17:08:54 crc kubenswrapper[4918]: I0319 17:08:54.856018 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mk2wv" Mar 19 17:08:55 crc kubenswrapper[4918]: I0319 17:08:55.034621 
4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mk2wv"] Mar 19 17:08:56 crc kubenswrapper[4918]: I0319 17:08:56.649249 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mk2wv" podUID="57bf540f-7464-4e4d-a1c6-5bca37c44763" containerName="registry-server" containerID="cri-o://f162b2f6cb7ac9189d874d4994cc98ef0ca1f6d5496caafe340fbdae68637a0f" gracePeriod=2 Mar 19 17:08:57 crc kubenswrapper[4918]: I0319 17:08:57.653590 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mk2wv" Mar 19 17:08:57 crc kubenswrapper[4918]: I0319 17:08:57.659064 4918 generic.go:334] "Generic (PLEG): container finished" podID="57bf540f-7464-4e4d-a1c6-5bca37c44763" containerID="f162b2f6cb7ac9189d874d4994cc98ef0ca1f6d5496caafe340fbdae68637a0f" exitCode=0 Mar 19 17:08:57 crc kubenswrapper[4918]: I0319 17:08:57.659110 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mk2wv" event={"ID":"57bf540f-7464-4e4d-a1c6-5bca37c44763","Type":"ContainerDied","Data":"f162b2f6cb7ac9189d874d4994cc98ef0ca1f6d5496caafe340fbdae68637a0f"} Mar 19 17:08:57 crc kubenswrapper[4918]: I0319 17:08:57.659135 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mk2wv" event={"ID":"57bf540f-7464-4e4d-a1c6-5bca37c44763","Type":"ContainerDied","Data":"7dbdf7798b88cc857a615b0d2325fb5f9dd45f871a254159e139310d5a7e18f2"} Mar 19 17:08:57 crc kubenswrapper[4918]: I0319 17:08:57.659155 4918 scope.go:117] "RemoveContainer" containerID="f162b2f6cb7ac9189d874d4994cc98ef0ca1f6d5496caafe340fbdae68637a0f" Mar 19 17:08:57 crc kubenswrapper[4918]: I0319 17:08:57.659294 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mk2wv" Mar 19 17:08:57 crc kubenswrapper[4918]: I0319 17:08:57.686308 4918 scope.go:117] "RemoveContainer" containerID="fdda958edbb203d392969f355d3c6164c05b82a80538d85ad5754b237bfcb9b0" Mar 19 17:08:57 crc kubenswrapper[4918]: I0319 17:08:57.713675 4918 scope.go:117] "RemoveContainer" containerID="1ccfe63ede3ed1d59fcb8873455ac7477206df64a11d5dbde96b3bd218e190e7" Mar 19 17:08:57 crc kubenswrapper[4918]: I0319 17:08:57.753191 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57bf540f-7464-4e4d-a1c6-5bca37c44763-catalog-content\") pod \"57bf540f-7464-4e4d-a1c6-5bca37c44763\" (UID: \"57bf540f-7464-4e4d-a1c6-5bca37c44763\") " Mar 19 17:08:57 crc kubenswrapper[4918]: I0319 17:08:57.753533 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57bf540f-7464-4e4d-a1c6-5bca37c44763-utilities\") pod \"57bf540f-7464-4e4d-a1c6-5bca37c44763\" (UID: \"57bf540f-7464-4e4d-a1c6-5bca37c44763\") " Mar 19 17:08:57 crc kubenswrapper[4918]: I0319 17:08:57.753646 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24prj\" (UniqueName: \"kubernetes.io/projected/57bf540f-7464-4e4d-a1c6-5bca37c44763-kube-api-access-24prj\") pod \"57bf540f-7464-4e4d-a1c6-5bca37c44763\" (UID: \"57bf540f-7464-4e4d-a1c6-5bca37c44763\") " Mar 19 17:08:57 crc kubenswrapper[4918]: I0319 17:08:57.755078 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57bf540f-7464-4e4d-a1c6-5bca37c44763-utilities" (OuterVolumeSpecName: "utilities") pod "57bf540f-7464-4e4d-a1c6-5bca37c44763" (UID: "57bf540f-7464-4e4d-a1c6-5bca37c44763"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:08:57 crc kubenswrapper[4918]: I0319 17:08:57.760710 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57bf540f-7464-4e4d-a1c6-5bca37c44763-kube-api-access-24prj" (OuterVolumeSpecName: "kube-api-access-24prj") pod "57bf540f-7464-4e4d-a1c6-5bca37c44763" (UID: "57bf540f-7464-4e4d-a1c6-5bca37c44763"). InnerVolumeSpecName "kube-api-access-24prj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:08:57 crc kubenswrapper[4918]: I0319 17:08:57.775638 4918 scope.go:117] "RemoveContainer" containerID="f162b2f6cb7ac9189d874d4994cc98ef0ca1f6d5496caafe340fbdae68637a0f" Mar 19 17:08:57 crc kubenswrapper[4918]: E0319 17:08:57.776130 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f162b2f6cb7ac9189d874d4994cc98ef0ca1f6d5496caafe340fbdae68637a0f\": container with ID starting with f162b2f6cb7ac9189d874d4994cc98ef0ca1f6d5496caafe340fbdae68637a0f not found: ID does not exist" containerID="f162b2f6cb7ac9189d874d4994cc98ef0ca1f6d5496caafe340fbdae68637a0f" Mar 19 17:08:57 crc kubenswrapper[4918]: I0319 17:08:57.776173 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f162b2f6cb7ac9189d874d4994cc98ef0ca1f6d5496caafe340fbdae68637a0f"} err="failed to get container status \"f162b2f6cb7ac9189d874d4994cc98ef0ca1f6d5496caafe340fbdae68637a0f\": rpc error: code = NotFound desc = could not find container \"f162b2f6cb7ac9189d874d4994cc98ef0ca1f6d5496caafe340fbdae68637a0f\": container with ID starting with f162b2f6cb7ac9189d874d4994cc98ef0ca1f6d5496caafe340fbdae68637a0f not found: ID does not exist" Mar 19 17:08:57 crc kubenswrapper[4918]: I0319 17:08:57.776202 4918 scope.go:117] "RemoveContainer" containerID="fdda958edbb203d392969f355d3c6164c05b82a80538d85ad5754b237bfcb9b0" Mar 19 17:08:57 crc kubenswrapper[4918]: E0319 17:08:57.776553 
4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdda958edbb203d392969f355d3c6164c05b82a80538d85ad5754b237bfcb9b0\": container with ID starting with fdda958edbb203d392969f355d3c6164c05b82a80538d85ad5754b237bfcb9b0 not found: ID does not exist" containerID="fdda958edbb203d392969f355d3c6164c05b82a80538d85ad5754b237bfcb9b0" Mar 19 17:08:57 crc kubenswrapper[4918]: I0319 17:08:57.776582 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdda958edbb203d392969f355d3c6164c05b82a80538d85ad5754b237bfcb9b0"} err="failed to get container status \"fdda958edbb203d392969f355d3c6164c05b82a80538d85ad5754b237bfcb9b0\": rpc error: code = NotFound desc = could not find container \"fdda958edbb203d392969f355d3c6164c05b82a80538d85ad5754b237bfcb9b0\": container with ID starting with fdda958edbb203d392969f355d3c6164c05b82a80538d85ad5754b237bfcb9b0 not found: ID does not exist" Mar 19 17:08:57 crc kubenswrapper[4918]: I0319 17:08:57.776596 4918 scope.go:117] "RemoveContainer" containerID="1ccfe63ede3ed1d59fcb8873455ac7477206df64a11d5dbde96b3bd218e190e7" Mar 19 17:08:57 crc kubenswrapper[4918]: E0319 17:08:57.776861 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ccfe63ede3ed1d59fcb8873455ac7477206df64a11d5dbde96b3bd218e190e7\": container with ID starting with 1ccfe63ede3ed1d59fcb8873455ac7477206df64a11d5dbde96b3bd218e190e7 not found: ID does not exist" containerID="1ccfe63ede3ed1d59fcb8873455ac7477206df64a11d5dbde96b3bd218e190e7" Mar 19 17:08:57 crc kubenswrapper[4918]: I0319 17:08:57.776892 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ccfe63ede3ed1d59fcb8873455ac7477206df64a11d5dbde96b3bd218e190e7"} err="failed to get container status \"1ccfe63ede3ed1d59fcb8873455ac7477206df64a11d5dbde96b3bd218e190e7\": rpc error: code = 
NotFound desc = could not find container \"1ccfe63ede3ed1d59fcb8873455ac7477206df64a11d5dbde96b3bd218e190e7\": container with ID starting with 1ccfe63ede3ed1d59fcb8873455ac7477206df64a11d5dbde96b3bd218e190e7 not found: ID does not exist" Mar 19 17:08:57 crc kubenswrapper[4918]: I0319 17:08:57.856172 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57bf540f-7464-4e4d-a1c6-5bca37c44763-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 17:08:57 crc kubenswrapper[4918]: I0319 17:08:57.856209 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24prj\" (UniqueName: \"kubernetes.io/projected/57bf540f-7464-4e4d-a1c6-5bca37c44763-kube-api-access-24prj\") on node \"crc\" DevicePath \"\"" Mar 19 17:08:57 crc kubenswrapper[4918]: I0319 17:08:57.884576 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57bf540f-7464-4e4d-a1c6-5bca37c44763-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57bf540f-7464-4e4d-a1c6-5bca37c44763" (UID: "57bf540f-7464-4e4d-a1c6-5bca37c44763"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:08:57 crc kubenswrapper[4918]: I0319 17:08:57.957840 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57bf540f-7464-4e4d-a1c6-5bca37c44763-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 17:08:58 crc kubenswrapper[4918]: I0319 17:08:58.004591 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mk2wv"] Mar 19 17:08:58 crc kubenswrapper[4918]: I0319 17:08:58.015285 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mk2wv"] Mar 19 17:08:58 crc kubenswrapper[4918]: I0319 17:08:58.596811 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57bf540f-7464-4e4d-a1c6-5bca37c44763" path="/var/lib/kubelet/pods/57bf540f-7464-4e4d-a1c6-5bca37c44763/volumes" Mar 19 17:09:02 crc kubenswrapper[4918]: I0319 17:09:02.408799 4918 scope.go:117] "RemoveContainer" containerID="1eacb64c055a23176e5f6fca80a7d73daed2863297d92209894f71ae890aebb4" Mar 19 17:09:02 crc kubenswrapper[4918]: I0319 17:09:02.436114 4918 scope.go:117] "RemoveContainer" containerID="3f06b127cd2b4cef4cae2f38e797f7424a225679cd943c480ce30ea4f6303371" Mar 19 17:09:02 crc kubenswrapper[4918]: I0319 17:09:02.476077 4918 scope.go:117] "RemoveContainer" containerID="515c7765c77bdaf82385e5a44b02bdef063c98562ab2d54635831c5a862d2327" Mar 19 17:09:02 crc kubenswrapper[4918]: I0319 17:09:02.518035 4918 scope.go:117] "RemoveContainer" containerID="673a6dc80e67c739f7500e9491f6fb7d52f709175c55bc20908132441374cf43" Mar 19 17:09:03 crc kubenswrapper[4918]: I0319 17:09:03.745659 4918 generic.go:334] "Generic (PLEG): container finished" podID="63b801eb-41a1-4d19-933b-e098bedd9e93" containerID="ee2ab37f5c21d14f105cad6e0ecf8628c1bfedf1b4e9a2407b733b036e4f01eb" exitCode=0 Mar 19 17:09:03 crc kubenswrapper[4918]: I0319 17:09:03.746016 4918 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm" event={"ID":"63b801eb-41a1-4d19-933b-e098bedd9e93","Type":"ContainerDied","Data":"ee2ab37f5c21d14f105cad6e0ecf8628c1bfedf1b4e9a2407b733b036e4f01eb"} Mar 19 17:09:05 crc kubenswrapper[4918]: I0319 17:09:05.586175 4918 scope.go:117] "RemoveContainer" containerID="de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440" Mar 19 17:09:05 crc kubenswrapper[4918]: E0319 17:09:05.586917 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:09:05 crc kubenswrapper[4918]: I0319 17:09:05.763244 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm" Mar 19 17:09:05 crc kubenswrapper[4918]: I0319 17:09:05.766135 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm" event={"ID":"63b801eb-41a1-4d19-933b-e098bedd9e93","Type":"ContainerDied","Data":"59481f8a21b973cfee1adb145bcb23d7da5869311a96afb727317193dfeef259"} Mar 19 17:09:05 crc kubenswrapper[4918]: I0319 17:09:05.766180 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59481f8a21b973cfee1adb145bcb23d7da5869311a96afb727317193dfeef259" Mar 19 17:09:05 crc kubenswrapper[4918]: I0319 17:09:05.766209 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm" Mar 19 17:09:05 crc kubenswrapper[4918]: I0319 17:09:05.850213 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crc4b\" (UniqueName: \"kubernetes.io/projected/63b801eb-41a1-4d19-933b-e098bedd9e93-kube-api-access-crc4b\") pod \"63b801eb-41a1-4d19-933b-e098bedd9e93\" (UID: \"63b801eb-41a1-4d19-933b-e098bedd9e93\") " Mar 19 17:09:05 crc kubenswrapper[4918]: I0319 17:09:05.850439 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63b801eb-41a1-4d19-933b-e098bedd9e93-bootstrap-combined-ca-bundle\") pod \"63b801eb-41a1-4d19-933b-e098bedd9e93\" (UID: \"63b801eb-41a1-4d19-933b-e098bedd9e93\") " Mar 19 17:09:05 crc kubenswrapper[4918]: I0319 17:09:05.850459 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63b801eb-41a1-4d19-933b-e098bedd9e93-ssh-key-openstack-edpm-ipam\") pod \"63b801eb-41a1-4d19-933b-e098bedd9e93\" (UID: \"63b801eb-41a1-4d19-933b-e098bedd9e93\") " Mar 19 17:09:05 crc kubenswrapper[4918]: I0319 17:09:05.850544 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63b801eb-41a1-4d19-933b-e098bedd9e93-inventory\") pod \"63b801eb-41a1-4d19-933b-e098bedd9e93\" (UID: \"63b801eb-41a1-4d19-933b-e098bedd9e93\") " Mar 19 17:09:05 crc kubenswrapper[4918]: I0319 17:09:05.860658 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63b801eb-41a1-4d19-933b-e098bedd9e93-kube-api-access-crc4b" (OuterVolumeSpecName: "kube-api-access-crc4b") pod "63b801eb-41a1-4d19-933b-e098bedd9e93" (UID: "63b801eb-41a1-4d19-933b-e098bedd9e93"). InnerVolumeSpecName "kube-api-access-crc4b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:09:05 crc kubenswrapper[4918]: I0319 17:09:05.876912 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63b801eb-41a1-4d19-933b-e098bedd9e93-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "63b801eb-41a1-4d19-933b-e098bedd9e93" (UID: "63b801eb-41a1-4d19-933b-e098bedd9e93"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:09:05 crc kubenswrapper[4918]: I0319 17:09:05.909123 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63b801eb-41a1-4d19-933b-e098bedd9e93-inventory" (OuterVolumeSpecName: "inventory") pod "63b801eb-41a1-4d19-933b-e098bedd9e93" (UID: "63b801eb-41a1-4d19-933b-e098bedd9e93"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:09:05 crc kubenswrapper[4918]: I0319 17:09:05.952916 4918 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63b801eb-41a1-4d19-933b-e098bedd9e93-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:09:05 crc kubenswrapper[4918]: I0319 17:09:05.953163 4918 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63b801eb-41a1-4d19-933b-e098bedd9e93-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 17:09:05 crc kubenswrapper[4918]: I0319 17:09:05.953239 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crc4b\" (UniqueName: \"kubernetes.io/projected/63b801eb-41a1-4d19-933b-e098bedd9e93-kube-api-access-crc4b\") on node \"crc\" DevicePath \"\"" Mar 19 17:09:05 crc kubenswrapper[4918]: I0319 17:09:05.966126 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/63b801eb-41a1-4d19-933b-e098bedd9e93-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "63b801eb-41a1-4d19-933b-e098bedd9e93" (UID: "63b801eb-41a1-4d19-933b-e098bedd9e93"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:09:06 crc kubenswrapper[4918]: I0319 17:09:06.054907 4918 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63b801eb-41a1-4d19-933b-e098bedd9e93-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 17:09:06 crc kubenswrapper[4918]: I0319 17:09:06.895033 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-szqtt"] Mar 19 17:09:06 crc kubenswrapper[4918]: E0319 17:09:06.895782 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57bf540f-7464-4e4d-a1c6-5bca37c44763" containerName="extract-utilities" Mar 19 17:09:06 crc kubenswrapper[4918]: I0319 17:09:06.895796 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="57bf540f-7464-4e4d-a1c6-5bca37c44763" containerName="extract-utilities" Mar 19 17:09:06 crc kubenswrapper[4918]: E0319 17:09:06.895822 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57bf540f-7464-4e4d-a1c6-5bca37c44763" containerName="extract-content" Mar 19 17:09:06 crc kubenswrapper[4918]: I0319 17:09:06.895828 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="57bf540f-7464-4e4d-a1c6-5bca37c44763" containerName="extract-content" Mar 19 17:09:06 crc kubenswrapper[4918]: E0319 17:09:06.895843 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1975654a-8c6b-4bbd-a044-451ca4fa9412" containerName="oc" Mar 19 17:09:06 crc kubenswrapper[4918]: I0319 17:09:06.895849 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="1975654a-8c6b-4bbd-a044-451ca4fa9412" containerName="oc" Mar 19 17:09:06 
crc kubenswrapper[4918]: E0319 17:09:06.895862 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63b801eb-41a1-4d19-933b-e098bedd9e93" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 19 17:09:06 crc kubenswrapper[4918]: I0319 17:09:06.895868 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="63b801eb-41a1-4d19-933b-e098bedd9e93" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 19 17:09:06 crc kubenswrapper[4918]: E0319 17:09:06.895881 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57bf540f-7464-4e4d-a1c6-5bca37c44763" containerName="registry-server" Mar 19 17:09:06 crc kubenswrapper[4918]: I0319 17:09:06.895887 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="57bf540f-7464-4e4d-a1c6-5bca37c44763" containerName="registry-server" Mar 19 17:09:06 crc kubenswrapper[4918]: I0319 17:09:06.896065 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="1975654a-8c6b-4bbd-a044-451ca4fa9412" containerName="oc" Mar 19 17:09:06 crc kubenswrapper[4918]: I0319 17:09:06.896087 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="63b801eb-41a1-4d19-933b-e098bedd9e93" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 19 17:09:06 crc kubenswrapper[4918]: I0319 17:09:06.896097 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="57bf540f-7464-4e4d-a1c6-5bca37c44763" containerName="registry-server" Mar 19 17:09:06 crc kubenswrapper[4918]: I0319 17:09:06.896823 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-szqtt" Mar 19 17:09:06 crc kubenswrapper[4918]: I0319 17:09:06.899475 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 17:09:06 crc kubenswrapper[4918]: I0319 17:09:06.899708 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 17:09:06 crc kubenswrapper[4918]: I0319 17:09:06.899726 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 17:09:06 crc kubenswrapper[4918]: I0319 17:09:06.901728 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4jldg" Mar 19 17:09:06 crc kubenswrapper[4918]: I0319 17:09:06.934835 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-szqtt"] Mar 19 17:09:07 crc kubenswrapper[4918]: I0319 17:09:07.073548 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04af1485-802e-4821-a499-683301ee97ff-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-szqtt\" (UID: \"04af1485-802e-4821-a499-683301ee97ff\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-szqtt" Mar 19 17:09:07 crc kubenswrapper[4918]: I0319 17:09:07.073750 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04af1485-802e-4821-a499-683301ee97ff-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-szqtt\" (UID: \"04af1485-802e-4821-a499-683301ee97ff\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-szqtt" Mar 19 17:09:07 crc kubenswrapper[4918]: I0319 
17:09:07.073842 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4sgr\" (UniqueName: \"kubernetes.io/projected/04af1485-802e-4821-a499-683301ee97ff-kube-api-access-m4sgr\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-szqtt\" (UID: \"04af1485-802e-4821-a499-683301ee97ff\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-szqtt" Mar 19 17:09:07 crc kubenswrapper[4918]: I0319 17:09:07.175161 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04af1485-802e-4821-a499-683301ee97ff-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-szqtt\" (UID: \"04af1485-802e-4821-a499-683301ee97ff\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-szqtt" Mar 19 17:09:07 crc kubenswrapper[4918]: I0319 17:09:07.175363 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04af1485-802e-4821-a499-683301ee97ff-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-szqtt\" (UID: \"04af1485-802e-4821-a499-683301ee97ff\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-szqtt" Mar 19 17:09:07 crc kubenswrapper[4918]: I0319 17:09:07.175405 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4sgr\" (UniqueName: \"kubernetes.io/projected/04af1485-802e-4821-a499-683301ee97ff-kube-api-access-m4sgr\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-szqtt\" (UID: \"04af1485-802e-4821-a499-683301ee97ff\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-szqtt" Mar 19 17:09:07 crc kubenswrapper[4918]: I0319 17:09:07.180699 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/04af1485-802e-4821-a499-683301ee97ff-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-szqtt\" (UID: \"04af1485-802e-4821-a499-683301ee97ff\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-szqtt" Mar 19 17:09:07 crc kubenswrapper[4918]: I0319 17:09:07.180714 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04af1485-802e-4821-a499-683301ee97ff-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-szqtt\" (UID: \"04af1485-802e-4821-a499-683301ee97ff\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-szqtt" Mar 19 17:09:07 crc kubenswrapper[4918]: I0319 17:09:07.196560 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4sgr\" (UniqueName: \"kubernetes.io/projected/04af1485-802e-4821-a499-683301ee97ff-kube-api-access-m4sgr\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-szqtt\" (UID: \"04af1485-802e-4821-a499-683301ee97ff\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-szqtt" Mar 19 17:09:07 crc kubenswrapper[4918]: I0319 17:09:07.223256 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-szqtt" Mar 19 17:09:07 crc kubenswrapper[4918]: I0319 17:09:07.790640 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-szqtt"] Mar 19 17:09:08 crc kubenswrapper[4918]: I0319 17:09:08.797315 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-szqtt" event={"ID":"04af1485-802e-4821-a499-683301ee97ff","Type":"ContainerStarted","Data":"d0aa7f8a24dcfee4d290278121a9dc936d984e2cc462f9ae4c74ea2041626f30"} Mar 19 17:09:08 crc kubenswrapper[4918]: I0319 17:09:08.797661 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-szqtt" event={"ID":"04af1485-802e-4821-a499-683301ee97ff","Type":"ContainerStarted","Data":"01469e9c951ce38cfced209937f67bc1c36f80f816cdc4ae86973e9501f63dc2"} Mar 19 17:09:08 crc kubenswrapper[4918]: I0319 17:09:08.819862 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-szqtt" podStartSLOduration=2.401977773 podStartE2EDuration="2.819843298s" podCreationTimestamp="2026-03-19 17:09:06 +0000 UTC" firstStartedPulling="2026-03-19 17:09:07.797860946 +0000 UTC m=+1759.920060194" lastFinishedPulling="2026-03-19 17:09:08.215726471 +0000 UTC m=+1760.337925719" observedRunningTime="2026-03-19 17:09:08.817774421 +0000 UTC m=+1760.939973679" watchObservedRunningTime="2026-03-19 17:09:08.819843298 +0000 UTC m=+1760.942042546" Mar 19 17:09:14 crc kubenswrapper[4918]: I0319 17:09:14.040032 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-mhhrx"] Mar 19 17:09:14 crc kubenswrapper[4918]: I0319 17:09:14.050600 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b002-account-create-update-cq2hz"] Mar 19 17:09:14 crc 
kubenswrapper[4918]: I0319 17:09:14.061694 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-mhhrx"] Mar 19 17:09:14 crc kubenswrapper[4918]: I0319 17:09:14.070945 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b002-account-create-update-cq2hz"] Mar 19 17:09:14 crc kubenswrapper[4918]: I0319 17:09:14.598763 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="313b84b1-8c9f-4a76-90dd-923d9ba8f469" path="/var/lib/kubelet/pods/313b84b1-8c9f-4a76-90dd-923d9ba8f469/volumes" Mar 19 17:09:14 crc kubenswrapper[4918]: I0319 17:09:14.599363 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b0d9347-5fad-4ca6-8adc-0850f537067b" path="/var/lib/kubelet/pods/3b0d9347-5fad-4ca6-8adc-0850f537067b/volumes" Mar 19 17:09:16 crc kubenswrapper[4918]: I0319 17:09:16.587597 4918 scope.go:117] "RemoveContainer" containerID="de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440" Mar 19 17:09:16 crc kubenswrapper[4918]: E0319 17:09:16.588455 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:09:20 crc kubenswrapper[4918]: I0319 17:09:20.045514 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-bp9zk"] Mar 19 17:09:20 crc kubenswrapper[4918]: I0319 17:09:20.061768 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-eb42-account-create-update-h2lqq"] Mar 19 17:09:20 crc kubenswrapper[4918]: I0319 17:09:20.076314 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4694-account-create-update-chqcv"] 
Mar 19 17:09:20 crc kubenswrapper[4918]: I0319 17:09:20.100665 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-bp9zk"] Mar 19 17:09:20 crc kubenswrapper[4918]: I0319 17:09:20.125946 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-eb42-account-create-update-h2lqq"] Mar 19 17:09:20 crc kubenswrapper[4918]: I0319 17:09:20.136290 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-4694-account-create-update-chqcv"] Mar 19 17:09:20 crc kubenswrapper[4918]: I0319 17:09:20.145538 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-jmqtm"] Mar 19 17:09:20 crc kubenswrapper[4918]: I0319 17:09:20.154155 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-jmqtm"] Mar 19 17:09:20 crc kubenswrapper[4918]: I0319 17:09:20.599997 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ddd71c6-2966-406d-af9e-122263ed9610" path="/var/lib/kubelet/pods/0ddd71c6-2966-406d-af9e-122263ed9610/volumes" Mar 19 17:09:20 crc kubenswrapper[4918]: I0319 17:09:20.601582 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a" path="/var/lib/kubelet/pods/1b9e1a53-1fa7-4d1b-9c1b-91e8b44bf30a/volumes" Mar 19 17:09:20 crc kubenswrapper[4918]: I0319 17:09:20.603297 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c38a283-f108-443b-a845-d378075a9881" path="/var/lib/kubelet/pods/9c38a283-f108-443b-a845-d378075a9881/volumes" Mar 19 17:09:20 crc kubenswrapper[4918]: I0319 17:09:20.604630 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e414f276-f48b-4efa-a0d5-c3bccaf6eb54" path="/var/lib/kubelet/pods/e414f276-f48b-4efa-a0d5-c3bccaf6eb54/volumes" Mar 19 17:09:28 crc kubenswrapper[4918]: I0319 17:09:28.594522 4918 scope.go:117] "RemoveContainer" 
containerID="de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440" Mar 19 17:09:28 crc kubenswrapper[4918]: E0319 17:09:28.595425 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:09:36 crc kubenswrapper[4918]: I0319 17:09:36.047359 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bc7bt"] Mar 19 17:09:36 crc kubenswrapper[4918]: I0319 17:09:36.059666 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-bc7bt"] Mar 19 17:09:36 crc kubenswrapper[4918]: I0319 17:09:36.597640 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9377958-db94-4c7d-bc29-37ca8135ba07" path="/var/lib/kubelet/pods/f9377958-db94-4c7d-bc29-37ca8135ba07/volumes" Mar 19 17:09:41 crc kubenswrapper[4918]: I0319 17:09:41.586269 4918 scope.go:117] "RemoveContainer" containerID="de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440" Mar 19 17:09:41 crc kubenswrapper[4918]: E0319 17:09:41.587236 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:09:45 crc kubenswrapper[4918]: I0319 17:09:45.035793 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-zk5wz"] Mar 19 
17:09:45 crc kubenswrapper[4918]: I0319 17:09:45.047829 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-5103-account-create-update-t64cp"] Mar 19 17:09:45 crc kubenswrapper[4918]: I0319 17:09:45.059556 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-zk5wz"] Mar 19 17:09:45 crc kubenswrapper[4918]: I0319 17:09:45.069031 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-5103-account-create-update-t64cp"] Mar 19 17:09:45 crc kubenswrapper[4918]: I0319 17:09:45.078847 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-e0e3-account-create-update-gpntj"] Mar 19 17:09:45 crc kubenswrapper[4918]: I0319 17:09:45.092989 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-e0e3-account-create-update-gpntj"] Mar 19 17:09:46 crc kubenswrapper[4918]: I0319 17:09:46.599388 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="543581b9-3f29-4718-bf4b-a4eaa3fb4b39" path="/var/lib/kubelet/pods/543581b9-3f29-4718-bf4b-a4eaa3fb4b39/volumes" Mar 19 17:09:46 crc kubenswrapper[4918]: I0319 17:09:46.600592 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad8cc411-f838-439a-9993-e53b431dcd28" path="/var/lib/kubelet/pods/ad8cc411-f838-439a-9993-e53b431dcd28/volumes" Mar 19 17:09:46 crc kubenswrapper[4918]: I0319 17:09:46.601303 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9ab1e31-bea9-4b23-899c-0b818c121f65" path="/var/lib/kubelet/pods/e9ab1e31-bea9-4b23-899c-0b818c121f65/volumes" Mar 19 17:09:49 crc kubenswrapper[4918]: I0319 17:09:49.034536 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-da83-account-create-update-xc7wt"] Mar 19 17:09:49 crc kubenswrapper[4918]: I0319 17:09:49.046967 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ea0b-account-create-update-l66jx"] Mar 19 17:09:49 crc 
kubenswrapper[4918]: I0319 17:09:49.059184 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-da83-account-create-update-xc7wt"] Mar 19 17:09:49 crc kubenswrapper[4918]: I0319 17:09:49.072469 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-6x9dg"] Mar 19 17:09:49 crc kubenswrapper[4918]: I0319 17:09:49.086339 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-ea0b-account-create-update-l66jx"] Mar 19 17:09:49 crc kubenswrapper[4918]: I0319 17:09:49.096103 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-create-g2mz7"] Mar 19 17:09:49 crc kubenswrapper[4918]: I0319 17:09:49.105148 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-6x9dg"] Mar 19 17:09:49 crc kubenswrapper[4918]: I0319 17:09:49.114906 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-create-g2mz7"] Mar 19 17:09:49 crc kubenswrapper[4918]: I0319 17:09:49.126685 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-54nhz"] Mar 19 17:09:49 crc kubenswrapper[4918]: I0319 17:09:49.137395 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-v6c2z"] Mar 19 17:09:49 crc kubenswrapper[4918]: I0319 17:09:49.146509 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-54nhz"] Mar 19 17:09:49 crc kubenswrapper[4918]: I0319 17:09:49.154914 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-v6c2z"] Mar 19 17:09:50 crc kubenswrapper[4918]: I0319 17:09:50.603094 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f77a167-549a-441f-b185-977ccb2195ab" path="/var/lib/kubelet/pods/0f77a167-549a-441f-b185-977ccb2195ab/volumes" Mar 19 17:09:50 crc kubenswrapper[4918]: I0319 17:09:50.603725 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="86c8c8a7-6bd3-4169-88ac-9a5838c526c2" path="/var/lib/kubelet/pods/86c8c8a7-6bd3-4169-88ac-9a5838c526c2/volumes" Mar 19 17:09:50 crc kubenswrapper[4918]: I0319 17:09:50.604578 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc559952-1f04-4a21-8415-c9c613c5b4d4" path="/var/lib/kubelet/pods/bc559952-1f04-4a21-8415-c9c613c5b4d4/volumes" Mar 19 17:09:50 crc kubenswrapper[4918]: I0319 17:09:50.605276 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c909d8df-c118-4ff6-8c06-c0d3f71be4cf" path="/var/lib/kubelet/pods/c909d8df-c118-4ff6-8c06-c0d3f71be4cf/volumes" Mar 19 17:09:50 crc kubenswrapper[4918]: I0319 17:09:50.605883 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dadbc2dd-325c-4390-9ea6-bc827cec049d" path="/var/lib/kubelet/pods/dadbc2dd-325c-4390-9ea6-bc827cec049d/volumes" Mar 19 17:09:50 crc kubenswrapper[4918]: I0319 17:09:50.606900 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda7a707-ab32-455f-8d42-bd371c95e9d2" path="/var/lib/kubelet/pods/fda7a707-ab32-455f-8d42-bd371c95e9d2/volumes" Mar 19 17:09:52 crc kubenswrapper[4918]: I0319 17:09:52.586415 4918 scope.go:117] "RemoveContainer" containerID="de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440" Mar 19 17:09:52 crc kubenswrapper[4918]: E0319 17:09:52.586940 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:09:55 crc kubenswrapper[4918]: I0319 17:09:55.049753 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-6cdsd"] Mar 19 17:09:55 crc kubenswrapper[4918]: I0319 
17:09:55.060331 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-6cdsd"] Mar 19 17:09:56 crc kubenswrapper[4918]: I0319 17:09:56.608141 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70226cc0-a6ae-4454-8e20-f85b06e2ee2d" path="/var/lib/kubelet/pods/70226cc0-a6ae-4454-8e20-f85b06e2ee2d/volumes" Mar 19 17:10:00 crc kubenswrapper[4918]: I0319 17:10:00.142578 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565670-lx56g"] Mar 19 17:10:00 crc kubenswrapper[4918]: I0319 17:10:00.145342 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565670-lx56g" Mar 19 17:10:00 crc kubenswrapper[4918]: I0319 17:10:00.148401 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:10:00 crc kubenswrapper[4918]: I0319 17:10:00.148790 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:10:00 crc kubenswrapper[4918]: I0319 17:10:00.149178 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:10:00 crc kubenswrapper[4918]: I0319 17:10:00.155164 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565670-lx56g"] Mar 19 17:10:00 crc kubenswrapper[4918]: I0319 17:10:00.166223 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqmqp\" (UniqueName: \"kubernetes.io/projected/acd12134-68b7-44a4-aa04-33af054e4f4f-kube-api-access-jqmqp\") pod \"auto-csr-approver-29565670-lx56g\" (UID: \"acd12134-68b7-44a4-aa04-33af054e4f4f\") " pod="openshift-infra/auto-csr-approver-29565670-lx56g" Mar 19 17:10:00 crc kubenswrapper[4918]: I0319 17:10:00.268952 4918 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-jqmqp\" (UniqueName: \"kubernetes.io/projected/acd12134-68b7-44a4-aa04-33af054e4f4f-kube-api-access-jqmqp\") pod \"auto-csr-approver-29565670-lx56g\" (UID: \"acd12134-68b7-44a4-aa04-33af054e4f4f\") " pod="openshift-infra/auto-csr-approver-29565670-lx56g" Mar 19 17:10:00 crc kubenswrapper[4918]: I0319 17:10:00.290084 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqmqp\" (UniqueName: \"kubernetes.io/projected/acd12134-68b7-44a4-aa04-33af054e4f4f-kube-api-access-jqmqp\") pod \"auto-csr-approver-29565670-lx56g\" (UID: \"acd12134-68b7-44a4-aa04-33af054e4f4f\") " pod="openshift-infra/auto-csr-approver-29565670-lx56g" Mar 19 17:10:00 crc kubenswrapper[4918]: I0319 17:10:00.465579 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565670-lx56g" Mar 19 17:10:01 crc kubenswrapper[4918]: I0319 17:10:01.161595 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565670-lx56g"] Mar 19 17:10:01 crc kubenswrapper[4918]: I0319 17:10:01.331077 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565670-lx56g" event={"ID":"acd12134-68b7-44a4-aa04-33af054e4f4f","Type":"ContainerStarted","Data":"cfb5b72d573b6761b7e632ac63821a9b44dea00b2210d226b26c4ae9dad1fd49"} Mar 19 17:10:02 crc kubenswrapper[4918]: I0319 17:10:02.630487 4918 scope.go:117] "RemoveContainer" containerID="4a48933e0e27353fa09de1f9da9ded8f0beab69610b686d40595fe0cd849630c" Mar 19 17:10:02 crc kubenswrapper[4918]: I0319 17:10:02.665593 4918 scope.go:117] "RemoveContainer" containerID="704b834d69a79dd7b2c6a1926510ceaca88d7bbf10e95f1b61e7a4c2f69b7589" Mar 19 17:10:02 crc kubenswrapper[4918]: I0319 17:10:02.749741 4918 scope.go:117] "RemoveContainer" containerID="71aa12cca7c8e17912d185902652440b8bc41f140ed7f31fd63116e5f0792eb0" Mar 19 17:10:02 crc kubenswrapper[4918]: I0319 17:10:02.793500 4918 
scope.go:117] "RemoveContainer" containerID="7e49afe0d15d5eb780358ee1a2d83875cd9954654a2764c68154e5ed5992076e" Mar 19 17:10:02 crc kubenswrapper[4918]: I0319 17:10:02.857165 4918 scope.go:117] "RemoveContainer" containerID="0f2ac0a926325347309616ffa21ea87676ff2f84cf06236ad8a38659f01962f7" Mar 19 17:10:02 crc kubenswrapper[4918]: I0319 17:10:02.919809 4918 scope.go:117] "RemoveContainer" containerID="71defd4c48e052b1a32122ae7d93eb7f3040b1baa7a1668e40c6739d004d6dcc" Mar 19 17:10:02 crc kubenswrapper[4918]: I0319 17:10:02.949172 4918 scope.go:117] "RemoveContainer" containerID="32cdd41f81602138845c90016f85b37554af18208fd8596a57f201cfd38bddd8" Mar 19 17:10:02 crc kubenswrapper[4918]: I0319 17:10:02.972120 4918 scope.go:117] "RemoveContainer" containerID="0e116879497271fbf86f1c3d662d1b8e1ea280943fe7c54b7ecb303deb4589cc" Mar 19 17:10:02 crc kubenswrapper[4918]: I0319 17:10:02.994167 4918 scope.go:117] "RemoveContainer" containerID="692d053ee950319107e100f82a93633073c285df427d10ad3d062c7e8cb668b6" Mar 19 17:10:03 crc kubenswrapper[4918]: I0319 17:10:03.015771 4918 scope.go:117] "RemoveContainer" containerID="8f6340081e4f0336fe36ff9d5be0e865325ece97c27d83ac5c359ccf718d70cd" Mar 19 17:10:03 crc kubenswrapper[4918]: I0319 17:10:03.039651 4918 scope.go:117] "RemoveContainer" containerID="97a21f9941aa6f2460a365553ecdf95b72e8ad797cc1f1b82c7872361bdd136c" Mar 19 17:10:03 crc kubenswrapper[4918]: I0319 17:10:03.063734 4918 scope.go:117] "RemoveContainer" containerID="e3a6db7b1c305254d513e876d3a9924e7b884647e2f7f3abc7963f0a5e9d2709" Mar 19 17:10:03 crc kubenswrapper[4918]: I0319 17:10:03.092612 4918 scope.go:117] "RemoveContainer" containerID="85c9d4b2b89e7d30dcb5701c311a1b8e2e9c90ade226b376f9ee69116c2f9711" Mar 19 17:10:03 crc kubenswrapper[4918]: I0319 17:10:03.129900 4918 scope.go:117] "RemoveContainer" containerID="8a2ea17b5d7a30f6cd1c011eb32615c38a924371449beec108b262bfdbbc43d5" Mar 19 17:10:03 crc kubenswrapper[4918]: I0319 17:10:03.188385 4918 scope.go:117] 
"RemoveContainer" containerID="4a68e2045d7ffc6b18fe5c2a035a1e8fb46086735192b921ce34ba35bbd807ed" Mar 19 17:10:03 crc kubenswrapper[4918]: I0319 17:10:03.217360 4918 scope.go:117] "RemoveContainer" containerID="446e68559bf5da44522ac487d278a52433ae0ceb83d617b69fd8206585f93b09" Mar 19 17:10:03 crc kubenswrapper[4918]: I0319 17:10:03.261154 4918 scope.go:117] "RemoveContainer" containerID="d45b235408e96a70149dddbfea3d61222f32c14de4bb00da8bf32715a980d435" Mar 19 17:10:04 crc kubenswrapper[4918]: I0319 17:10:04.409769 4918 generic.go:334] "Generic (PLEG): container finished" podID="acd12134-68b7-44a4-aa04-33af054e4f4f" containerID="9b8265d9e7a37ca5f1baa12eb33fcd448229e149c9de8015409b9f4932627c3a" exitCode=0 Mar 19 17:10:04 crc kubenswrapper[4918]: I0319 17:10:04.410035 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565670-lx56g" event={"ID":"acd12134-68b7-44a4-aa04-33af054e4f4f","Type":"ContainerDied","Data":"9b8265d9e7a37ca5f1baa12eb33fcd448229e149c9de8015409b9f4932627c3a"} Mar 19 17:10:06 crc kubenswrapper[4918]: I0319 17:10:06.315541 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565670-lx56g" Mar 19 17:10:06 crc kubenswrapper[4918]: I0319 17:10:06.390844 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqmqp\" (UniqueName: \"kubernetes.io/projected/acd12134-68b7-44a4-aa04-33af054e4f4f-kube-api-access-jqmqp\") pod \"acd12134-68b7-44a4-aa04-33af054e4f4f\" (UID: \"acd12134-68b7-44a4-aa04-33af054e4f4f\") " Mar 19 17:10:06 crc kubenswrapper[4918]: I0319 17:10:06.397130 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acd12134-68b7-44a4-aa04-33af054e4f4f-kube-api-access-jqmqp" (OuterVolumeSpecName: "kube-api-access-jqmqp") pod "acd12134-68b7-44a4-aa04-33af054e4f4f" (UID: "acd12134-68b7-44a4-aa04-33af054e4f4f"). 
InnerVolumeSpecName "kube-api-access-jqmqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:10:06 crc kubenswrapper[4918]: I0319 17:10:06.430735 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565670-lx56g" event={"ID":"acd12134-68b7-44a4-aa04-33af054e4f4f","Type":"ContainerDied","Data":"cfb5b72d573b6761b7e632ac63821a9b44dea00b2210d226b26c4ae9dad1fd49"} Mar 19 17:10:06 crc kubenswrapper[4918]: I0319 17:10:06.430774 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfb5b72d573b6761b7e632ac63821a9b44dea00b2210d226b26c4ae9dad1fd49" Mar 19 17:10:06 crc kubenswrapper[4918]: I0319 17:10:06.430823 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565670-lx56g" Mar 19 17:10:06 crc kubenswrapper[4918]: I0319 17:10:06.493735 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqmqp\" (UniqueName: \"kubernetes.io/projected/acd12134-68b7-44a4-aa04-33af054e4f4f-kube-api-access-jqmqp\") on node \"crc\" DevicePath \"\"" Mar 19 17:10:06 crc kubenswrapper[4918]: I0319 17:10:06.587313 4918 scope.go:117] "RemoveContainer" containerID="de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440" Mar 19 17:10:06 crc kubenswrapper[4918]: E0319 17:10:06.587754 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:10:07 crc kubenswrapper[4918]: I0319 17:10:07.392220 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565664-rrn6b"] Mar 19 17:10:07 crc 
kubenswrapper[4918]: I0319 17:10:07.403148 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565664-rrn6b"] Mar 19 17:10:08 crc kubenswrapper[4918]: I0319 17:10:08.596706 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db2a848b-d81b-467c-b100-159fc77dd610" path="/var/lib/kubelet/pods/db2a848b-d81b-467c-b100-159fc77dd610/volumes" Mar 19 17:10:18 crc kubenswrapper[4918]: I0319 17:10:18.592696 4918 scope.go:117] "RemoveContainer" containerID="de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440" Mar 19 17:10:18 crc kubenswrapper[4918]: E0319 17:10:18.593549 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:10:25 crc kubenswrapper[4918]: I0319 17:10:25.046252 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-s9rgt"] Mar 19 17:10:25 crc kubenswrapper[4918]: I0319 17:10:25.058993 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-s9rgt"] Mar 19 17:10:26 crc kubenswrapper[4918]: I0319 17:10:26.598426 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb250eb-e1c3-4a48-bf07-8cf4504466fb" path="/var/lib/kubelet/pods/3cb250eb-e1c3-4a48-bf07-8cf4504466fb/volumes" Mar 19 17:10:30 crc kubenswrapper[4918]: I0319 17:10:30.587208 4918 scope.go:117] "RemoveContainer" containerID="de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440" Mar 19 17:10:30 crc kubenswrapper[4918]: E0319 17:10:30.587999 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:10:39 crc kubenswrapper[4918]: I0319 17:10:39.025235 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xbp92"] Mar 19 17:10:39 crc kubenswrapper[4918]: I0319 17:10:39.037128 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xbp92"] Mar 19 17:10:39 crc kubenswrapper[4918]: I0319 17:10:39.045985 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-99gbh"] Mar 19 17:10:39 crc kubenswrapper[4918]: I0319 17:10:39.054889 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-99gbh"] Mar 19 17:10:40 crc kubenswrapper[4918]: I0319 17:10:40.599433 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="044c141c-5c54-4e8c-a592-497a22f6f4db" path="/var/lib/kubelet/pods/044c141c-5c54-4e8c-a592-497a22f6f4db/volumes" Mar 19 17:10:40 crc kubenswrapper[4918]: I0319 17:10:40.601376 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e24181b4-a2be-4ea7-9602-e3e16b8862c1" path="/var/lib/kubelet/pods/e24181b4-a2be-4ea7-9602-e3e16b8862c1/volumes" Mar 19 17:10:45 crc kubenswrapper[4918]: I0319 17:10:45.586924 4918 scope.go:117] "RemoveContainer" containerID="de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440" Mar 19 17:10:45 crc kubenswrapper[4918]: E0319 17:10:45.587846 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:10:53 crc kubenswrapper[4918]: I0319 17:10:53.053014 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-wtk47"] Mar 19 17:10:53 crc kubenswrapper[4918]: I0319 17:10:53.069303 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-wtk47"] Mar 19 17:10:54 crc kubenswrapper[4918]: I0319 17:10:54.031370 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-b5btd"] Mar 19 17:10:54 crc kubenswrapper[4918]: I0319 17:10:54.042027 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-b5btd"] Mar 19 17:10:54 crc kubenswrapper[4918]: I0319 17:10:54.599554 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33152bb1-e526-420f-8dec-7ef80c68b47c" path="/var/lib/kubelet/pods/33152bb1-e526-420f-8dec-7ef80c68b47c/volumes" Mar 19 17:10:54 crc kubenswrapper[4918]: I0319 17:10:54.600261 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec2f9e01-6e64-4c5d-93d4-8428ae776a4e" path="/var/lib/kubelet/pods/ec2f9e01-6e64-4c5d-93d4-8428ae776a4e/volumes" Mar 19 17:10:59 crc kubenswrapper[4918]: I0319 17:10:59.916824 4918 generic.go:334] "Generic (PLEG): container finished" podID="04af1485-802e-4821-a499-683301ee97ff" containerID="d0aa7f8a24dcfee4d290278121a9dc936d984e2cc462f9ae4c74ea2041626f30" exitCode=0 Mar 19 17:10:59 crc kubenswrapper[4918]: I0319 17:10:59.916999 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-szqtt" event={"ID":"04af1485-802e-4821-a499-683301ee97ff","Type":"ContainerDied","Data":"d0aa7f8a24dcfee4d290278121a9dc936d984e2cc462f9ae4c74ea2041626f30"} Mar 19 17:11:00 crc kubenswrapper[4918]: I0319 
17:11:00.586922 4918 scope.go:117] "RemoveContainer" containerID="de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440" Mar 19 17:11:00 crc kubenswrapper[4918]: I0319 17:11:00.927509 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerStarted","Data":"fa72b2dda22c6725c1ba0de7800b932e612ef8adf06d0be5f45d4b0d25a364f6"} Mar 19 17:11:02 crc kubenswrapper[4918]: I0319 17:11:02.085828 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-szqtt" Mar 19 17:11:02 crc kubenswrapper[4918]: I0319 17:11:02.189036 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4sgr\" (UniqueName: \"kubernetes.io/projected/04af1485-802e-4821-a499-683301ee97ff-kube-api-access-m4sgr\") pod \"04af1485-802e-4821-a499-683301ee97ff\" (UID: \"04af1485-802e-4821-a499-683301ee97ff\") " Mar 19 17:11:02 crc kubenswrapper[4918]: I0319 17:11:02.190233 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04af1485-802e-4821-a499-683301ee97ff-inventory\") pod \"04af1485-802e-4821-a499-683301ee97ff\" (UID: \"04af1485-802e-4821-a499-683301ee97ff\") " Mar 19 17:11:02 crc kubenswrapper[4918]: I0319 17:11:02.190764 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04af1485-802e-4821-a499-683301ee97ff-ssh-key-openstack-edpm-ipam\") pod \"04af1485-802e-4821-a499-683301ee97ff\" (UID: \"04af1485-802e-4821-a499-683301ee97ff\") " Mar 19 17:11:02 crc kubenswrapper[4918]: I0319 17:11:02.198007 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04af1485-802e-4821-a499-683301ee97ff-kube-api-access-m4sgr" 
(OuterVolumeSpecName: "kube-api-access-m4sgr") pod "04af1485-802e-4821-a499-683301ee97ff" (UID: "04af1485-802e-4821-a499-683301ee97ff"). InnerVolumeSpecName "kube-api-access-m4sgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:11:02 crc kubenswrapper[4918]: I0319 17:11:02.224271 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04af1485-802e-4821-a499-683301ee97ff-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "04af1485-802e-4821-a499-683301ee97ff" (UID: "04af1485-802e-4821-a499-683301ee97ff"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:11:02 crc kubenswrapper[4918]: I0319 17:11:02.232782 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04af1485-802e-4821-a499-683301ee97ff-inventory" (OuterVolumeSpecName: "inventory") pod "04af1485-802e-4821-a499-683301ee97ff" (UID: "04af1485-802e-4821-a499-683301ee97ff"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:11:02 crc kubenswrapper[4918]: I0319 17:11:02.293606 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4sgr\" (UniqueName: \"kubernetes.io/projected/04af1485-802e-4821-a499-683301ee97ff-kube-api-access-m4sgr\") on node \"crc\" DevicePath \"\"" Mar 19 17:11:02 crc kubenswrapper[4918]: I0319 17:11:02.293650 4918 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04af1485-802e-4821-a499-683301ee97ff-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 17:11:02 crc kubenswrapper[4918]: I0319 17:11:02.293665 4918 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04af1485-802e-4821-a499-683301ee97ff-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 17:11:02 crc kubenswrapper[4918]: I0319 17:11:02.947125 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-szqtt" event={"ID":"04af1485-802e-4821-a499-683301ee97ff","Type":"ContainerDied","Data":"01469e9c951ce38cfced209937f67bc1c36f80f816cdc4ae86973e9501f63dc2"} Mar 19 17:11:02 crc kubenswrapper[4918]: I0319 17:11:02.947434 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01469e9c951ce38cfced209937f67bc1c36f80f816cdc4ae86973e9501f63dc2" Mar 19 17:11:02 crc kubenswrapper[4918]: I0319 17:11:02.947221 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-szqtt" Mar 19 17:11:03 crc kubenswrapper[4918]: I0319 17:11:03.186066 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h"] Mar 19 17:11:03 crc kubenswrapper[4918]: E0319 17:11:03.186663 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acd12134-68b7-44a4-aa04-33af054e4f4f" containerName="oc" Mar 19 17:11:03 crc kubenswrapper[4918]: I0319 17:11:03.186703 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="acd12134-68b7-44a4-aa04-33af054e4f4f" containerName="oc" Mar 19 17:11:03 crc kubenswrapper[4918]: E0319 17:11:03.186725 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04af1485-802e-4821-a499-683301ee97ff" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 19 17:11:03 crc kubenswrapper[4918]: I0319 17:11:03.186735 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="04af1485-802e-4821-a499-683301ee97ff" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 19 17:11:03 crc kubenswrapper[4918]: I0319 17:11:03.187203 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="acd12134-68b7-44a4-aa04-33af054e4f4f" containerName="oc" Mar 19 17:11:03 crc kubenswrapper[4918]: I0319 17:11:03.187219 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="04af1485-802e-4821-a499-683301ee97ff" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 19 17:11:03 crc kubenswrapper[4918]: I0319 17:11:03.188245 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h" Mar 19 17:11:03 crc kubenswrapper[4918]: I0319 17:11:03.190016 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 17:11:03 crc kubenswrapper[4918]: I0319 17:11:03.190350 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 17:11:03 crc kubenswrapper[4918]: I0319 17:11:03.193567 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 17:11:03 crc kubenswrapper[4918]: I0319 17:11:03.195398 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4jldg" Mar 19 17:11:03 crc kubenswrapper[4918]: I0319 17:11:03.199702 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h"] Mar 19 17:11:03 crc kubenswrapper[4918]: I0319 17:11:03.212795 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs87c\" (UniqueName: \"kubernetes.io/projected/82bf14ac-828e-41e8-987c-bb83598d73a5-kube-api-access-rs87c\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h\" (UID: \"82bf14ac-828e-41e8-987c-bb83598d73a5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h" Mar 19 17:11:03 crc kubenswrapper[4918]: I0319 17:11:03.213160 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82bf14ac-828e-41e8-987c-bb83598d73a5-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h\" (UID: \"82bf14ac-828e-41e8-987c-bb83598d73a5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h" Mar 19 17:11:03 crc kubenswrapper[4918]: 
I0319 17:11:03.213341 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/82bf14ac-828e-41e8-987c-bb83598d73a5-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h\" (UID: \"82bf14ac-828e-41e8-987c-bb83598d73a5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h" Mar 19 17:11:03 crc kubenswrapper[4918]: I0319 17:11:03.314221 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82bf14ac-828e-41e8-987c-bb83598d73a5-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h\" (UID: \"82bf14ac-828e-41e8-987c-bb83598d73a5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h" Mar 19 17:11:03 crc kubenswrapper[4918]: I0319 17:11:03.314322 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/82bf14ac-828e-41e8-987c-bb83598d73a5-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h\" (UID: \"82bf14ac-828e-41e8-987c-bb83598d73a5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h" Mar 19 17:11:03 crc kubenswrapper[4918]: I0319 17:11:03.314406 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs87c\" (UniqueName: \"kubernetes.io/projected/82bf14ac-828e-41e8-987c-bb83598d73a5-kube-api-access-rs87c\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h\" (UID: \"82bf14ac-828e-41e8-987c-bb83598d73a5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h" Mar 19 17:11:03 crc kubenswrapper[4918]: I0319 17:11:03.326249 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/secret/82bf14ac-828e-41e8-987c-bb83598d73a5-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h\" (UID: \"82bf14ac-828e-41e8-987c-bb83598d73a5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h" Mar 19 17:11:03 crc kubenswrapper[4918]: I0319 17:11:03.328095 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82bf14ac-828e-41e8-987c-bb83598d73a5-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h\" (UID: \"82bf14ac-828e-41e8-987c-bb83598d73a5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h" Mar 19 17:11:03 crc kubenswrapper[4918]: I0319 17:11:03.332255 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs87c\" (UniqueName: \"kubernetes.io/projected/82bf14ac-828e-41e8-987c-bb83598d73a5-kube-api-access-rs87c\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h\" (UID: \"82bf14ac-828e-41e8-987c-bb83598d73a5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h" Mar 19 17:11:03 crc kubenswrapper[4918]: I0319 17:11:03.506272 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h" Mar 19 17:11:03 crc kubenswrapper[4918]: I0319 17:11:03.651357 4918 scope.go:117] "RemoveContainer" containerID="045157a57de8912063aed8fd4ca2742d4129523126e13f84e24127942f93f280" Mar 19 17:11:03 crc kubenswrapper[4918]: I0319 17:11:03.692803 4918 scope.go:117] "RemoveContainer" containerID="a29059ded9ff42bd4ad3193cd68efe1dcc3b1c77c2a12e1f44bb92c3748cc7e1" Mar 19 17:11:03 crc kubenswrapper[4918]: I0319 17:11:03.770367 4918 scope.go:117] "RemoveContainer" containerID="3adb850ada405a057d34bf29d1f2e7cadee6568d3b1228664b17581b178fdf5a" Mar 19 17:11:03 crc kubenswrapper[4918]: I0319 17:11:03.800109 4918 scope.go:117] "RemoveContainer" containerID="a6d303f1b98c98fb5b59b7b20965447a9d940333ef1b4f7fa839403f531657c6" Mar 19 17:11:03 crc kubenswrapper[4918]: I0319 17:11:03.853947 4918 scope.go:117] "RemoveContainer" containerID="a5609c1803193b430e36071ecf6281a611ec66b983c0fdc16c41cd396d0ef3b0" Mar 19 17:11:03 crc kubenswrapper[4918]: I0319 17:11:03.889563 4918 scope.go:117] "RemoveContainer" containerID="7093aa8e10a56c9917db6b81465c1586af7d9d5bdaac5bb85d7c30092080d813" Mar 19 17:11:04 crc kubenswrapper[4918]: W0319 17:11:04.326971 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82bf14ac_828e_41e8_987c_bb83598d73a5.slice/crio-f6fddfdb4910c8eb9bdacf6414eb16fc4378d8d29ea6372889e228519f492a5a WatchSource:0}: Error finding container f6fddfdb4910c8eb9bdacf6414eb16fc4378d8d29ea6372889e228519f492a5a: Status 404 returned error can't find the container with id f6fddfdb4910c8eb9bdacf6414eb16fc4378d8d29ea6372889e228519f492a5a Mar 19 17:11:04 crc kubenswrapper[4918]: I0319 17:11:04.339766 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h"] Mar 19 17:11:04 crc kubenswrapper[4918]: I0319 17:11:04.970968 4918 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h" event={"ID":"82bf14ac-828e-41e8-987c-bb83598d73a5","Type":"ContainerStarted","Data":"f6fddfdb4910c8eb9bdacf6414eb16fc4378d8d29ea6372889e228519f492a5a"} Mar 19 17:11:05 crc kubenswrapper[4918]: I0319 17:11:05.981737 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h" event={"ID":"82bf14ac-828e-41e8-987c-bb83598d73a5","Type":"ContainerStarted","Data":"f33219ab1857192bda3d9b0c5cfbfd70b25685e4c4ff0d245e27022557726ecc"} Mar 19 17:11:06 crc kubenswrapper[4918]: I0319 17:11:06.007878 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h" podStartSLOduration=2.309301578 podStartE2EDuration="3.007863261s" podCreationTimestamp="2026-03-19 17:11:03 +0000 UTC" firstStartedPulling="2026-03-19 17:11:04.331786053 +0000 UTC m=+1876.453985301" lastFinishedPulling="2026-03-19 17:11:05.030347726 +0000 UTC m=+1877.152546984" observedRunningTime="2026-03-19 17:11:06.001256769 +0000 UTC m=+1878.123456017" watchObservedRunningTime="2026-03-19 17:11:06.007863261 +0000 UTC m=+1878.130062509" Mar 19 17:11:40 crc kubenswrapper[4918]: I0319 17:11:40.055259 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-1eef-account-create-update-mz92v"] Mar 19 17:11:40 crc kubenswrapper[4918]: I0319 17:11:40.065340 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-6z7jz"] Mar 19 17:11:40 crc kubenswrapper[4918]: I0319 17:11:40.079296 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-10c1-account-create-update-qpxpb"] Mar 19 17:11:40 crc kubenswrapper[4918]: I0319 17:11:40.087657 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-s8jqb"] Mar 19 17:11:40 crc kubenswrapper[4918]: I0319 
17:11:40.095845 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-1eef-account-create-update-mz92v"] Mar 19 17:11:40 crc kubenswrapper[4918]: I0319 17:11:40.103404 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-fbfd-account-create-update-cvxhj"] Mar 19 17:11:40 crc kubenswrapper[4918]: I0319 17:11:40.111663 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-w4vzm"] Mar 19 17:11:40 crc kubenswrapper[4918]: I0319 17:11:40.120063 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-s8jqb"] Mar 19 17:11:40 crc kubenswrapper[4918]: I0319 17:11:40.130478 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-6z7jz"] Mar 19 17:11:40 crc kubenswrapper[4918]: I0319 17:11:40.139456 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-w4vzm"] Mar 19 17:11:40 crc kubenswrapper[4918]: I0319 17:11:40.147396 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-fbfd-account-create-update-cvxhj"] Mar 19 17:11:40 crc kubenswrapper[4918]: I0319 17:11:40.155187 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-10c1-account-create-update-qpxpb"] Mar 19 17:11:40 crc kubenswrapper[4918]: I0319 17:11:40.598780 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00939ae6-93f7-437d-904c-53eaf4c4fc52" path="/var/lib/kubelet/pods/00939ae6-93f7-437d-904c-53eaf4c4fc52/volumes" Mar 19 17:11:40 crc kubenswrapper[4918]: I0319 17:11:40.599805 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b2236d3-b352-4383-b099-0a1b39bdf222" path="/var/lib/kubelet/pods/4b2236d3-b352-4383-b099-0a1b39bdf222/volumes" Mar 19 17:11:40 crc kubenswrapper[4918]: I0319 17:11:40.600472 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c57e4da-677f-401e-b718-b25a5678a352" 
path="/var/lib/kubelet/pods/7c57e4da-677f-401e-b718-b25a5678a352/volumes" Mar 19 17:11:40 crc kubenswrapper[4918]: I0319 17:11:40.601365 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ceabd21-35af-489d-abc9-b4e8b629efd9" path="/var/lib/kubelet/pods/8ceabd21-35af-489d-abc9-b4e8b629efd9/volumes" Mar 19 17:11:40 crc kubenswrapper[4918]: I0319 17:11:40.602640 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="928c9b0d-6f27-4359-8b86-794d73ea9cd5" path="/var/lib/kubelet/pods/928c9b0d-6f27-4359-8b86-794d73ea9cd5/volumes" Mar 19 17:11:40 crc kubenswrapper[4918]: I0319 17:11:40.603299 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d53b74e8-6505-42fc-bbb6-9f9f2b96f747" path="/var/lib/kubelet/pods/d53b74e8-6505-42fc-bbb6-9f9f2b96f747/volumes" Mar 19 17:12:00 crc kubenswrapper[4918]: I0319 17:12:00.149869 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565672-r8jfj"] Mar 19 17:12:00 crc kubenswrapper[4918]: I0319 17:12:00.152394 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565672-r8jfj" Mar 19 17:12:00 crc kubenswrapper[4918]: I0319 17:12:00.155191 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:12:00 crc kubenswrapper[4918]: I0319 17:12:00.155306 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:12:00 crc kubenswrapper[4918]: I0319 17:12:00.155460 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:12:00 crc kubenswrapper[4918]: I0319 17:12:00.171404 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565672-r8jfj"] Mar 19 17:12:00 crc kubenswrapper[4918]: I0319 17:12:00.243501 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mcx4\" (UniqueName: \"kubernetes.io/projected/77605f10-6a56-4633-8433-3b6078dec967-kube-api-access-6mcx4\") pod \"auto-csr-approver-29565672-r8jfj\" (UID: \"77605f10-6a56-4633-8433-3b6078dec967\") " pod="openshift-infra/auto-csr-approver-29565672-r8jfj" Mar 19 17:12:00 crc kubenswrapper[4918]: I0319 17:12:00.345541 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mcx4\" (UniqueName: \"kubernetes.io/projected/77605f10-6a56-4633-8433-3b6078dec967-kube-api-access-6mcx4\") pod \"auto-csr-approver-29565672-r8jfj\" (UID: \"77605f10-6a56-4633-8433-3b6078dec967\") " pod="openshift-infra/auto-csr-approver-29565672-r8jfj" Mar 19 17:12:00 crc kubenswrapper[4918]: I0319 17:12:00.369783 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mcx4\" (UniqueName: \"kubernetes.io/projected/77605f10-6a56-4633-8433-3b6078dec967-kube-api-access-6mcx4\") pod \"auto-csr-approver-29565672-r8jfj\" (UID: \"77605f10-6a56-4633-8433-3b6078dec967\") " 
pod="openshift-infra/auto-csr-approver-29565672-r8jfj" Mar 19 17:12:00 crc kubenswrapper[4918]: I0319 17:12:00.487073 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565672-r8jfj" Mar 19 17:12:01 crc kubenswrapper[4918]: I0319 17:12:01.027075 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565672-r8jfj"] Mar 19 17:12:01 crc kubenswrapper[4918]: I0319 17:12:01.553011 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565672-r8jfj" event={"ID":"77605f10-6a56-4633-8433-3b6078dec967","Type":"ContainerStarted","Data":"2a51b7f97fe0c6f7a44c74308ea4d7e0cf0691af59769f3676df760f355c2a0b"} Mar 19 17:12:03 crc kubenswrapper[4918]: I0319 17:12:03.575072 4918 generic.go:334] "Generic (PLEG): container finished" podID="77605f10-6a56-4633-8433-3b6078dec967" containerID="865bd646348bd4ee3b7e00bdd02223d66d401a48c7d3b69c29648039005b4e50" exitCode=0 Mar 19 17:12:03 crc kubenswrapper[4918]: I0319 17:12:03.575144 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565672-r8jfj" event={"ID":"77605f10-6a56-4633-8433-3b6078dec967","Type":"ContainerDied","Data":"865bd646348bd4ee3b7e00bdd02223d66d401a48c7d3b69c29648039005b4e50"} Mar 19 17:12:04 crc kubenswrapper[4918]: I0319 17:12:04.046126 4918 scope.go:117] "RemoveContainer" containerID="7fb0fce364b7c9a07ce7ecdf71bf4b623d657462eed1ef765c0cbbd82ce919aa" Mar 19 17:12:04 crc kubenswrapper[4918]: I0319 17:12:04.088014 4918 scope.go:117] "RemoveContainer" containerID="1c72f339924db22c3d72d056f549194307988b495fdcb9c1a06ef81b0110fb1a" Mar 19 17:12:04 crc kubenswrapper[4918]: I0319 17:12:04.133955 4918 scope.go:117] "RemoveContainer" containerID="e4771b68ab4d8353e4b84cea1cc82cbe3b9586758f22f67fe88f0bdd1cf0ec3a" Mar 19 17:12:04 crc kubenswrapper[4918]: I0319 17:12:04.172223 4918 scope.go:117] "RemoveContainer" 
containerID="51c2aee85b6d862aaae79e36e07c279709e0c6b653105aa85e70feda2a48f452" Mar 19 17:12:04 crc kubenswrapper[4918]: I0319 17:12:04.224419 4918 scope.go:117] "RemoveContainer" containerID="801ff444204497c12ffc78f2cff6e95c81375619bfcdf7dc26556893b1c6e9d7" Mar 19 17:12:04 crc kubenswrapper[4918]: I0319 17:12:04.273885 4918 scope.go:117] "RemoveContainer" containerID="a7521016cee0928b0ce0ac4ce84e5e555a57beb6493320858df45f9ff3dad306" Mar 19 17:12:05 crc kubenswrapper[4918]: I0319 17:12:05.318640 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565672-r8jfj" Mar 19 17:12:05 crc kubenswrapper[4918]: I0319 17:12:05.382986 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mcx4\" (UniqueName: \"kubernetes.io/projected/77605f10-6a56-4633-8433-3b6078dec967-kube-api-access-6mcx4\") pod \"77605f10-6a56-4633-8433-3b6078dec967\" (UID: \"77605f10-6a56-4633-8433-3b6078dec967\") " Mar 19 17:12:05 crc kubenswrapper[4918]: I0319 17:12:05.389208 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77605f10-6a56-4633-8433-3b6078dec967-kube-api-access-6mcx4" (OuterVolumeSpecName: "kube-api-access-6mcx4") pod "77605f10-6a56-4633-8433-3b6078dec967" (UID: "77605f10-6a56-4633-8433-3b6078dec967"). InnerVolumeSpecName "kube-api-access-6mcx4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:12:05 crc kubenswrapper[4918]: I0319 17:12:05.485495 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mcx4\" (UniqueName: \"kubernetes.io/projected/77605f10-6a56-4633-8433-3b6078dec967-kube-api-access-6mcx4\") on node \"crc\" DevicePath \"\"" Mar 19 17:12:05 crc kubenswrapper[4918]: I0319 17:12:05.613703 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565672-r8jfj" event={"ID":"77605f10-6a56-4633-8433-3b6078dec967","Type":"ContainerDied","Data":"2a51b7f97fe0c6f7a44c74308ea4d7e0cf0691af59769f3676df760f355c2a0b"} Mar 19 17:12:05 crc kubenswrapper[4918]: I0319 17:12:05.613796 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a51b7f97fe0c6f7a44c74308ea4d7e0cf0691af59769f3676df760f355c2a0b" Mar 19 17:12:05 crc kubenswrapper[4918]: I0319 17:12:05.613813 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565672-r8jfj" Mar 19 17:12:06 crc kubenswrapper[4918]: I0319 17:12:06.394579 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565666-rjj5f"] Mar 19 17:12:06 crc kubenswrapper[4918]: I0319 17:12:06.408671 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565666-rjj5f"] Mar 19 17:12:06 crc kubenswrapper[4918]: I0319 17:12:06.602901 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8654396-60ef-431c-94f0-6db3b3f225d5" path="/var/lib/kubelet/pods/e8654396-60ef-431c-94f0-6db3b3f225d5/volumes" Mar 19 17:12:15 crc kubenswrapper[4918]: I0319 17:12:15.706030 4918 generic.go:334] "Generic (PLEG): container finished" podID="82bf14ac-828e-41e8-987c-bb83598d73a5" containerID="f33219ab1857192bda3d9b0c5cfbfd70b25685e4c4ff0d245e27022557726ecc" exitCode=0 Mar 19 17:12:15 crc kubenswrapper[4918]: I0319 17:12:15.706124 4918 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h" event={"ID":"82bf14ac-828e-41e8-987c-bb83598d73a5","Type":"ContainerDied","Data":"f33219ab1857192bda3d9b0c5cfbfd70b25685e4c4ff0d245e27022557726ecc"} Mar 19 17:12:16 crc kubenswrapper[4918]: I0319 17:12:16.033238 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6x2b9"] Mar 19 17:12:16 crc kubenswrapper[4918]: I0319 17:12:16.045879 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6x2b9"] Mar 19 17:12:16 crc kubenswrapper[4918]: I0319 17:12:16.613821 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64832abb-ee37-4a90-9fae-8eff52ff08e2" path="/var/lib/kubelet/pods/64832abb-ee37-4a90-9fae-8eff52ff08e2/volumes" Mar 19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.680823 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h" Mar 19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.732589 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h" event={"ID":"82bf14ac-828e-41e8-987c-bb83598d73a5","Type":"ContainerDied","Data":"f6fddfdb4910c8eb9bdacf6414eb16fc4378d8d29ea6372889e228519f492a5a"} Mar 19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.732644 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6fddfdb4910c8eb9bdacf6414eb16fc4378d8d29ea6372889e228519f492a5a" Mar 19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.732718 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h" Mar 19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.741090 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs87c\" (UniqueName: \"kubernetes.io/projected/82bf14ac-828e-41e8-987c-bb83598d73a5-kube-api-access-rs87c\") pod \"82bf14ac-828e-41e8-987c-bb83598d73a5\" (UID: \"82bf14ac-828e-41e8-987c-bb83598d73a5\") " Mar 19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.741270 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/82bf14ac-828e-41e8-987c-bb83598d73a5-ssh-key-openstack-edpm-ipam\") pod \"82bf14ac-828e-41e8-987c-bb83598d73a5\" (UID: \"82bf14ac-828e-41e8-987c-bb83598d73a5\") " Mar 19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.741311 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82bf14ac-828e-41e8-987c-bb83598d73a5-inventory\") pod \"82bf14ac-828e-41e8-987c-bb83598d73a5\" (UID: \"82bf14ac-828e-41e8-987c-bb83598d73a5\") " Mar 19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.751379 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82bf14ac-828e-41e8-987c-bb83598d73a5-kube-api-access-rs87c" (OuterVolumeSpecName: "kube-api-access-rs87c") pod "82bf14ac-828e-41e8-987c-bb83598d73a5" (UID: "82bf14ac-828e-41e8-987c-bb83598d73a5"). InnerVolumeSpecName "kube-api-access-rs87c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.795918 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82bf14ac-828e-41e8-987c-bb83598d73a5-inventory" (OuterVolumeSpecName: "inventory") pod "82bf14ac-828e-41e8-987c-bb83598d73a5" (UID: "82bf14ac-828e-41e8-987c-bb83598d73a5"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.819406 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82bf14ac-828e-41e8-987c-bb83598d73a5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "82bf14ac-828e-41e8-987c-bb83598d73a5" (UID: "82bf14ac-828e-41e8-987c-bb83598d73a5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.831124 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw"] Mar 19 17:12:17 crc kubenswrapper[4918]: E0319 17:12:17.831670 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77605f10-6a56-4633-8433-3b6078dec967" containerName="oc" Mar 19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.831693 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="77605f10-6a56-4633-8433-3b6078dec967" containerName="oc" Mar 19 17:12:17 crc kubenswrapper[4918]: E0319 17:12:17.831738 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82bf14ac-828e-41e8-987c-bb83598d73a5" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.831749 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="82bf14ac-828e-41e8-987c-bb83598d73a5" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.831963 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="77605f10-6a56-4633-8433-3b6078dec967" containerName="oc" Mar 19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.831996 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="82bf14ac-828e-41e8-987c-bb83598d73a5" 
containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.833043 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw" Mar 19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.843570 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw\" (UID: \"e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw" Mar 19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.843698 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtqfh\" (UniqueName: \"kubernetes.io/projected/e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6-kube-api-access-rtqfh\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw\" (UID: \"e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw" Mar 19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.843751 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw\" (UID: \"e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw" Mar 19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.843828 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs87c\" (UniqueName: \"kubernetes.io/projected/82bf14ac-828e-41e8-987c-bb83598d73a5-kube-api-access-rs87c\") on node \"crc\" DevicePath \"\"" Mar 
19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.843844 4918 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/82bf14ac-828e-41e8-987c-bb83598d73a5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.843854 4918 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82bf14ac-828e-41e8-987c-bb83598d73a5-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.860461 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw"] Mar 19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.945936 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw\" (UID: \"e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw" Mar 19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.946542 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtqfh\" (UniqueName: \"kubernetes.io/projected/e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6-kube-api-access-rtqfh\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw\" (UID: \"e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw" Mar 19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.946619 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw\" (UID: 
\"e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw" Mar 19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.948926 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw\" (UID: \"e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw" Mar 19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.950635 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw\" (UID: \"e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw" Mar 19 17:12:17 crc kubenswrapper[4918]: I0319 17:12:17.961205 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtqfh\" (UniqueName: \"kubernetes.io/projected/e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6-kube-api-access-rtqfh\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw\" (UID: \"e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw" Mar 19 17:12:18 crc kubenswrapper[4918]: I0319 17:12:18.216065 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw" Mar 19 17:12:18 crc kubenswrapper[4918]: I0319 17:12:18.813437 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw"] Mar 19 17:12:19 crc kubenswrapper[4918]: I0319 17:12:19.755309 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw" event={"ID":"e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6","Type":"ContainerStarted","Data":"c4ce52ac597d9f29a9372d09bb265deb5c59b2647a4f0cca0fbd49e349ba357b"} Mar 19 17:12:20 crc kubenswrapper[4918]: I0319 17:12:20.768149 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw" event={"ID":"e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6","Type":"ContainerStarted","Data":"3417aee87cfd9f9ff9095e026938a24b7dda1d30355eacf219f6f1198142be99"} Mar 19 17:12:20 crc kubenswrapper[4918]: I0319 17:12:20.797158 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw" podStartSLOduration=3.072524424 podStartE2EDuration="3.797130714s" podCreationTimestamp="2026-03-19 17:12:17 +0000 UTC" firstStartedPulling="2026-03-19 17:12:18.800748668 +0000 UTC m=+1950.922947926" lastFinishedPulling="2026-03-19 17:12:19.525354968 +0000 UTC m=+1951.647554216" observedRunningTime="2026-03-19 17:12:20.789811714 +0000 UTC m=+1952.912010972" watchObservedRunningTime="2026-03-19 17:12:20.797130714 +0000 UTC m=+1952.919330022" Mar 19 17:12:24 crc kubenswrapper[4918]: I0319 17:12:24.811319 4918 generic.go:334] "Generic (PLEG): container finished" podID="e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6" containerID="3417aee87cfd9f9ff9095e026938a24b7dda1d30355eacf219f6f1198142be99" exitCode=0 Mar 19 17:12:24 crc kubenswrapper[4918]: I0319 17:12:24.811397 4918 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw" event={"ID":"e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6","Type":"ContainerDied","Data":"3417aee87cfd9f9ff9095e026938a24b7dda1d30355eacf219f6f1198142be99"} Mar 19 17:12:26 crc kubenswrapper[4918]: I0319 17:12:26.629153 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw" Mar 19 17:12:26 crc kubenswrapper[4918]: I0319 17:12:26.728121 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6-ssh-key-openstack-edpm-ipam\") pod \"e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6\" (UID: \"e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6\") " Mar 19 17:12:26 crc kubenswrapper[4918]: I0319 17:12:26.728376 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6-inventory\") pod \"e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6\" (UID: \"e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6\") " Mar 19 17:12:26 crc kubenswrapper[4918]: I0319 17:12:26.728604 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtqfh\" (UniqueName: \"kubernetes.io/projected/e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6-kube-api-access-rtqfh\") pod \"e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6\" (UID: \"e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6\") " Mar 19 17:12:26 crc kubenswrapper[4918]: I0319 17:12:26.741968 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6-kube-api-access-rtqfh" (OuterVolumeSpecName: "kube-api-access-rtqfh") pod "e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6" (UID: "e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6"). InnerVolumeSpecName "kube-api-access-rtqfh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:12:26 crc kubenswrapper[4918]: I0319 17:12:26.762216 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6" (UID: "e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:12:26 crc kubenswrapper[4918]: I0319 17:12:26.763492 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6-inventory" (OuterVolumeSpecName: "inventory") pod "e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6" (UID: "e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:12:26 crc kubenswrapper[4918]: I0319 17:12:26.831199 4918 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 17:12:26 crc kubenswrapper[4918]: I0319 17:12:26.831232 4918 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 17:12:26 crc kubenswrapper[4918]: I0319 17:12:26.831243 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtqfh\" (UniqueName: \"kubernetes.io/projected/e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6-kube-api-access-rtqfh\") on node \"crc\" DevicePath \"\"" Mar 19 17:12:26 crc kubenswrapper[4918]: I0319 17:12:26.837423 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw" 
event={"ID":"e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6","Type":"ContainerDied","Data":"c4ce52ac597d9f29a9372d09bb265deb5c59b2647a4f0cca0fbd49e349ba357b"} Mar 19 17:12:26 crc kubenswrapper[4918]: I0319 17:12:26.837461 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4ce52ac597d9f29a9372d09bb265deb5c59b2647a4f0cca0fbd49e349ba357b" Mar 19 17:12:26 crc kubenswrapper[4918]: I0319 17:12:26.837482 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw" Mar 19 17:12:26 crc kubenswrapper[4918]: I0319 17:12:26.924883 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-nzw8c"] Mar 19 17:12:26 crc kubenswrapper[4918]: E0319 17:12:26.925453 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 19 17:12:26 crc kubenswrapper[4918]: I0319 17:12:26.925480 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 19 17:12:26 crc kubenswrapper[4918]: I0319 17:12:26.925842 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 19 17:12:26 crc kubenswrapper[4918]: I0319 17:12:26.926864 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nzw8c" Mar 19 17:12:26 crc kubenswrapper[4918]: I0319 17:12:26.930152 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 17:12:26 crc kubenswrapper[4918]: I0319 17:12:26.930729 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 17:12:26 crc kubenswrapper[4918]: I0319 17:12:26.930737 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 17:12:26 crc kubenswrapper[4918]: I0319 17:12:26.936034 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4jldg" Mar 19 17:12:26 crc kubenswrapper[4918]: I0319 17:12:26.944311 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-nzw8c"] Mar 19 17:12:27 crc kubenswrapper[4918]: I0319 17:12:27.034748 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6a5a8faa-ca9b-4aa7-aa51-8605063466d5-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nzw8c\" (UID: \"6a5a8faa-ca9b-4aa7-aa51-8605063466d5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nzw8c" Mar 19 17:12:27 crc kubenswrapper[4918]: I0319 17:12:27.034819 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fffm\" (UniqueName: \"kubernetes.io/projected/6a5a8faa-ca9b-4aa7-aa51-8605063466d5-kube-api-access-9fffm\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nzw8c\" (UID: \"6a5a8faa-ca9b-4aa7-aa51-8605063466d5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nzw8c" Mar 19 17:12:27 crc kubenswrapper[4918]: I0319 
17:12:27.034968 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a5a8faa-ca9b-4aa7-aa51-8605063466d5-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nzw8c\" (UID: \"6a5a8faa-ca9b-4aa7-aa51-8605063466d5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nzw8c" Mar 19 17:12:27 crc kubenswrapper[4918]: I0319 17:12:27.136975 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6a5a8faa-ca9b-4aa7-aa51-8605063466d5-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nzw8c\" (UID: \"6a5a8faa-ca9b-4aa7-aa51-8605063466d5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nzw8c" Mar 19 17:12:27 crc kubenswrapper[4918]: I0319 17:12:27.137050 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fffm\" (UniqueName: \"kubernetes.io/projected/6a5a8faa-ca9b-4aa7-aa51-8605063466d5-kube-api-access-9fffm\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nzw8c\" (UID: \"6a5a8faa-ca9b-4aa7-aa51-8605063466d5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nzw8c" Mar 19 17:12:27 crc kubenswrapper[4918]: I0319 17:12:27.137195 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a5a8faa-ca9b-4aa7-aa51-8605063466d5-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nzw8c\" (UID: \"6a5a8faa-ca9b-4aa7-aa51-8605063466d5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nzw8c" Mar 19 17:12:27 crc kubenswrapper[4918]: I0319 17:12:27.142928 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/6a5a8faa-ca9b-4aa7-aa51-8605063466d5-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nzw8c\" (UID: \"6a5a8faa-ca9b-4aa7-aa51-8605063466d5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nzw8c" Mar 19 17:12:27 crc kubenswrapper[4918]: I0319 17:12:27.143066 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a5a8faa-ca9b-4aa7-aa51-8605063466d5-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nzw8c\" (UID: \"6a5a8faa-ca9b-4aa7-aa51-8605063466d5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nzw8c" Mar 19 17:12:27 crc kubenswrapper[4918]: I0319 17:12:27.153663 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fffm\" (UniqueName: \"kubernetes.io/projected/6a5a8faa-ca9b-4aa7-aa51-8605063466d5-kube-api-access-9fffm\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nzw8c\" (UID: \"6a5a8faa-ca9b-4aa7-aa51-8605063466d5\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nzw8c" Mar 19 17:12:27 crc kubenswrapper[4918]: I0319 17:12:27.254738 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nzw8c" Mar 19 17:12:27 crc kubenswrapper[4918]: I0319 17:12:27.840298 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-nzw8c"] Mar 19 17:12:27 crc kubenswrapper[4918]: W0319 17:12:27.845711 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a5a8faa_ca9b_4aa7_aa51_8605063466d5.slice/crio-7de9ad279643a1bcc90245330c971ec2d1c3f223e33c8d9b245213f87d8c248e WatchSource:0}: Error finding container 7de9ad279643a1bcc90245330c971ec2d1c3f223e33c8d9b245213f87d8c248e: Status 404 returned error can't find the container with id 7de9ad279643a1bcc90245330c971ec2d1c3f223e33c8d9b245213f87d8c248e Mar 19 17:12:28 crc kubenswrapper[4918]: I0319 17:12:28.864614 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nzw8c" event={"ID":"6a5a8faa-ca9b-4aa7-aa51-8605063466d5","Type":"ContainerStarted","Data":"9d33a271087f397969ed3c8d09f6fba31f9523507c9f7e0fae4f9801f0d24986"} Mar 19 17:12:28 crc kubenswrapper[4918]: I0319 17:12:28.865125 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nzw8c" event={"ID":"6a5a8faa-ca9b-4aa7-aa51-8605063466d5","Type":"ContainerStarted","Data":"7de9ad279643a1bcc90245330c971ec2d1c3f223e33c8d9b245213f87d8c248e"} Mar 19 17:12:28 crc kubenswrapper[4918]: I0319 17:12:28.885866 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nzw8c" podStartSLOduration=2.413832959 podStartE2EDuration="2.885844676s" podCreationTimestamp="2026-03-19 17:12:26 +0000 UTC" firstStartedPulling="2026-03-19 17:12:27.847616491 +0000 UTC m=+1959.969815739" lastFinishedPulling="2026-03-19 17:12:28.319628168 +0000 UTC m=+1960.441827456" 
observedRunningTime="2026-03-19 17:12:28.882917946 +0000 UTC m=+1961.005117224" watchObservedRunningTime="2026-03-19 17:12:28.885844676 +0000 UTC m=+1961.008043934" Mar 19 17:12:40 crc kubenswrapper[4918]: I0319 17:12:40.052253 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-nkvm2"] Mar 19 17:12:40 crc kubenswrapper[4918]: I0319 17:12:40.067963 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-nkvm2"] Mar 19 17:12:40 crc kubenswrapper[4918]: I0319 17:12:40.599599 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="780adca3-416f-431d-915e-7a546cfeae43" path="/var/lib/kubelet/pods/780adca3-416f-431d-915e-7a546cfeae43/volumes" Mar 19 17:12:42 crc kubenswrapper[4918]: I0319 17:12:42.048923 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xjm4z"] Mar 19 17:12:42 crc kubenswrapper[4918]: I0319 17:12:42.058713 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xjm4z"] Mar 19 17:12:42 crc kubenswrapper[4918]: I0319 17:12:42.598959 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4db6685-8155-4a95-af9f-3292270736d8" path="/var/lib/kubelet/pods/c4db6685-8155-4a95-af9f-3292270736d8/volumes" Mar 19 17:13:04 crc kubenswrapper[4918]: I0319 17:13:04.437089 4918 scope.go:117] "RemoveContainer" containerID="77b896d7a09e0888c6a3c0619b9e66213942a8c3225f1ab9c8aad24c2775bc99" Mar 19 17:13:04 crc kubenswrapper[4918]: I0319 17:13:04.486503 4918 scope.go:117] "RemoveContainer" containerID="c10802d2e613e8f6ad21c2d97f95480e1a21be7b013e652fc31377e981a53114" Mar 19 17:13:04 crc kubenswrapper[4918]: I0319 17:13:04.536706 4918 scope.go:117] "RemoveContainer" containerID="7b11558f97432b5091ce795b44cd33d301bd98634d90250482b412f94899f255" Mar 19 17:13:04 crc kubenswrapper[4918]: I0319 17:13:04.593966 4918 scope.go:117] "RemoveContainer" 
containerID="9b9d34fa71438f864c8513534831a69723212935ac49c153660d3e267d8c5b60" Mar 19 17:13:09 crc kubenswrapper[4918]: I0319 17:13:09.350921 4918 generic.go:334] "Generic (PLEG): container finished" podID="6a5a8faa-ca9b-4aa7-aa51-8605063466d5" containerID="9d33a271087f397969ed3c8d09f6fba31f9523507c9f7e0fae4f9801f0d24986" exitCode=0 Mar 19 17:13:09 crc kubenswrapper[4918]: I0319 17:13:09.351028 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nzw8c" event={"ID":"6a5a8faa-ca9b-4aa7-aa51-8605063466d5","Type":"ContainerDied","Data":"9d33a271087f397969ed3c8d09f6fba31f9523507c9f7e0fae4f9801f0d24986"} Mar 19 17:13:10 crc kubenswrapper[4918]: I0319 17:13:10.066113 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-z8vcz"] Mar 19 17:13:10 crc kubenswrapper[4918]: I0319 17:13:10.069608 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z8vcz" Mar 19 17:13:10 crc kubenswrapper[4918]: I0319 17:13:10.098740 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z8vcz"] Mar 19 17:13:10 crc kubenswrapper[4918]: I0319 17:13:10.190398 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d0035cc-603c-41bd-8ddd-77db24c66064-utilities\") pod \"community-operators-z8vcz\" (UID: \"7d0035cc-603c-41bd-8ddd-77db24c66064\") " pod="openshift-marketplace/community-operators-z8vcz" Mar 19 17:13:10 crc kubenswrapper[4918]: I0319 17:13:10.190484 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r54p\" (UniqueName: \"kubernetes.io/projected/7d0035cc-603c-41bd-8ddd-77db24c66064-kube-api-access-6r54p\") pod \"community-operators-z8vcz\" (UID: \"7d0035cc-603c-41bd-8ddd-77db24c66064\") " 
pod="openshift-marketplace/community-operators-z8vcz" Mar 19 17:13:10 crc kubenswrapper[4918]: I0319 17:13:10.190957 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d0035cc-603c-41bd-8ddd-77db24c66064-catalog-content\") pod \"community-operators-z8vcz\" (UID: \"7d0035cc-603c-41bd-8ddd-77db24c66064\") " pod="openshift-marketplace/community-operators-z8vcz" Mar 19 17:13:10 crc kubenswrapper[4918]: I0319 17:13:10.293047 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d0035cc-603c-41bd-8ddd-77db24c66064-utilities\") pod \"community-operators-z8vcz\" (UID: \"7d0035cc-603c-41bd-8ddd-77db24c66064\") " pod="openshift-marketplace/community-operators-z8vcz" Mar 19 17:13:10 crc kubenswrapper[4918]: I0319 17:13:10.293131 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r54p\" (UniqueName: \"kubernetes.io/projected/7d0035cc-603c-41bd-8ddd-77db24c66064-kube-api-access-6r54p\") pod \"community-operators-z8vcz\" (UID: \"7d0035cc-603c-41bd-8ddd-77db24c66064\") " pod="openshift-marketplace/community-operators-z8vcz" Mar 19 17:13:10 crc kubenswrapper[4918]: I0319 17:13:10.293218 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d0035cc-603c-41bd-8ddd-77db24c66064-catalog-content\") pod \"community-operators-z8vcz\" (UID: \"7d0035cc-603c-41bd-8ddd-77db24c66064\") " pod="openshift-marketplace/community-operators-z8vcz" Mar 19 17:13:10 crc kubenswrapper[4918]: I0319 17:13:10.293796 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d0035cc-603c-41bd-8ddd-77db24c66064-catalog-content\") pod \"community-operators-z8vcz\" (UID: \"7d0035cc-603c-41bd-8ddd-77db24c66064\") " 
pod="openshift-marketplace/community-operators-z8vcz" Mar 19 17:13:10 crc kubenswrapper[4918]: I0319 17:13:10.294019 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d0035cc-603c-41bd-8ddd-77db24c66064-utilities\") pod \"community-operators-z8vcz\" (UID: \"7d0035cc-603c-41bd-8ddd-77db24c66064\") " pod="openshift-marketplace/community-operators-z8vcz" Mar 19 17:13:10 crc kubenswrapper[4918]: I0319 17:13:10.313103 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r54p\" (UniqueName: \"kubernetes.io/projected/7d0035cc-603c-41bd-8ddd-77db24c66064-kube-api-access-6r54p\") pod \"community-operators-z8vcz\" (UID: \"7d0035cc-603c-41bd-8ddd-77db24c66064\") " pod="openshift-marketplace/community-operators-z8vcz" Mar 19 17:13:10 crc kubenswrapper[4918]: I0319 17:13:10.421065 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z8vcz" Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.118875 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z8vcz"] Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.250068 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nzw8c" Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.317374 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fffm\" (UniqueName: \"kubernetes.io/projected/6a5a8faa-ca9b-4aa7-aa51-8605063466d5-kube-api-access-9fffm\") pod \"6a5a8faa-ca9b-4aa7-aa51-8605063466d5\" (UID: \"6a5a8faa-ca9b-4aa7-aa51-8605063466d5\") " Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.317449 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6a5a8faa-ca9b-4aa7-aa51-8605063466d5-ssh-key-openstack-edpm-ipam\") pod \"6a5a8faa-ca9b-4aa7-aa51-8605063466d5\" (UID: \"6a5a8faa-ca9b-4aa7-aa51-8605063466d5\") " Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.317503 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a5a8faa-ca9b-4aa7-aa51-8605063466d5-inventory\") pod \"6a5a8faa-ca9b-4aa7-aa51-8605063466d5\" (UID: \"6a5a8faa-ca9b-4aa7-aa51-8605063466d5\") " Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.324652 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a5a8faa-ca9b-4aa7-aa51-8605063466d5-kube-api-access-9fffm" (OuterVolumeSpecName: "kube-api-access-9fffm") pod "6a5a8faa-ca9b-4aa7-aa51-8605063466d5" (UID: "6a5a8faa-ca9b-4aa7-aa51-8605063466d5"). InnerVolumeSpecName "kube-api-access-9fffm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.349663 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a5a8faa-ca9b-4aa7-aa51-8605063466d5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6a5a8faa-ca9b-4aa7-aa51-8605063466d5" (UID: "6a5a8faa-ca9b-4aa7-aa51-8605063466d5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.350557 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a5a8faa-ca9b-4aa7-aa51-8605063466d5-inventory" (OuterVolumeSpecName: "inventory") pod "6a5a8faa-ca9b-4aa7-aa51-8605063466d5" (UID: "6a5a8faa-ca9b-4aa7-aa51-8605063466d5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.406354 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nzw8c" event={"ID":"6a5a8faa-ca9b-4aa7-aa51-8605063466d5","Type":"ContainerDied","Data":"7de9ad279643a1bcc90245330c971ec2d1c3f223e33c8d9b245213f87d8c248e"} Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.406428 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7de9ad279643a1bcc90245330c971ec2d1c3f223e33c8d9b245213f87d8c248e" Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.406607 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nzw8c" Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.409727 4918 generic.go:334] "Generic (PLEG): container finished" podID="7d0035cc-603c-41bd-8ddd-77db24c66064" containerID="59df572c3f27c93b36e87598123a1bbee53b464174732255b0658199ef96b694" exitCode=0 Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.409893 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z8vcz" event={"ID":"7d0035cc-603c-41bd-8ddd-77db24c66064","Type":"ContainerDied","Data":"59df572c3f27c93b36e87598123a1bbee53b464174732255b0658199ef96b694"} Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.409977 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z8vcz" event={"ID":"7d0035cc-603c-41bd-8ddd-77db24c66064","Type":"ContainerStarted","Data":"34b9a66e747b48002703f8706a43278d819b999746ff9ba466cc5907af898358"} Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.422921 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fffm\" (UniqueName: \"kubernetes.io/projected/6a5a8faa-ca9b-4aa7-aa51-8605063466d5-kube-api-access-9fffm\") on node \"crc\" DevicePath \"\"" Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.422956 4918 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6a5a8faa-ca9b-4aa7-aa51-8605063466d5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.422970 4918 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a5a8faa-ca9b-4aa7-aa51-8605063466d5-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.423469 4918 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 
17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.466668 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l"] Mar 19 17:13:11 crc kubenswrapper[4918]: E0319 17:13:11.467130 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a5a8faa-ca9b-4aa7-aa51-8605063466d5" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.467152 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a5a8faa-ca9b-4aa7-aa51-8605063466d5" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.467352 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a5a8faa-ca9b-4aa7-aa51-8605063466d5" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.468126 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l" Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.471847 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4jldg" Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.471988 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.472093 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.476659 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.483044 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l"] Mar 19 
17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.524538 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwn8g\" (UniqueName: \"kubernetes.io/projected/8e658b37-8529-4e6f-adbc-0974b7957e57-kube-api-access-xwn8g\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l\" (UID: \"8e658b37-8529-4e6f-adbc-0974b7957e57\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l" Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.525210 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e658b37-8529-4e6f-adbc-0974b7957e57-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l\" (UID: \"8e658b37-8529-4e6f-adbc-0974b7957e57\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l" Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.525425 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e658b37-8529-4e6f-adbc-0974b7957e57-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l\" (UID: \"8e658b37-8529-4e6f-adbc-0974b7957e57\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l" Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.627128 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e658b37-8529-4e6f-adbc-0974b7957e57-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l\" (UID: \"8e658b37-8529-4e6f-adbc-0974b7957e57\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l" Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.627349 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e658b37-8529-4e6f-adbc-0974b7957e57-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l\" (UID: \"8e658b37-8529-4e6f-adbc-0974b7957e57\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l" Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.627425 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwn8g\" (UniqueName: \"kubernetes.io/projected/8e658b37-8529-4e6f-adbc-0974b7957e57-kube-api-access-xwn8g\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l\" (UID: \"8e658b37-8529-4e6f-adbc-0974b7957e57\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l" Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.633095 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e658b37-8529-4e6f-adbc-0974b7957e57-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l\" (UID: \"8e658b37-8529-4e6f-adbc-0974b7957e57\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l" Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.639724 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e658b37-8529-4e6f-adbc-0974b7957e57-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l\" (UID: \"8e658b37-8529-4e6f-adbc-0974b7957e57\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l" Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.654491 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwn8g\" (UniqueName: \"kubernetes.io/projected/8e658b37-8529-4e6f-adbc-0974b7957e57-kube-api-access-xwn8g\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l\" (UID: 
\"8e658b37-8529-4e6f-adbc-0974b7957e57\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l" Mar 19 17:13:11 crc kubenswrapper[4918]: I0319 17:13:11.806913 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l" Mar 19 17:13:12 crc kubenswrapper[4918]: I0319 17:13:12.393102 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l"] Mar 19 17:13:12 crc kubenswrapper[4918]: I0319 17:13:12.444145 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z8vcz" event={"ID":"7d0035cc-603c-41bd-8ddd-77db24c66064","Type":"ContainerStarted","Data":"33b83f0e9337d44301cb9b580638cbe40309681f71bab798bbc76664fd0009f5"} Mar 19 17:13:13 crc kubenswrapper[4918]: I0319 17:13:13.454128 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l" event={"ID":"8e658b37-8529-4e6f-adbc-0974b7957e57","Type":"ContainerStarted","Data":"3d4af2bd6bfa75336e4ea5530907991f4d8f85155e45abea2e26434b44b9e018"} Mar 19 17:13:13 crc kubenswrapper[4918]: I0319 17:13:13.454510 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l" event={"ID":"8e658b37-8529-4e6f-adbc-0974b7957e57","Type":"ContainerStarted","Data":"b0805682f2ce1d08092956138de41094b2f0d9973a58f8ffcb94a93729a6b5d1"} Mar 19 17:13:13 crc kubenswrapper[4918]: I0319 17:13:13.469666 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l" podStartSLOduration=1.965083229 podStartE2EDuration="2.469648868s" podCreationTimestamp="2026-03-19 17:13:11 +0000 UTC" firstStartedPulling="2026-03-19 17:13:12.445386345 +0000 UTC m=+2004.567585593" lastFinishedPulling="2026-03-19 17:13:12.949951984 +0000 UTC 
m=+2005.072151232" observedRunningTime="2026-03-19 17:13:13.466874952 +0000 UTC m=+2005.589074200" watchObservedRunningTime="2026-03-19 17:13:13.469648868 +0000 UTC m=+2005.591848116" Mar 19 17:13:14 crc kubenswrapper[4918]: I0319 17:13:14.467245 4918 generic.go:334] "Generic (PLEG): container finished" podID="7d0035cc-603c-41bd-8ddd-77db24c66064" containerID="33b83f0e9337d44301cb9b580638cbe40309681f71bab798bbc76664fd0009f5" exitCode=0 Mar 19 17:13:14 crc kubenswrapper[4918]: I0319 17:13:14.467291 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z8vcz" event={"ID":"7d0035cc-603c-41bd-8ddd-77db24c66064","Type":"ContainerDied","Data":"33b83f0e9337d44301cb9b580638cbe40309681f71bab798bbc76664fd0009f5"} Mar 19 17:13:15 crc kubenswrapper[4918]: I0319 17:13:15.485685 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z8vcz" event={"ID":"7d0035cc-603c-41bd-8ddd-77db24c66064","Type":"ContainerStarted","Data":"df386b67ae462fca1289ef0fd6eacceb8981bf0360325e643cb8e1ed4a2d8997"} Mar 19 17:13:15 crc kubenswrapper[4918]: I0319 17:13:15.505162 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-z8vcz" podStartSLOduration=1.969862892 podStartE2EDuration="5.505139675s" podCreationTimestamp="2026-03-19 17:13:10 +0000 UTC" firstStartedPulling="2026-03-19 17:13:11.423188949 +0000 UTC m=+2003.545388197" lastFinishedPulling="2026-03-19 17:13:14.958465722 +0000 UTC m=+2007.080664980" observedRunningTime="2026-03-19 17:13:15.501479355 +0000 UTC m=+2007.623678613" watchObservedRunningTime="2026-03-19 17:13:15.505139675 +0000 UTC m=+2007.627338933" Mar 19 17:13:20 crc kubenswrapper[4918]: I0319 17:13:20.422198 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-z8vcz" Mar 19 17:13:20 crc kubenswrapper[4918]: I0319 17:13:20.422887 4918 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-z8vcz" Mar 19 17:13:20 crc kubenswrapper[4918]: I0319 17:13:20.478187 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-z8vcz" Mar 19 17:13:20 crc kubenswrapper[4918]: I0319 17:13:20.603451 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-z8vcz" Mar 19 17:13:20 crc kubenswrapper[4918]: I0319 17:13:20.715756 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z8vcz"] Mar 19 17:13:22 crc kubenswrapper[4918]: I0319 17:13:22.570259 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-z8vcz" podUID="7d0035cc-603c-41bd-8ddd-77db24c66064" containerName="registry-server" containerID="cri-o://df386b67ae462fca1289ef0fd6eacceb8981bf0360325e643cb8e1ed4a2d8997" gracePeriod=2 Mar 19 17:13:23 crc kubenswrapper[4918]: I0319 17:13:23.106071 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z8vcz" Mar 19 17:13:23 crc kubenswrapper[4918]: I0319 17:13:23.176755 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d0035cc-603c-41bd-8ddd-77db24c66064-catalog-content\") pod \"7d0035cc-603c-41bd-8ddd-77db24c66064\" (UID: \"7d0035cc-603c-41bd-8ddd-77db24c66064\") " Mar 19 17:13:23 crc kubenswrapper[4918]: I0319 17:13:23.176912 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r54p\" (UniqueName: \"kubernetes.io/projected/7d0035cc-603c-41bd-8ddd-77db24c66064-kube-api-access-6r54p\") pod \"7d0035cc-603c-41bd-8ddd-77db24c66064\" (UID: \"7d0035cc-603c-41bd-8ddd-77db24c66064\") " Mar 19 17:13:23 crc kubenswrapper[4918]: I0319 17:13:23.177240 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d0035cc-603c-41bd-8ddd-77db24c66064-utilities\") pod \"7d0035cc-603c-41bd-8ddd-77db24c66064\" (UID: \"7d0035cc-603c-41bd-8ddd-77db24c66064\") " Mar 19 17:13:23 crc kubenswrapper[4918]: I0319 17:13:23.178269 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d0035cc-603c-41bd-8ddd-77db24c66064-utilities" (OuterVolumeSpecName: "utilities") pod "7d0035cc-603c-41bd-8ddd-77db24c66064" (UID: "7d0035cc-603c-41bd-8ddd-77db24c66064"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:13:23 crc kubenswrapper[4918]: I0319 17:13:23.185706 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d0035cc-603c-41bd-8ddd-77db24c66064-kube-api-access-6r54p" (OuterVolumeSpecName: "kube-api-access-6r54p") pod "7d0035cc-603c-41bd-8ddd-77db24c66064" (UID: "7d0035cc-603c-41bd-8ddd-77db24c66064"). InnerVolumeSpecName "kube-api-access-6r54p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:13:23 crc kubenswrapper[4918]: I0319 17:13:23.228802 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d0035cc-603c-41bd-8ddd-77db24c66064-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d0035cc-603c-41bd-8ddd-77db24c66064" (UID: "7d0035cc-603c-41bd-8ddd-77db24c66064"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:13:23 crc kubenswrapper[4918]: I0319 17:13:23.280360 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d0035cc-603c-41bd-8ddd-77db24c66064-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 17:13:23 crc kubenswrapper[4918]: I0319 17:13:23.280419 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d0035cc-603c-41bd-8ddd-77db24c66064-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 17:13:23 crc kubenswrapper[4918]: I0319 17:13:23.280442 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r54p\" (UniqueName: \"kubernetes.io/projected/7d0035cc-603c-41bd-8ddd-77db24c66064-kube-api-access-6r54p\") on node \"crc\" DevicePath \"\"" Mar 19 17:13:23 crc kubenswrapper[4918]: I0319 17:13:23.583649 4918 generic.go:334] "Generic (PLEG): container finished" podID="7d0035cc-603c-41bd-8ddd-77db24c66064" containerID="df386b67ae462fca1289ef0fd6eacceb8981bf0360325e643cb8e1ed4a2d8997" exitCode=0 Mar 19 17:13:23 crc kubenswrapper[4918]: I0319 17:13:23.583727 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z8vcz" Mar 19 17:13:23 crc kubenswrapper[4918]: I0319 17:13:23.583723 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z8vcz" event={"ID":"7d0035cc-603c-41bd-8ddd-77db24c66064","Type":"ContainerDied","Data":"df386b67ae462fca1289ef0fd6eacceb8981bf0360325e643cb8e1ed4a2d8997"} Mar 19 17:13:23 crc kubenswrapper[4918]: I0319 17:13:23.583793 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z8vcz" event={"ID":"7d0035cc-603c-41bd-8ddd-77db24c66064","Type":"ContainerDied","Data":"34b9a66e747b48002703f8706a43278d819b999746ff9ba466cc5907af898358"} Mar 19 17:13:23 crc kubenswrapper[4918]: I0319 17:13:23.583829 4918 scope.go:117] "RemoveContainer" containerID="df386b67ae462fca1289ef0fd6eacceb8981bf0360325e643cb8e1ed4a2d8997" Mar 19 17:13:23 crc kubenswrapper[4918]: I0319 17:13:23.620861 4918 scope.go:117] "RemoveContainer" containerID="33b83f0e9337d44301cb9b580638cbe40309681f71bab798bbc76664fd0009f5" Mar 19 17:13:23 crc kubenswrapper[4918]: I0319 17:13:23.631656 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z8vcz"] Mar 19 17:13:23 crc kubenswrapper[4918]: I0319 17:13:23.642130 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-z8vcz"] Mar 19 17:13:23 crc kubenswrapper[4918]: I0319 17:13:23.651875 4918 scope.go:117] "RemoveContainer" containerID="59df572c3f27c93b36e87598123a1bbee53b464174732255b0658199ef96b694" Mar 19 17:13:23 crc kubenswrapper[4918]: I0319 17:13:23.718869 4918 scope.go:117] "RemoveContainer" containerID="df386b67ae462fca1289ef0fd6eacceb8981bf0360325e643cb8e1ed4a2d8997" Mar 19 17:13:23 crc kubenswrapper[4918]: E0319 17:13:23.719499 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"df386b67ae462fca1289ef0fd6eacceb8981bf0360325e643cb8e1ed4a2d8997\": container with ID starting with df386b67ae462fca1289ef0fd6eacceb8981bf0360325e643cb8e1ed4a2d8997 not found: ID does not exist" containerID="df386b67ae462fca1289ef0fd6eacceb8981bf0360325e643cb8e1ed4a2d8997" Mar 19 17:13:23 crc kubenswrapper[4918]: I0319 17:13:23.719560 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df386b67ae462fca1289ef0fd6eacceb8981bf0360325e643cb8e1ed4a2d8997"} err="failed to get container status \"df386b67ae462fca1289ef0fd6eacceb8981bf0360325e643cb8e1ed4a2d8997\": rpc error: code = NotFound desc = could not find container \"df386b67ae462fca1289ef0fd6eacceb8981bf0360325e643cb8e1ed4a2d8997\": container with ID starting with df386b67ae462fca1289ef0fd6eacceb8981bf0360325e643cb8e1ed4a2d8997 not found: ID does not exist" Mar 19 17:13:23 crc kubenswrapper[4918]: I0319 17:13:23.719591 4918 scope.go:117] "RemoveContainer" containerID="33b83f0e9337d44301cb9b580638cbe40309681f71bab798bbc76664fd0009f5" Mar 19 17:13:23 crc kubenswrapper[4918]: E0319 17:13:23.719810 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33b83f0e9337d44301cb9b580638cbe40309681f71bab798bbc76664fd0009f5\": container with ID starting with 33b83f0e9337d44301cb9b580638cbe40309681f71bab798bbc76664fd0009f5 not found: ID does not exist" containerID="33b83f0e9337d44301cb9b580638cbe40309681f71bab798bbc76664fd0009f5" Mar 19 17:13:23 crc kubenswrapper[4918]: I0319 17:13:23.719850 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b83f0e9337d44301cb9b580638cbe40309681f71bab798bbc76664fd0009f5"} err="failed to get container status \"33b83f0e9337d44301cb9b580638cbe40309681f71bab798bbc76664fd0009f5\": rpc error: code = NotFound desc = could not find container \"33b83f0e9337d44301cb9b580638cbe40309681f71bab798bbc76664fd0009f5\": container with ID 
starting with 33b83f0e9337d44301cb9b580638cbe40309681f71bab798bbc76664fd0009f5 not found: ID does not exist" Mar 19 17:13:23 crc kubenswrapper[4918]: I0319 17:13:23.719878 4918 scope.go:117] "RemoveContainer" containerID="59df572c3f27c93b36e87598123a1bbee53b464174732255b0658199ef96b694" Mar 19 17:13:23 crc kubenswrapper[4918]: E0319 17:13:23.720084 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59df572c3f27c93b36e87598123a1bbee53b464174732255b0658199ef96b694\": container with ID starting with 59df572c3f27c93b36e87598123a1bbee53b464174732255b0658199ef96b694 not found: ID does not exist" containerID="59df572c3f27c93b36e87598123a1bbee53b464174732255b0658199ef96b694" Mar 19 17:13:23 crc kubenswrapper[4918]: I0319 17:13:23.720108 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59df572c3f27c93b36e87598123a1bbee53b464174732255b0658199ef96b694"} err="failed to get container status \"59df572c3f27c93b36e87598123a1bbee53b464174732255b0658199ef96b694\": rpc error: code = NotFound desc = could not find container \"59df572c3f27c93b36e87598123a1bbee53b464174732255b0658199ef96b694\": container with ID starting with 59df572c3f27c93b36e87598123a1bbee53b464174732255b0658199ef96b694 not found: ID does not exist" Mar 19 17:13:24 crc kubenswrapper[4918]: I0319 17:13:24.607325 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d0035cc-603c-41bd-8ddd-77db24c66064" path="/var/lib/kubelet/pods/7d0035cc-603c-41bd-8ddd-77db24c66064/volumes" Mar 19 17:13:27 crc kubenswrapper[4918]: I0319 17:13:27.056898 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-q4qqh"] Mar 19 17:13:27 crc kubenswrapper[4918]: I0319 17:13:27.067732 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-q4qqh"] Mar 19 17:13:28 crc kubenswrapper[4918]: I0319 17:13:28.212111 4918 
patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:13:28 crc kubenswrapper[4918]: I0319 17:13:28.212166 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:13:28 crc kubenswrapper[4918]: I0319 17:13:28.601494 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83433c2c-ddcb-4f8d-ba54-e3dda42d12f7" path="/var/lib/kubelet/pods/83433c2c-ddcb-4f8d-ba54-e3dda42d12f7/volumes" Mar 19 17:13:58 crc kubenswrapper[4918]: I0319 17:13:58.211637 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:13:58 crc kubenswrapper[4918]: I0319 17:13:58.212360 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:14:00 crc kubenswrapper[4918]: I0319 17:14:00.152456 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565674-kcgq5"] Mar 19 17:14:00 crc kubenswrapper[4918]: E0319 17:14:00.153267 4918 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7d0035cc-603c-41bd-8ddd-77db24c66064" containerName="extract-utilities" Mar 19 17:14:00 crc kubenswrapper[4918]: I0319 17:14:00.153283 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d0035cc-603c-41bd-8ddd-77db24c66064" containerName="extract-utilities" Mar 19 17:14:00 crc kubenswrapper[4918]: E0319 17:14:00.153328 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d0035cc-603c-41bd-8ddd-77db24c66064" containerName="registry-server" Mar 19 17:14:00 crc kubenswrapper[4918]: I0319 17:14:00.153336 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d0035cc-603c-41bd-8ddd-77db24c66064" containerName="registry-server" Mar 19 17:14:00 crc kubenswrapper[4918]: E0319 17:14:00.153352 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d0035cc-603c-41bd-8ddd-77db24c66064" containerName="extract-content" Mar 19 17:14:00 crc kubenswrapper[4918]: I0319 17:14:00.153360 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d0035cc-603c-41bd-8ddd-77db24c66064" containerName="extract-content" Mar 19 17:14:00 crc kubenswrapper[4918]: I0319 17:14:00.153644 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d0035cc-603c-41bd-8ddd-77db24c66064" containerName="registry-server" Mar 19 17:14:00 crc kubenswrapper[4918]: I0319 17:14:00.154539 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565674-kcgq5" Mar 19 17:14:00 crc kubenswrapper[4918]: I0319 17:14:00.156597 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:14:00 crc kubenswrapper[4918]: I0319 17:14:00.156616 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:14:00 crc kubenswrapper[4918]: I0319 17:14:00.157022 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:14:00 crc kubenswrapper[4918]: I0319 17:14:00.161760 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565674-kcgq5"] Mar 19 17:14:00 crc kubenswrapper[4918]: I0319 17:14:00.296782 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgzdz\" (UniqueName: \"kubernetes.io/projected/e4012bc5-f224-4c07-aca2-334f6137ed4b-kube-api-access-kgzdz\") pod \"auto-csr-approver-29565674-kcgq5\" (UID: \"e4012bc5-f224-4c07-aca2-334f6137ed4b\") " pod="openshift-infra/auto-csr-approver-29565674-kcgq5" Mar 19 17:14:00 crc kubenswrapper[4918]: I0319 17:14:00.400252 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgzdz\" (UniqueName: \"kubernetes.io/projected/e4012bc5-f224-4c07-aca2-334f6137ed4b-kube-api-access-kgzdz\") pod \"auto-csr-approver-29565674-kcgq5\" (UID: \"e4012bc5-f224-4c07-aca2-334f6137ed4b\") " pod="openshift-infra/auto-csr-approver-29565674-kcgq5" Mar 19 17:14:00 crc kubenswrapper[4918]: I0319 17:14:00.420248 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgzdz\" (UniqueName: \"kubernetes.io/projected/e4012bc5-f224-4c07-aca2-334f6137ed4b-kube-api-access-kgzdz\") pod \"auto-csr-approver-29565674-kcgq5\" (UID: \"e4012bc5-f224-4c07-aca2-334f6137ed4b\") " 
pod="openshift-infra/auto-csr-approver-29565674-kcgq5" Mar 19 17:14:00 crc kubenswrapper[4918]: I0319 17:14:00.477194 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565674-kcgq5" Mar 19 17:14:00 crc kubenswrapper[4918]: I0319 17:14:00.988331 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565674-kcgq5"] Mar 19 17:14:01 crc kubenswrapper[4918]: I0319 17:14:01.001486 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565674-kcgq5" event={"ID":"e4012bc5-f224-4c07-aca2-334f6137ed4b","Type":"ContainerStarted","Data":"ee11f562952ba28c157f9d7feec92af33ca751775c4d12ab06e1e36d8bc46c48"} Mar 19 17:14:03 crc kubenswrapper[4918]: I0319 17:14:03.058086 4918 generic.go:334] "Generic (PLEG): container finished" podID="8e658b37-8529-4e6f-adbc-0974b7957e57" containerID="3d4af2bd6bfa75336e4ea5530907991f4d8f85155e45abea2e26434b44b9e018" exitCode=0 Mar 19 17:14:03 crc kubenswrapper[4918]: I0319 17:14:03.058136 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l" event={"ID":"8e658b37-8529-4e6f-adbc-0974b7957e57","Type":"ContainerDied","Data":"3d4af2bd6bfa75336e4ea5530907991f4d8f85155e45abea2e26434b44b9e018"} Mar 19 17:14:04 crc kubenswrapper[4918]: I0319 17:14:04.071080 4918 generic.go:334] "Generic (PLEG): container finished" podID="e4012bc5-f224-4c07-aca2-334f6137ed4b" containerID="3aeef489e324ffeb7922cd7d0b4867896990f6941942a572d14c42e058b43290" exitCode=0 Mar 19 17:14:04 crc kubenswrapper[4918]: I0319 17:14:04.071198 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565674-kcgq5" event={"ID":"e4012bc5-f224-4c07-aca2-334f6137ed4b","Type":"ContainerDied","Data":"3aeef489e324ffeb7922cd7d0b4867896990f6941942a572d14c42e058b43290"} Mar 19 17:14:04 crc kubenswrapper[4918]: I0319 17:14:04.623197 4918 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l" Mar 19 17:14:04 crc kubenswrapper[4918]: I0319 17:14:04.693184 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwn8g\" (UniqueName: \"kubernetes.io/projected/8e658b37-8529-4e6f-adbc-0974b7957e57-kube-api-access-xwn8g\") pod \"8e658b37-8529-4e6f-adbc-0974b7957e57\" (UID: \"8e658b37-8529-4e6f-adbc-0974b7957e57\") " Mar 19 17:14:04 crc kubenswrapper[4918]: I0319 17:14:04.693235 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e658b37-8529-4e6f-adbc-0974b7957e57-ssh-key-openstack-edpm-ipam\") pod \"8e658b37-8529-4e6f-adbc-0974b7957e57\" (UID: \"8e658b37-8529-4e6f-adbc-0974b7957e57\") " Mar 19 17:14:04 crc kubenswrapper[4918]: I0319 17:14:04.693331 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e658b37-8529-4e6f-adbc-0974b7957e57-inventory\") pod \"8e658b37-8529-4e6f-adbc-0974b7957e57\" (UID: \"8e658b37-8529-4e6f-adbc-0974b7957e57\") " Mar 19 17:14:04 crc kubenswrapper[4918]: I0319 17:14:04.698863 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e658b37-8529-4e6f-adbc-0974b7957e57-kube-api-access-xwn8g" (OuterVolumeSpecName: "kube-api-access-xwn8g") pod "8e658b37-8529-4e6f-adbc-0974b7957e57" (UID: "8e658b37-8529-4e6f-adbc-0974b7957e57"). InnerVolumeSpecName "kube-api-access-xwn8g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:14:04 crc kubenswrapper[4918]: I0319 17:14:04.721657 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e658b37-8529-4e6f-adbc-0974b7957e57-inventory" (OuterVolumeSpecName: "inventory") pod "8e658b37-8529-4e6f-adbc-0974b7957e57" (UID: "8e658b37-8529-4e6f-adbc-0974b7957e57"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:14:04 crc kubenswrapper[4918]: I0319 17:14:04.744605 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e658b37-8529-4e6f-adbc-0974b7957e57-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8e658b37-8529-4e6f-adbc-0974b7957e57" (UID: "8e658b37-8529-4e6f-adbc-0974b7957e57"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:14:04 crc kubenswrapper[4918]: I0319 17:14:04.752764 4918 scope.go:117] "RemoveContainer" containerID="6dea809c472c381d224dc1a3f8c04322c480ea79674d6d778a8bb2a9b3ced58d" Mar 19 17:14:04 crc kubenswrapper[4918]: I0319 17:14:04.795831 4918 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e658b37-8529-4e6f-adbc-0974b7957e57-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 17:14:04 crc kubenswrapper[4918]: I0319 17:14:04.795867 4918 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e658b37-8529-4e6f-adbc-0974b7957e57-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 17:14:04 crc kubenswrapper[4918]: I0319 17:14:04.795876 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwn8g\" (UniqueName: \"kubernetes.io/projected/8e658b37-8529-4e6f-adbc-0974b7957e57-kube-api-access-xwn8g\") on node \"crc\" DevicePath \"\"" Mar 19 17:14:05 crc kubenswrapper[4918]: 
I0319 17:14:05.088051 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.088067 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l" event={"ID":"8e658b37-8529-4e6f-adbc-0974b7957e57","Type":"ContainerDied","Data":"b0805682f2ce1d08092956138de41094b2f0d9973a58f8ffcb94a93729a6b5d1"} Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.088600 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0805682f2ce1d08092956138de41094b2f0d9973a58f8ffcb94a93729a6b5d1" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.266129 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-pcjkk"] Mar 19 17:14:05 crc kubenswrapper[4918]: E0319 17:14:05.266655 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e658b37-8529-4e6f-adbc-0974b7957e57" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.266675 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e658b37-8529-4e6f-adbc-0974b7957e57" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.266874 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e658b37-8529-4e6f-adbc-0974b7957e57" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.267632 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-pcjkk" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.270356 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.270507 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4jldg" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.270373 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.271962 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.286402 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-pcjkk"] Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.408429 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvcxw\" (UniqueName: \"kubernetes.io/projected/8ce6590d-02e8-4f83-aa5a-0f328daf1c1e-kube-api-access-jvcxw\") pod \"ssh-known-hosts-edpm-deployment-pcjkk\" (UID: \"8ce6590d-02e8-4f83-aa5a-0f328daf1c1e\") " pod="openstack/ssh-known-hosts-edpm-deployment-pcjkk" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.408549 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8ce6590d-02e8-4f83-aa5a-0f328daf1c1e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-pcjkk\" (UID: \"8ce6590d-02e8-4f83-aa5a-0f328daf1c1e\") " pod="openstack/ssh-known-hosts-edpm-deployment-pcjkk" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.408848 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8ce6590d-02e8-4f83-aa5a-0f328daf1c1e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-pcjkk\" (UID: \"8ce6590d-02e8-4f83-aa5a-0f328daf1c1e\") " pod="openstack/ssh-known-hosts-edpm-deployment-pcjkk" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.434972 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565674-kcgq5" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.510214 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgzdz\" (UniqueName: \"kubernetes.io/projected/e4012bc5-f224-4c07-aca2-334f6137ed4b-kube-api-access-kgzdz\") pod \"e4012bc5-f224-4c07-aca2-334f6137ed4b\" (UID: \"e4012bc5-f224-4c07-aca2-334f6137ed4b\") " Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.510820 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8ce6590d-02e8-4f83-aa5a-0f328daf1c1e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-pcjkk\" (UID: \"8ce6590d-02e8-4f83-aa5a-0f328daf1c1e\") " pod="openstack/ssh-known-hosts-edpm-deployment-pcjkk" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.510916 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvcxw\" (UniqueName: \"kubernetes.io/projected/8ce6590d-02e8-4f83-aa5a-0f328daf1c1e-kube-api-access-jvcxw\") pod \"ssh-known-hosts-edpm-deployment-pcjkk\" (UID: \"8ce6590d-02e8-4f83-aa5a-0f328daf1c1e\") " pod="openstack/ssh-known-hosts-edpm-deployment-pcjkk" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.510978 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8ce6590d-02e8-4f83-aa5a-0f328daf1c1e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-pcjkk\" 
(UID: \"8ce6590d-02e8-4f83-aa5a-0f328daf1c1e\") " pod="openstack/ssh-known-hosts-edpm-deployment-pcjkk" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.515288 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8ce6590d-02e8-4f83-aa5a-0f328daf1c1e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-pcjkk\" (UID: \"8ce6590d-02e8-4f83-aa5a-0f328daf1c1e\") " pod="openstack/ssh-known-hosts-edpm-deployment-pcjkk" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.515919 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4012bc5-f224-4c07-aca2-334f6137ed4b-kube-api-access-kgzdz" (OuterVolumeSpecName: "kube-api-access-kgzdz") pod "e4012bc5-f224-4c07-aca2-334f6137ed4b" (UID: "e4012bc5-f224-4c07-aca2-334f6137ed4b"). InnerVolumeSpecName "kube-api-access-kgzdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.517887 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8ce6590d-02e8-4f83-aa5a-0f328daf1c1e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-pcjkk\" (UID: \"8ce6590d-02e8-4f83-aa5a-0f328daf1c1e\") " pod="openstack/ssh-known-hosts-edpm-deployment-pcjkk" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.535161 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvcxw\" (UniqueName: \"kubernetes.io/projected/8ce6590d-02e8-4f83-aa5a-0f328daf1c1e-kube-api-access-jvcxw\") pod \"ssh-known-hosts-edpm-deployment-pcjkk\" (UID: \"8ce6590d-02e8-4f83-aa5a-0f328daf1c1e\") " pod="openstack/ssh-known-hosts-edpm-deployment-pcjkk" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.586894 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-pcjkk" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.613414 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgzdz\" (UniqueName: \"kubernetes.io/projected/e4012bc5-f224-4c07-aca2-334f6137ed4b-kube-api-access-kgzdz\") on node \"crc\" DevicePath \"\"" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.728750 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rh2fp"] Mar 19 17:14:05 crc kubenswrapper[4918]: E0319 17:14:05.729684 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4012bc5-f224-4c07-aca2-334f6137ed4b" containerName="oc" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.729708 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4012bc5-f224-4c07-aca2-334f6137ed4b" containerName="oc" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.729960 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4012bc5-f224-4c07-aca2-334f6137ed4b" containerName="oc" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.731822 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rh2fp" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.740841 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rh2fp"] Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.829510 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86137512-bbf8-4f15-8aa4-00e1dfce59de-catalog-content\") pod \"certified-operators-rh2fp\" (UID: \"86137512-bbf8-4f15-8aa4-00e1dfce59de\") " pod="openshift-marketplace/certified-operators-rh2fp" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.829618 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m87lx\" (UniqueName: \"kubernetes.io/projected/86137512-bbf8-4f15-8aa4-00e1dfce59de-kube-api-access-m87lx\") pod \"certified-operators-rh2fp\" (UID: \"86137512-bbf8-4f15-8aa4-00e1dfce59de\") " pod="openshift-marketplace/certified-operators-rh2fp" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.829833 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86137512-bbf8-4f15-8aa4-00e1dfce59de-utilities\") pod \"certified-operators-rh2fp\" (UID: \"86137512-bbf8-4f15-8aa4-00e1dfce59de\") " pod="openshift-marketplace/certified-operators-rh2fp" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.931334 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86137512-bbf8-4f15-8aa4-00e1dfce59de-utilities\") pod \"certified-operators-rh2fp\" (UID: \"86137512-bbf8-4f15-8aa4-00e1dfce59de\") " pod="openshift-marketplace/certified-operators-rh2fp" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.931412 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86137512-bbf8-4f15-8aa4-00e1dfce59de-catalog-content\") pod \"certified-operators-rh2fp\" (UID: \"86137512-bbf8-4f15-8aa4-00e1dfce59de\") " pod="openshift-marketplace/certified-operators-rh2fp" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.931451 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m87lx\" (UniqueName: \"kubernetes.io/projected/86137512-bbf8-4f15-8aa4-00e1dfce59de-kube-api-access-m87lx\") pod \"certified-operators-rh2fp\" (UID: \"86137512-bbf8-4f15-8aa4-00e1dfce59de\") " pod="openshift-marketplace/certified-operators-rh2fp" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.931922 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86137512-bbf8-4f15-8aa4-00e1dfce59de-utilities\") pod \"certified-operators-rh2fp\" (UID: \"86137512-bbf8-4f15-8aa4-00e1dfce59de\") " pod="openshift-marketplace/certified-operators-rh2fp" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.931999 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86137512-bbf8-4f15-8aa4-00e1dfce59de-catalog-content\") pod \"certified-operators-rh2fp\" (UID: \"86137512-bbf8-4f15-8aa4-00e1dfce59de\") " pod="openshift-marketplace/certified-operators-rh2fp" Mar 19 17:14:05 crc kubenswrapper[4918]: I0319 17:14:05.957313 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m87lx\" (UniqueName: \"kubernetes.io/projected/86137512-bbf8-4f15-8aa4-00e1dfce59de-kube-api-access-m87lx\") pod \"certified-operators-rh2fp\" (UID: \"86137512-bbf8-4f15-8aa4-00e1dfce59de\") " pod="openshift-marketplace/certified-operators-rh2fp" Mar 19 17:14:06 crc kubenswrapper[4918]: I0319 17:14:06.058430 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rh2fp" Mar 19 17:14:06 crc kubenswrapper[4918]: I0319 17:14:06.118959 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565674-kcgq5" event={"ID":"e4012bc5-f224-4c07-aca2-334f6137ed4b","Type":"ContainerDied","Data":"ee11f562952ba28c157f9d7feec92af33ca751775c4d12ab06e1e36d8bc46c48"} Mar 19 17:14:06 crc kubenswrapper[4918]: I0319 17:14:06.119299 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee11f562952ba28c157f9d7feec92af33ca751775c4d12ab06e1e36d8bc46c48" Mar 19 17:14:06 crc kubenswrapper[4918]: I0319 17:14:06.119012 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565674-kcgq5" Mar 19 17:14:06 crc kubenswrapper[4918]: I0319 17:14:06.347571 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-pcjkk"] Mar 19 17:14:06 crc kubenswrapper[4918]: I0319 17:14:06.521116 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565668-4vsh5"] Mar 19 17:14:06 crc kubenswrapper[4918]: I0319 17:14:06.533616 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565668-4vsh5"] Mar 19 17:14:06 crc kubenswrapper[4918]: I0319 17:14:06.603121 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1975654a-8c6b-4bbd-a044-451ca4fa9412" path="/var/lib/kubelet/pods/1975654a-8c6b-4bbd-a044-451ca4fa9412/volumes" Mar 19 17:14:06 crc kubenswrapper[4918]: I0319 17:14:06.621664 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rh2fp"] Mar 19 17:14:06 crc kubenswrapper[4918]: E0319 17:14:06.652118 4918 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source 
docker://quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest: reading manifest latest in quay.io/openstack-k8s-operators/openstack-ansibleee-runner: received unexpected HTTP status: 502 Bad Gateway" image="quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest" Mar 19 17:14:06 crc kubenswrapper[4918]: E0319 17:14:06.652268 4918 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 17:14:06 crc kubenswrapper[4918]: container &Container{Name:ssh-known-hosts-edpm-deployment,Image:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,Command:[],Args:[ansible-runner run /runner -p osp.edpm.ssh_known_hosts -i ssh-known-hosts-edpm-deployment],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ANSIBLE_VERBOSITY,Value:2,ValueFrom:nil,},EnvVar{Name:RUNNER_PLAYBOOK,Value: Mar 19 17:14:06 crc kubenswrapper[4918]: osp.edpm.ssh_known_hosts Mar 19 17:14:06 crc kubenswrapper[4918]: Mar 19 17:14:06 crc kubenswrapper[4918]: ,ValueFrom:nil,},EnvVar{Name:RUNNER_EXTRA_VARS,Value: Mar 19 17:14:06 crc kubenswrapper[4918]: edpm_override_hosts: all Mar 19 17:14:06 crc kubenswrapper[4918]: edpm_service_type: ssh-known-hosts Mar 19 17:14:06 crc kubenswrapper[4918]: Mar 19 17:14:06 crc kubenswrapper[4918]: Mar 19 17:14:06 crc kubenswrapper[4918]: 
,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ssh-key-openstack-edpm-ipam,ReadOnly:false,MountPath:/runner/env/ssh_key/ssh_key_openstack-edpm-ipam,SubPath:ssh_key_openstack-edpm-ipam,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:inventory-0,ReadOnly:false,MountPath:/runner/inventory/inventory-0,SubPath:inventory-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jvcxw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:openstack-aee-default-env,},Optional:*true,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ssh-known-hosts-edpm-deployment-pcjkk_openstack(8ce6590d-02e8-4f83-aa5a-0f328daf1c1e): ErrImagePull: initializing source docker://quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest: reading manifest latest in quay.io/openstack-k8s-operators/openstack-ansibleee-runner: received unexpected HTTP status: 502 Bad Gateway Mar 19 17:14:06 crc kubenswrapper[4918]: > logger="UnhandledError" Mar 19 17:14:06 crc kubenswrapper[4918]: E0319 17:14:06.653414 4918 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ssh-known-hosts-edpm-deployment\" with ErrImagePull: \"initializing source docker://quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest: reading manifest latest in quay.io/openstack-k8s-operators/openstack-ansibleee-runner: received unexpected HTTP status: 502 Bad Gateway\"" pod="openstack/ssh-known-hosts-edpm-deployment-pcjkk" podUID="8ce6590d-02e8-4f83-aa5a-0f328daf1c1e" Mar 19 17:14:07 crc kubenswrapper[4918]: I0319 17:14:07.134897 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-pcjkk" event={"ID":"8ce6590d-02e8-4f83-aa5a-0f328daf1c1e","Type":"ContainerStarted","Data":"d66fdaf8bcd066865f8af68732805039d639d388f927aad2ddd59ce53fecbba0"} Mar 19 17:14:07 crc kubenswrapper[4918]: I0319 17:14:07.136886 4918 generic.go:334] "Generic (PLEG): container finished" podID="86137512-bbf8-4f15-8aa4-00e1dfce59de" containerID="63404e43e97b5c9f3d09062ef7fe53b649833a255aae24096a06e91f685c6b00" exitCode=0 Mar 19 17:14:07 crc kubenswrapper[4918]: I0319 17:14:07.136983 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rh2fp" event={"ID":"86137512-bbf8-4f15-8aa4-00e1dfce59de","Type":"ContainerDied","Data":"63404e43e97b5c9f3d09062ef7fe53b649833a255aae24096a06e91f685c6b00"} Mar 19 17:14:07 crc kubenswrapper[4918]: I0319 17:14:07.137060 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rh2fp" event={"ID":"86137512-bbf8-4f15-8aa4-00e1dfce59de","Type":"ContainerStarted","Data":"39d025cd9397e533df0c7b90c14f0df4f2a45d95bd7eca933c8bee9e346baedf"} Mar 19 17:14:07 crc kubenswrapper[4918]: E0319 17:14:07.140222 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ssh-known-hosts-edpm-deployment\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest\\\"\"" pod="openstack/ssh-known-hosts-edpm-deployment-pcjkk" podUID="8ce6590d-02e8-4f83-aa5a-0f328daf1c1e" Mar 19 17:14:08 crc kubenswrapper[4918]: E0319 17:14:08.151199 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ssh-known-hosts-edpm-deployment\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest\\\"\"" pod="openstack/ssh-known-hosts-edpm-deployment-pcjkk" podUID="8ce6590d-02e8-4f83-aa5a-0f328daf1c1e" Mar 19 17:14:09 crc kubenswrapper[4918]: I0319 17:14:09.159612 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rh2fp" event={"ID":"86137512-bbf8-4f15-8aa4-00e1dfce59de","Type":"ContainerStarted","Data":"51360c37988ab2eada0c83e67beed6cb16e6776301d2f8bea26b225bdaae4374"} Mar 19 17:14:10 crc kubenswrapper[4918]: I0319 17:14:10.168858 4918 generic.go:334] "Generic (PLEG): container finished" podID="86137512-bbf8-4f15-8aa4-00e1dfce59de" containerID="51360c37988ab2eada0c83e67beed6cb16e6776301d2f8bea26b225bdaae4374" exitCode=0 Mar 19 17:14:10 crc kubenswrapper[4918]: I0319 17:14:10.168900 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rh2fp" event={"ID":"86137512-bbf8-4f15-8aa4-00e1dfce59de","Type":"ContainerDied","Data":"51360c37988ab2eada0c83e67beed6cb16e6776301d2f8bea26b225bdaae4374"} Mar 19 17:14:11 crc kubenswrapper[4918]: I0319 17:14:11.182413 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rh2fp" event={"ID":"86137512-bbf8-4f15-8aa4-00e1dfce59de","Type":"ContainerStarted","Data":"417c9aeab7b48cc617272e1d4e98d955632ec78c877246c0b1b6da8c657e9468"} Mar 19 17:14:11 crc kubenswrapper[4918]: I0319 17:14:11.209083 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-rh2fp" podStartSLOduration=2.6918683850000003 podStartE2EDuration="6.209055752s" podCreationTimestamp="2026-03-19 17:14:05 +0000 UTC" firstStartedPulling="2026-03-19 17:14:07.140123093 +0000 UTC m=+2059.262322371" lastFinishedPulling="2026-03-19 17:14:10.65731048 +0000 UTC m=+2062.779509738" observedRunningTime="2026-03-19 17:14:11.204168898 +0000 UTC m=+2063.326368176" watchObservedRunningTime="2026-03-19 17:14:11.209055752 +0000 UTC m=+2063.331255010" Mar 19 17:14:16 crc kubenswrapper[4918]: I0319 17:14:16.059023 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rh2fp" Mar 19 17:14:16 crc kubenswrapper[4918]: I0319 17:14:16.059679 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rh2fp" Mar 19 17:14:16 crc kubenswrapper[4918]: I0319 17:14:16.121128 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rh2fp" Mar 19 17:14:16 crc kubenswrapper[4918]: I0319 17:14:16.274087 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rh2fp" Mar 19 17:14:16 crc kubenswrapper[4918]: I0319 17:14:16.362717 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rh2fp"] Mar 19 17:14:18 crc kubenswrapper[4918]: I0319 17:14:18.252617 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rh2fp" podUID="86137512-bbf8-4f15-8aa4-00e1dfce59de" containerName="registry-server" containerID="cri-o://417c9aeab7b48cc617272e1d4e98d955632ec78c877246c0b1b6da8c657e9468" gracePeriod=2 Mar 19 17:14:18 crc kubenswrapper[4918]: I0319 17:14:18.813841 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rh2fp" Mar 19 17:14:18 crc kubenswrapper[4918]: I0319 17:14:18.831932 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86137512-bbf8-4f15-8aa4-00e1dfce59de-utilities\") pod \"86137512-bbf8-4f15-8aa4-00e1dfce59de\" (UID: \"86137512-bbf8-4f15-8aa4-00e1dfce59de\") " Mar 19 17:14:18 crc kubenswrapper[4918]: I0319 17:14:18.832158 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86137512-bbf8-4f15-8aa4-00e1dfce59de-catalog-content\") pod \"86137512-bbf8-4f15-8aa4-00e1dfce59de\" (UID: \"86137512-bbf8-4f15-8aa4-00e1dfce59de\") " Mar 19 17:14:18 crc kubenswrapper[4918]: I0319 17:14:18.832202 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m87lx\" (UniqueName: \"kubernetes.io/projected/86137512-bbf8-4f15-8aa4-00e1dfce59de-kube-api-access-m87lx\") pod \"86137512-bbf8-4f15-8aa4-00e1dfce59de\" (UID: \"86137512-bbf8-4f15-8aa4-00e1dfce59de\") " Mar 19 17:14:18 crc kubenswrapper[4918]: I0319 17:14:18.832787 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86137512-bbf8-4f15-8aa4-00e1dfce59de-utilities" (OuterVolumeSpecName: "utilities") pod "86137512-bbf8-4f15-8aa4-00e1dfce59de" (UID: "86137512-bbf8-4f15-8aa4-00e1dfce59de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:14:18 crc kubenswrapper[4918]: I0319 17:14:18.838366 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86137512-bbf8-4f15-8aa4-00e1dfce59de-kube-api-access-m87lx" (OuterVolumeSpecName: "kube-api-access-m87lx") pod "86137512-bbf8-4f15-8aa4-00e1dfce59de" (UID: "86137512-bbf8-4f15-8aa4-00e1dfce59de"). InnerVolumeSpecName "kube-api-access-m87lx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:14:18 crc kubenswrapper[4918]: I0319 17:14:18.895872 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86137512-bbf8-4f15-8aa4-00e1dfce59de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86137512-bbf8-4f15-8aa4-00e1dfce59de" (UID: "86137512-bbf8-4f15-8aa4-00e1dfce59de"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:14:18 crc kubenswrapper[4918]: I0319 17:14:18.934657 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86137512-bbf8-4f15-8aa4-00e1dfce59de-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 17:14:18 crc kubenswrapper[4918]: I0319 17:14:18.934707 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m87lx\" (UniqueName: \"kubernetes.io/projected/86137512-bbf8-4f15-8aa4-00e1dfce59de-kube-api-access-m87lx\") on node \"crc\" DevicePath \"\"" Mar 19 17:14:18 crc kubenswrapper[4918]: I0319 17:14:18.934724 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86137512-bbf8-4f15-8aa4-00e1dfce59de-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 17:14:19 crc kubenswrapper[4918]: I0319 17:14:19.269165 4918 generic.go:334] "Generic (PLEG): container finished" podID="86137512-bbf8-4f15-8aa4-00e1dfce59de" containerID="417c9aeab7b48cc617272e1d4e98d955632ec78c877246c0b1b6da8c657e9468" exitCode=0 Mar 19 17:14:19 crc kubenswrapper[4918]: I0319 17:14:19.269207 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rh2fp" event={"ID":"86137512-bbf8-4f15-8aa4-00e1dfce59de","Type":"ContainerDied","Data":"417c9aeab7b48cc617272e1d4e98d955632ec78c877246c0b1b6da8c657e9468"} Mar 19 17:14:19 crc kubenswrapper[4918]: I0319 17:14:19.269235 4918 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-rh2fp" event={"ID":"86137512-bbf8-4f15-8aa4-00e1dfce59de","Type":"ContainerDied","Data":"39d025cd9397e533df0c7b90c14f0df4f2a45d95bd7eca933c8bee9e346baedf"} Mar 19 17:14:19 crc kubenswrapper[4918]: I0319 17:14:19.269252 4918 scope.go:117] "RemoveContainer" containerID="417c9aeab7b48cc617272e1d4e98d955632ec78c877246c0b1b6da8c657e9468" Mar 19 17:14:19 crc kubenswrapper[4918]: I0319 17:14:19.269247 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rh2fp" Mar 19 17:14:19 crc kubenswrapper[4918]: I0319 17:14:19.292731 4918 scope.go:117] "RemoveContainer" containerID="51360c37988ab2eada0c83e67beed6cb16e6776301d2f8bea26b225bdaae4374" Mar 19 17:14:19 crc kubenswrapper[4918]: I0319 17:14:19.307572 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rh2fp"] Mar 19 17:14:19 crc kubenswrapper[4918]: I0319 17:14:19.317282 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rh2fp"] Mar 19 17:14:19 crc kubenswrapper[4918]: I0319 17:14:19.323886 4918 scope.go:117] "RemoveContainer" containerID="63404e43e97b5c9f3d09062ef7fe53b649833a255aae24096a06e91f685c6b00" Mar 19 17:14:19 crc kubenswrapper[4918]: I0319 17:14:19.437683 4918 scope.go:117] "RemoveContainer" containerID="417c9aeab7b48cc617272e1d4e98d955632ec78c877246c0b1b6da8c657e9468" Mar 19 17:14:19 crc kubenswrapper[4918]: E0319 17:14:19.438266 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"417c9aeab7b48cc617272e1d4e98d955632ec78c877246c0b1b6da8c657e9468\": container with ID starting with 417c9aeab7b48cc617272e1d4e98d955632ec78c877246c0b1b6da8c657e9468 not found: ID does not exist" containerID="417c9aeab7b48cc617272e1d4e98d955632ec78c877246c0b1b6da8c657e9468" Mar 19 17:14:19 crc kubenswrapper[4918]: I0319 
17:14:19.438300 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"417c9aeab7b48cc617272e1d4e98d955632ec78c877246c0b1b6da8c657e9468"} err="failed to get container status \"417c9aeab7b48cc617272e1d4e98d955632ec78c877246c0b1b6da8c657e9468\": rpc error: code = NotFound desc = could not find container \"417c9aeab7b48cc617272e1d4e98d955632ec78c877246c0b1b6da8c657e9468\": container with ID starting with 417c9aeab7b48cc617272e1d4e98d955632ec78c877246c0b1b6da8c657e9468 not found: ID does not exist" Mar 19 17:14:19 crc kubenswrapper[4918]: I0319 17:14:19.438322 4918 scope.go:117] "RemoveContainer" containerID="51360c37988ab2eada0c83e67beed6cb16e6776301d2f8bea26b225bdaae4374" Mar 19 17:14:19 crc kubenswrapper[4918]: E0319 17:14:19.438716 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51360c37988ab2eada0c83e67beed6cb16e6776301d2f8bea26b225bdaae4374\": container with ID starting with 51360c37988ab2eada0c83e67beed6cb16e6776301d2f8bea26b225bdaae4374 not found: ID does not exist" containerID="51360c37988ab2eada0c83e67beed6cb16e6776301d2f8bea26b225bdaae4374" Mar 19 17:14:19 crc kubenswrapper[4918]: I0319 17:14:19.438740 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51360c37988ab2eada0c83e67beed6cb16e6776301d2f8bea26b225bdaae4374"} err="failed to get container status \"51360c37988ab2eada0c83e67beed6cb16e6776301d2f8bea26b225bdaae4374\": rpc error: code = NotFound desc = could not find container \"51360c37988ab2eada0c83e67beed6cb16e6776301d2f8bea26b225bdaae4374\": container with ID starting with 51360c37988ab2eada0c83e67beed6cb16e6776301d2f8bea26b225bdaae4374 not found: ID does not exist" Mar 19 17:14:19 crc kubenswrapper[4918]: I0319 17:14:19.438754 4918 scope.go:117] "RemoveContainer" containerID="63404e43e97b5c9f3d09062ef7fe53b649833a255aae24096a06e91f685c6b00" Mar 19 17:14:19 crc 
kubenswrapper[4918]: E0319 17:14:19.439586 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63404e43e97b5c9f3d09062ef7fe53b649833a255aae24096a06e91f685c6b00\": container with ID starting with 63404e43e97b5c9f3d09062ef7fe53b649833a255aae24096a06e91f685c6b00 not found: ID does not exist" containerID="63404e43e97b5c9f3d09062ef7fe53b649833a255aae24096a06e91f685c6b00" Mar 19 17:14:19 crc kubenswrapper[4918]: I0319 17:14:19.439609 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63404e43e97b5c9f3d09062ef7fe53b649833a255aae24096a06e91f685c6b00"} err="failed to get container status \"63404e43e97b5c9f3d09062ef7fe53b649833a255aae24096a06e91f685c6b00\": rpc error: code = NotFound desc = could not find container \"63404e43e97b5c9f3d09062ef7fe53b649833a255aae24096a06e91f685c6b00\": container with ID starting with 63404e43e97b5c9f3d09062ef7fe53b649833a255aae24096a06e91f685c6b00 not found: ID does not exist" Mar 19 17:14:20 crc kubenswrapper[4918]: I0319 17:14:20.596925 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86137512-bbf8-4f15-8aa4-00e1dfce59de" path="/var/lib/kubelet/pods/86137512-bbf8-4f15-8aa4-00e1dfce59de/volumes" Mar 19 17:14:21 crc kubenswrapper[4918]: I0319 17:14:21.289155 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-pcjkk" event={"ID":"8ce6590d-02e8-4f83-aa5a-0f328daf1c1e","Type":"ContainerStarted","Data":"d2a6b702142dde6eb1a41ce93ddb4281509cbb1d8fc0602648a9eef7f31d64e6"} Mar 19 17:14:21 crc kubenswrapper[4918]: I0319 17:14:21.325464 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-pcjkk" podStartSLOduration=2.400513378 podStartE2EDuration="16.325444327s" podCreationTimestamp="2026-03-19 17:14:05 +0000 UTC" firstStartedPulling="2026-03-19 17:14:06.340191528 +0000 UTC 
m=+2058.462390776" lastFinishedPulling="2026-03-19 17:14:20.265122477 +0000 UTC m=+2072.387321725" observedRunningTime="2026-03-19 17:14:21.306170739 +0000 UTC m=+2073.428369987" watchObservedRunningTime="2026-03-19 17:14:21.325444327 +0000 UTC m=+2073.447643585" Mar 19 17:14:27 crc kubenswrapper[4918]: I0319 17:14:27.349264 4918 generic.go:334] "Generic (PLEG): container finished" podID="8ce6590d-02e8-4f83-aa5a-0f328daf1c1e" containerID="d2a6b702142dde6eb1a41ce93ddb4281509cbb1d8fc0602648a9eef7f31d64e6" exitCode=0 Mar 19 17:14:27 crc kubenswrapper[4918]: I0319 17:14:27.349325 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-pcjkk" event={"ID":"8ce6590d-02e8-4f83-aa5a-0f328daf1c1e","Type":"ContainerDied","Data":"d2a6b702142dde6eb1a41ce93ddb4281509cbb1d8fc0602648a9eef7f31d64e6"} Mar 19 17:14:28 crc kubenswrapper[4918]: I0319 17:14:28.211764 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:14:28 crc kubenswrapper[4918]: I0319 17:14:28.211845 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:14:28 crc kubenswrapper[4918]: I0319 17:14:28.211912 4918 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 17:14:28 crc kubenswrapper[4918]: I0319 17:14:28.213147 4918 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"fa72b2dda22c6725c1ba0de7800b932e612ef8adf06d0be5f45d4b0d25a364f6"} pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 17:14:28 crc kubenswrapper[4918]: I0319 17:14:28.213274 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" containerID="cri-o://fa72b2dda22c6725c1ba0de7800b932e612ef8adf06d0be5f45d4b0d25a364f6" gracePeriod=600 Mar 19 17:14:28 crc kubenswrapper[4918]: I0319 17:14:28.379358 4918 generic.go:334] "Generic (PLEG): container finished" podID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerID="fa72b2dda22c6725c1ba0de7800b932e612ef8adf06d0be5f45d4b0d25a364f6" exitCode=0 Mar 19 17:14:28 crc kubenswrapper[4918]: I0319 17:14:28.379433 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerDied","Data":"fa72b2dda22c6725c1ba0de7800b932e612ef8adf06d0be5f45d4b0d25a364f6"} Mar 19 17:14:28 crc kubenswrapper[4918]: I0319 17:14:28.379493 4918 scope.go:117] "RemoveContainer" containerID="de897422b0f495ea1e966fd0e478262b09b5d3d069201288c2a1442948d93440" Mar 19 17:14:28 crc kubenswrapper[4918]: I0319 17:14:28.975062 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-pcjkk" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.069513 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8ce6590d-02e8-4f83-aa5a-0f328daf1c1e-ssh-key-openstack-edpm-ipam\") pod \"8ce6590d-02e8-4f83-aa5a-0f328daf1c1e\" (UID: \"8ce6590d-02e8-4f83-aa5a-0f328daf1c1e\") " Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.069614 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvcxw\" (UniqueName: \"kubernetes.io/projected/8ce6590d-02e8-4f83-aa5a-0f328daf1c1e-kube-api-access-jvcxw\") pod \"8ce6590d-02e8-4f83-aa5a-0f328daf1c1e\" (UID: \"8ce6590d-02e8-4f83-aa5a-0f328daf1c1e\") " Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.069919 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8ce6590d-02e8-4f83-aa5a-0f328daf1c1e-inventory-0\") pod \"8ce6590d-02e8-4f83-aa5a-0f328daf1c1e\" (UID: \"8ce6590d-02e8-4f83-aa5a-0f328daf1c1e\") " Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.076874 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ce6590d-02e8-4f83-aa5a-0f328daf1c1e-kube-api-access-jvcxw" (OuterVolumeSpecName: "kube-api-access-jvcxw") pod "8ce6590d-02e8-4f83-aa5a-0f328daf1c1e" (UID: "8ce6590d-02e8-4f83-aa5a-0f328daf1c1e"). InnerVolumeSpecName "kube-api-access-jvcxw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.103397 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ce6590d-02e8-4f83-aa5a-0f328daf1c1e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8ce6590d-02e8-4f83-aa5a-0f328daf1c1e" (UID: "8ce6590d-02e8-4f83-aa5a-0f328daf1c1e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.110733 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ce6590d-02e8-4f83-aa5a-0f328daf1c1e-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "8ce6590d-02e8-4f83-aa5a-0f328daf1c1e" (UID: "8ce6590d-02e8-4f83-aa5a-0f328daf1c1e"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.172163 4918 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8ce6590d-02e8-4f83-aa5a-0f328daf1c1e-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.172200 4918 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8ce6590d-02e8-4f83-aa5a-0f328daf1c1e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.172214 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvcxw\" (UniqueName: \"kubernetes.io/projected/8ce6590d-02e8-4f83-aa5a-0f328daf1c1e-kube-api-access-jvcxw\") on node \"crc\" DevicePath \"\"" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.408382 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-pcjkk" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.408382 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-pcjkk" event={"ID":"8ce6590d-02e8-4f83-aa5a-0f328daf1c1e","Type":"ContainerDied","Data":"d66fdaf8bcd066865f8af68732805039d639d388f927aad2ddd59ce53fecbba0"} Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.409229 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d66fdaf8bcd066865f8af68732805039d639d388f927aad2ddd59ce53fecbba0" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.414057 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerStarted","Data":"21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74"} Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.469174 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gn2zh"] Mar 19 17:14:29 crc kubenswrapper[4918]: E0319 17:14:29.469703 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86137512-bbf8-4f15-8aa4-00e1dfce59de" containerName="extract-utilities" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.469726 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="86137512-bbf8-4f15-8aa4-00e1dfce59de" containerName="extract-utilities" Mar 19 17:14:29 crc kubenswrapper[4918]: E0319 17:14:29.469753 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ce6590d-02e8-4f83-aa5a-0f328daf1c1e" containerName="ssh-known-hosts-edpm-deployment" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.469764 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ce6590d-02e8-4f83-aa5a-0f328daf1c1e" containerName="ssh-known-hosts-edpm-deployment" Mar 19 17:14:29 crc 
kubenswrapper[4918]: E0319 17:14:29.469776 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86137512-bbf8-4f15-8aa4-00e1dfce59de" containerName="registry-server" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.469785 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="86137512-bbf8-4f15-8aa4-00e1dfce59de" containerName="registry-server" Mar 19 17:14:29 crc kubenswrapper[4918]: E0319 17:14:29.469837 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86137512-bbf8-4f15-8aa4-00e1dfce59de" containerName="extract-content" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.469847 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="86137512-bbf8-4f15-8aa4-00e1dfce59de" containerName="extract-content" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.470093 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ce6590d-02e8-4f83-aa5a-0f328daf1c1e" containerName="ssh-known-hosts-edpm-deployment" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.470129 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="86137512-bbf8-4f15-8aa4-00e1dfce59de" containerName="registry-server" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.471121 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gn2zh" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.473378 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.473704 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.473861 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4jldg" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.474020 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.478741 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55qcd\" (UniqueName: \"kubernetes.io/projected/9a03a82a-25da-4509-a94d-26b2c686f8f3-kube-api-access-55qcd\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gn2zh\" (UID: \"9a03a82a-25da-4509-a94d-26b2c686f8f3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gn2zh" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.478805 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a03a82a-25da-4509-a94d-26b2c686f8f3-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gn2zh\" (UID: \"9a03a82a-25da-4509-a94d-26b2c686f8f3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gn2zh" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.478873 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/9a03a82a-25da-4509-a94d-26b2c686f8f3-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gn2zh\" (UID: \"9a03a82a-25da-4509-a94d-26b2c686f8f3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gn2zh" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.484470 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gn2zh"] Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.580906 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55qcd\" (UniqueName: \"kubernetes.io/projected/9a03a82a-25da-4509-a94d-26b2c686f8f3-kube-api-access-55qcd\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gn2zh\" (UID: \"9a03a82a-25da-4509-a94d-26b2c686f8f3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gn2zh" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.580969 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a03a82a-25da-4509-a94d-26b2c686f8f3-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gn2zh\" (UID: \"9a03a82a-25da-4509-a94d-26b2c686f8f3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gn2zh" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.581077 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a03a82a-25da-4509-a94d-26b2c686f8f3-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gn2zh\" (UID: \"9a03a82a-25da-4509-a94d-26b2c686f8f3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gn2zh" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.586397 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/9a03a82a-25da-4509-a94d-26b2c686f8f3-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gn2zh\" (UID: \"9a03a82a-25da-4509-a94d-26b2c686f8f3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gn2zh" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.586756 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a03a82a-25da-4509-a94d-26b2c686f8f3-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gn2zh\" (UID: \"9a03a82a-25da-4509-a94d-26b2c686f8f3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gn2zh" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.601142 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55qcd\" (UniqueName: \"kubernetes.io/projected/9a03a82a-25da-4509-a94d-26b2c686f8f3-kube-api-access-55qcd\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gn2zh\" (UID: \"9a03a82a-25da-4509-a94d-26b2c686f8f3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gn2zh" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.790810 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gn2zh" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.795708 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-czwrc"] Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.797925 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-czwrc" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.811636 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-czwrc"] Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.989006 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/744eeec3-bdd1-490d-ae28-836c0de3e295-utilities\") pod \"redhat-marketplace-czwrc\" (UID: \"744eeec3-bdd1-490d-ae28-836c0de3e295\") " pod="openshift-marketplace/redhat-marketplace-czwrc" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.989056 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwftg\" (UniqueName: \"kubernetes.io/projected/744eeec3-bdd1-490d-ae28-836c0de3e295-kube-api-access-nwftg\") pod \"redhat-marketplace-czwrc\" (UID: \"744eeec3-bdd1-490d-ae28-836c0de3e295\") " pod="openshift-marketplace/redhat-marketplace-czwrc" Mar 19 17:14:29 crc kubenswrapper[4918]: I0319 17:14:29.993514 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/744eeec3-bdd1-490d-ae28-836c0de3e295-catalog-content\") pod \"redhat-marketplace-czwrc\" (UID: \"744eeec3-bdd1-490d-ae28-836c0de3e295\") " pod="openshift-marketplace/redhat-marketplace-czwrc" Mar 19 17:14:30 crc kubenswrapper[4918]: I0319 17:14:30.096569 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwftg\" (UniqueName: \"kubernetes.io/projected/744eeec3-bdd1-490d-ae28-836c0de3e295-kube-api-access-nwftg\") pod \"redhat-marketplace-czwrc\" (UID: \"744eeec3-bdd1-490d-ae28-836c0de3e295\") " pod="openshift-marketplace/redhat-marketplace-czwrc" Mar 19 17:14:30 crc kubenswrapper[4918]: I0319 17:14:30.096743 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/744eeec3-bdd1-490d-ae28-836c0de3e295-catalog-content\") pod \"redhat-marketplace-czwrc\" (UID: \"744eeec3-bdd1-490d-ae28-836c0de3e295\") " pod="openshift-marketplace/redhat-marketplace-czwrc" Mar 19 17:14:30 crc kubenswrapper[4918]: I0319 17:14:30.096864 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/744eeec3-bdd1-490d-ae28-836c0de3e295-utilities\") pod \"redhat-marketplace-czwrc\" (UID: \"744eeec3-bdd1-490d-ae28-836c0de3e295\") " pod="openshift-marketplace/redhat-marketplace-czwrc" Mar 19 17:14:30 crc kubenswrapper[4918]: I0319 17:14:30.097463 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/744eeec3-bdd1-490d-ae28-836c0de3e295-utilities\") pod \"redhat-marketplace-czwrc\" (UID: \"744eeec3-bdd1-490d-ae28-836c0de3e295\") " pod="openshift-marketplace/redhat-marketplace-czwrc" Mar 19 17:14:30 crc kubenswrapper[4918]: I0319 17:14:30.097821 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/744eeec3-bdd1-490d-ae28-836c0de3e295-catalog-content\") pod \"redhat-marketplace-czwrc\" (UID: \"744eeec3-bdd1-490d-ae28-836c0de3e295\") " pod="openshift-marketplace/redhat-marketplace-czwrc" Mar 19 17:14:30 crc kubenswrapper[4918]: I0319 17:14:30.117886 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwftg\" (UniqueName: \"kubernetes.io/projected/744eeec3-bdd1-490d-ae28-836c0de3e295-kube-api-access-nwftg\") pod \"redhat-marketplace-czwrc\" (UID: \"744eeec3-bdd1-490d-ae28-836c0de3e295\") " pod="openshift-marketplace/redhat-marketplace-czwrc" Mar 19 17:14:30 crc kubenswrapper[4918]: I0319 17:14:30.225069 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-czwrc" Mar 19 17:14:30 crc kubenswrapper[4918]: I0319 17:14:30.443833 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gn2zh"] Mar 19 17:14:30 crc kubenswrapper[4918]: I0319 17:14:30.719260 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-czwrc"] Mar 19 17:14:30 crc kubenswrapper[4918]: W0319 17:14:30.723617 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod744eeec3_bdd1_490d_ae28_836c0de3e295.slice/crio-6b2457774d61602722970d82a1d0edd653879daf13f3ed15e2ef12f0b85fe453 WatchSource:0}: Error finding container 6b2457774d61602722970d82a1d0edd653879daf13f3ed15e2ef12f0b85fe453: Status 404 returned error can't find the container with id 6b2457774d61602722970d82a1d0edd653879daf13f3ed15e2ef12f0b85fe453 Mar 19 17:14:31 crc kubenswrapper[4918]: I0319 17:14:31.441315 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gn2zh" event={"ID":"9a03a82a-25da-4509-a94d-26b2c686f8f3","Type":"ContainerStarted","Data":"6587fe904d143fbb8310d34289b7dab4652cee244535009b993ce958e392df6c"} Mar 19 17:14:31 crc kubenswrapper[4918]: I0319 17:14:31.441393 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gn2zh" event={"ID":"9a03a82a-25da-4509-a94d-26b2c686f8f3","Type":"ContainerStarted","Data":"f7a5312bc14ee104db6149a1d0342917acd4055843d9b38297279ed025a662d8"} Mar 19 17:14:31 crc kubenswrapper[4918]: I0319 17:14:31.443710 4918 generic.go:334] "Generic (PLEG): container finished" podID="744eeec3-bdd1-490d-ae28-836c0de3e295" containerID="5b364890d6202a07ed4b95066154d12e0f5441ca74d56eaf435564f13a8f8d1e" exitCode=0 Mar 19 17:14:31 crc kubenswrapper[4918]: I0319 17:14:31.443775 4918 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-czwrc" event={"ID":"744eeec3-bdd1-490d-ae28-836c0de3e295","Type":"ContainerDied","Data":"5b364890d6202a07ed4b95066154d12e0f5441ca74d56eaf435564f13a8f8d1e"} Mar 19 17:14:31 crc kubenswrapper[4918]: I0319 17:14:31.443807 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-czwrc" event={"ID":"744eeec3-bdd1-490d-ae28-836c0de3e295","Type":"ContainerStarted","Data":"6b2457774d61602722970d82a1d0edd653879daf13f3ed15e2ef12f0b85fe453"} Mar 19 17:14:31 crc kubenswrapper[4918]: I0319 17:14:31.480647 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gn2zh" podStartSLOduration=2.076954441 podStartE2EDuration="2.480626185s" podCreationTimestamp="2026-03-19 17:14:29 +0000 UTC" firstStartedPulling="2026-03-19 17:14:30.449742001 +0000 UTC m=+2082.571941249" lastFinishedPulling="2026-03-19 17:14:30.853413745 +0000 UTC m=+2082.975612993" observedRunningTime="2026-03-19 17:14:31.462152049 +0000 UTC m=+2083.584351297" watchObservedRunningTime="2026-03-19 17:14:31.480626185 +0000 UTC m=+2083.602825433" Mar 19 17:14:33 crc kubenswrapper[4918]: I0319 17:14:33.466299 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-czwrc" event={"ID":"744eeec3-bdd1-490d-ae28-836c0de3e295","Type":"ContainerStarted","Data":"582136f7c59ea2ed8aff72e17cb7bdab97edb3bf6c0a3ede05ba3c1c4fcaaead"} Mar 19 17:14:34 crc kubenswrapper[4918]: I0319 17:14:34.477334 4918 generic.go:334] "Generic (PLEG): container finished" podID="744eeec3-bdd1-490d-ae28-836c0de3e295" containerID="582136f7c59ea2ed8aff72e17cb7bdab97edb3bf6c0a3ede05ba3c1c4fcaaead" exitCode=0 Mar 19 17:14:34 crc kubenswrapper[4918]: I0319 17:14:34.477387 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-czwrc" 
event={"ID":"744eeec3-bdd1-490d-ae28-836c0de3e295","Type":"ContainerDied","Data":"582136f7c59ea2ed8aff72e17cb7bdab97edb3bf6c0a3ede05ba3c1c4fcaaead"} Mar 19 17:14:35 crc kubenswrapper[4918]: I0319 17:14:35.489081 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-czwrc" event={"ID":"744eeec3-bdd1-490d-ae28-836c0de3e295","Type":"ContainerStarted","Data":"9f6147df75681476a40ef555bbcd97e0027aa201ccb6f99389046d3121892d1d"} Mar 19 17:14:35 crc kubenswrapper[4918]: I0319 17:14:35.525001 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-czwrc" podStartSLOduration=2.972158785 podStartE2EDuration="6.52497979s" podCreationTimestamp="2026-03-19 17:14:29 +0000 UTC" firstStartedPulling="2026-03-19 17:14:31.445894673 +0000 UTC m=+2083.568093921" lastFinishedPulling="2026-03-19 17:14:34.998715678 +0000 UTC m=+2087.120914926" observedRunningTime="2026-03-19 17:14:35.516330733 +0000 UTC m=+2087.638530021" watchObservedRunningTime="2026-03-19 17:14:35.52497979 +0000 UTC m=+2087.647179048" Mar 19 17:14:40 crc kubenswrapper[4918]: I0319 17:14:40.226369 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-czwrc" Mar 19 17:14:40 crc kubenswrapper[4918]: I0319 17:14:40.227386 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-czwrc" Mar 19 17:14:40 crc kubenswrapper[4918]: I0319 17:14:40.297427 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-czwrc" Mar 19 17:14:40 crc kubenswrapper[4918]: I0319 17:14:40.555751 4918 generic.go:334] "Generic (PLEG): container finished" podID="9a03a82a-25da-4509-a94d-26b2c686f8f3" containerID="6587fe904d143fbb8310d34289b7dab4652cee244535009b993ce958e392df6c" exitCode=0 Mar 19 17:14:40 crc kubenswrapper[4918]: I0319 17:14:40.555835 4918 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gn2zh" event={"ID":"9a03a82a-25da-4509-a94d-26b2c686f8f3","Type":"ContainerDied","Data":"6587fe904d143fbb8310d34289b7dab4652cee244535009b993ce958e392df6c"} Mar 19 17:14:40 crc kubenswrapper[4918]: I0319 17:14:40.634312 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-czwrc" Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.155246 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gn2zh" Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.178151 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-czwrc"] Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.274399 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a03a82a-25da-4509-a94d-26b2c686f8f3-inventory\") pod \"9a03a82a-25da-4509-a94d-26b2c686f8f3\" (UID: \"9a03a82a-25da-4509-a94d-26b2c686f8f3\") " Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.274862 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55qcd\" (UniqueName: \"kubernetes.io/projected/9a03a82a-25da-4509-a94d-26b2c686f8f3-kube-api-access-55qcd\") pod \"9a03a82a-25da-4509-a94d-26b2c686f8f3\" (UID: \"9a03a82a-25da-4509-a94d-26b2c686f8f3\") " Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.275055 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a03a82a-25da-4509-a94d-26b2c686f8f3-ssh-key-openstack-edpm-ipam\") pod \"9a03a82a-25da-4509-a94d-26b2c686f8f3\" (UID: \"9a03a82a-25da-4509-a94d-26b2c686f8f3\") " Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.280790 4918 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a03a82a-25da-4509-a94d-26b2c686f8f3-kube-api-access-55qcd" (OuterVolumeSpecName: "kube-api-access-55qcd") pod "9a03a82a-25da-4509-a94d-26b2c686f8f3" (UID: "9a03a82a-25da-4509-a94d-26b2c686f8f3"). InnerVolumeSpecName "kube-api-access-55qcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.310280 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a03a82a-25da-4509-a94d-26b2c686f8f3-inventory" (OuterVolumeSpecName: "inventory") pod "9a03a82a-25da-4509-a94d-26b2c686f8f3" (UID: "9a03a82a-25da-4509-a94d-26b2c686f8f3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.329512 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a03a82a-25da-4509-a94d-26b2c686f8f3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9a03a82a-25da-4509-a94d-26b2c686f8f3" (UID: "9a03a82a-25da-4509-a94d-26b2c686f8f3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.377609 4918 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a03a82a-25da-4509-a94d-26b2c686f8f3-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.377642 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55qcd\" (UniqueName: \"kubernetes.io/projected/9a03a82a-25da-4509-a94d-26b2c686f8f3-kube-api-access-55qcd\") on node \"crc\" DevicePath \"\"" Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.377653 4918 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a03a82a-25da-4509-a94d-26b2c686f8f3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.575226 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gn2zh" event={"ID":"9a03a82a-25da-4509-a94d-26b2c686f8f3","Type":"ContainerDied","Data":"f7a5312bc14ee104db6149a1d0342917acd4055843d9b38297279ed025a662d8"} Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.575285 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7a5312bc14ee104db6149a1d0342917acd4055843d9b38297279ed025a662d8" Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.575329 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gn2zh" Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.575360 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-czwrc" podUID="744eeec3-bdd1-490d-ae28-836c0de3e295" containerName="registry-server" containerID="cri-o://9f6147df75681476a40ef555bbcd97e0027aa201ccb6f99389046d3121892d1d" gracePeriod=2 Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.696807 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw"] Mar 19 17:14:42 crc kubenswrapper[4918]: E0319 17:14:42.697305 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a03a82a-25da-4509-a94d-26b2c686f8f3" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.697323 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a03a82a-25da-4509-a94d-26b2c686f8f3" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.697532 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a03a82a-25da-4509-a94d-26b2c686f8f3" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.698326 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw" Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.704874 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.704933 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.705839 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.706438 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4jldg" Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.725977 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw"] Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.796369 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/81bba2ec-df16-4e0c-8afd-8e3e872f17fa-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw\" (UID: \"81bba2ec-df16-4e0c-8afd-8e3e872f17fa\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw" Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.798211 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81bba2ec-df16-4e0c-8afd-8e3e872f17fa-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw\" (UID: \"81bba2ec-df16-4e0c-8afd-8e3e872f17fa\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw" Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.798344 4918 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65rmt\" (UniqueName: \"kubernetes.io/projected/81bba2ec-df16-4e0c-8afd-8e3e872f17fa-kube-api-access-65rmt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw\" (UID: \"81bba2ec-df16-4e0c-8afd-8e3e872f17fa\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw" Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.900897 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81bba2ec-df16-4e0c-8afd-8e3e872f17fa-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw\" (UID: \"81bba2ec-df16-4e0c-8afd-8e3e872f17fa\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw" Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.901367 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65rmt\" (UniqueName: \"kubernetes.io/projected/81bba2ec-df16-4e0c-8afd-8e3e872f17fa-kube-api-access-65rmt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw\" (UID: \"81bba2ec-df16-4e0c-8afd-8e3e872f17fa\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw" Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.901459 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/81bba2ec-df16-4e0c-8afd-8e3e872f17fa-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw\" (UID: \"81bba2ec-df16-4e0c-8afd-8e3e872f17fa\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw" Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.908469 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81bba2ec-df16-4e0c-8afd-8e3e872f17fa-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw\" (UID: \"81bba2ec-df16-4e0c-8afd-8e3e872f17fa\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw" Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.911091 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/81bba2ec-df16-4e0c-8afd-8e3e872f17fa-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw\" (UID: \"81bba2ec-df16-4e0c-8afd-8e3e872f17fa\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw" Mar 19 17:14:42 crc kubenswrapper[4918]: I0319 17:14:42.927186 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65rmt\" (UniqueName: \"kubernetes.io/projected/81bba2ec-df16-4e0c-8afd-8e3e872f17fa-kube-api-access-65rmt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw\" (UID: \"81bba2ec-df16-4e0c-8afd-8e3e872f17fa\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw" Mar 19 17:14:43 crc kubenswrapper[4918]: I0319 17:14:43.053083 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw" Mar 19 17:14:43 crc kubenswrapper[4918]: I0319 17:14:43.186024 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-czwrc" Mar 19 17:14:43 crc kubenswrapper[4918]: I0319 17:14:43.217147 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/744eeec3-bdd1-490d-ae28-836c0de3e295-utilities\") pod \"744eeec3-bdd1-490d-ae28-836c0de3e295\" (UID: \"744eeec3-bdd1-490d-ae28-836c0de3e295\") " Mar 19 17:14:43 crc kubenswrapper[4918]: I0319 17:14:43.217238 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwftg\" (UniqueName: \"kubernetes.io/projected/744eeec3-bdd1-490d-ae28-836c0de3e295-kube-api-access-nwftg\") pod \"744eeec3-bdd1-490d-ae28-836c0de3e295\" (UID: \"744eeec3-bdd1-490d-ae28-836c0de3e295\") " Mar 19 17:14:43 crc kubenswrapper[4918]: I0319 17:14:43.217408 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/744eeec3-bdd1-490d-ae28-836c0de3e295-catalog-content\") pod \"744eeec3-bdd1-490d-ae28-836c0de3e295\" (UID: \"744eeec3-bdd1-490d-ae28-836c0de3e295\") " Mar 19 17:14:43 crc kubenswrapper[4918]: I0319 17:14:43.219504 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/744eeec3-bdd1-490d-ae28-836c0de3e295-utilities" (OuterVolumeSpecName: "utilities") pod "744eeec3-bdd1-490d-ae28-836c0de3e295" (UID: "744eeec3-bdd1-490d-ae28-836c0de3e295"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:14:43 crc kubenswrapper[4918]: I0319 17:14:43.229819 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/744eeec3-bdd1-490d-ae28-836c0de3e295-kube-api-access-nwftg" (OuterVolumeSpecName: "kube-api-access-nwftg") pod "744eeec3-bdd1-490d-ae28-836c0de3e295" (UID: "744eeec3-bdd1-490d-ae28-836c0de3e295"). InnerVolumeSpecName "kube-api-access-nwftg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:14:43 crc kubenswrapper[4918]: I0319 17:14:43.255469 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/744eeec3-bdd1-490d-ae28-836c0de3e295-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "744eeec3-bdd1-490d-ae28-836c0de3e295" (UID: "744eeec3-bdd1-490d-ae28-836c0de3e295"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:14:43 crc kubenswrapper[4918]: I0319 17:14:43.320216 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/744eeec3-bdd1-490d-ae28-836c0de3e295-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 17:14:43 crc kubenswrapper[4918]: I0319 17:14:43.320247 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwftg\" (UniqueName: \"kubernetes.io/projected/744eeec3-bdd1-490d-ae28-836c0de3e295-kube-api-access-nwftg\") on node \"crc\" DevicePath \"\"" Mar 19 17:14:43 crc kubenswrapper[4918]: I0319 17:14:43.320258 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/744eeec3-bdd1-490d-ae28-836c0de3e295-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 17:14:43 crc kubenswrapper[4918]: I0319 17:14:43.587853 4918 generic.go:334] "Generic (PLEG): container finished" podID="744eeec3-bdd1-490d-ae28-836c0de3e295" containerID="9f6147df75681476a40ef555bbcd97e0027aa201ccb6f99389046d3121892d1d" exitCode=0 Mar 19 17:14:43 crc kubenswrapper[4918]: I0319 17:14:43.587900 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-czwrc" event={"ID":"744eeec3-bdd1-490d-ae28-836c0de3e295","Type":"ContainerDied","Data":"9f6147df75681476a40ef555bbcd97e0027aa201ccb6f99389046d3121892d1d"} Mar 19 17:14:43 crc kubenswrapper[4918]: I0319 17:14:43.587905 4918 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-czwrc" Mar 19 17:14:43 crc kubenswrapper[4918]: I0319 17:14:43.587964 4918 scope.go:117] "RemoveContainer" containerID="9f6147df75681476a40ef555bbcd97e0027aa201ccb6f99389046d3121892d1d" Mar 19 17:14:43 crc kubenswrapper[4918]: I0319 17:14:43.587952 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-czwrc" event={"ID":"744eeec3-bdd1-490d-ae28-836c0de3e295","Type":"ContainerDied","Data":"6b2457774d61602722970d82a1d0edd653879daf13f3ed15e2ef12f0b85fe453"} Mar 19 17:14:43 crc kubenswrapper[4918]: I0319 17:14:43.610463 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw"] Mar 19 17:14:43 crc kubenswrapper[4918]: I0319 17:14:43.620497 4918 scope.go:117] "RemoveContainer" containerID="582136f7c59ea2ed8aff72e17cb7bdab97edb3bf6c0a3ede05ba3c1c4fcaaead" Mar 19 17:14:43 crc kubenswrapper[4918]: W0319 17:14:43.627468 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81bba2ec_df16_4e0c_8afd_8e3e872f17fa.slice/crio-3765aa026c2e15ecff4bdf65c207762d8e2ed06d32e296f07049857f8f613f1e WatchSource:0}: Error finding container 3765aa026c2e15ecff4bdf65c207762d8e2ed06d32e296f07049857f8f613f1e: Status 404 returned error can't find the container with id 3765aa026c2e15ecff4bdf65c207762d8e2ed06d32e296f07049857f8f613f1e Mar 19 17:14:43 crc kubenswrapper[4918]: I0319 17:14:43.631785 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-czwrc"] Mar 19 17:14:43 crc kubenswrapper[4918]: I0319 17:14:43.641035 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-czwrc"] Mar 19 17:14:43 crc kubenswrapper[4918]: I0319 17:14:43.686977 4918 scope.go:117] "RemoveContainer" 
containerID="5b364890d6202a07ed4b95066154d12e0f5441ca74d56eaf435564f13a8f8d1e" Mar 19 17:14:43 crc kubenswrapper[4918]: I0319 17:14:43.733060 4918 scope.go:117] "RemoveContainer" containerID="9f6147df75681476a40ef555bbcd97e0027aa201ccb6f99389046d3121892d1d" Mar 19 17:14:43 crc kubenswrapper[4918]: E0319 17:14:43.733543 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f6147df75681476a40ef555bbcd97e0027aa201ccb6f99389046d3121892d1d\": container with ID starting with 9f6147df75681476a40ef555bbcd97e0027aa201ccb6f99389046d3121892d1d not found: ID does not exist" containerID="9f6147df75681476a40ef555bbcd97e0027aa201ccb6f99389046d3121892d1d" Mar 19 17:14:43 crc kubenswrapper[4918]: I0319 17:14:43.733586 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f6147df75681476a40ef555bbcd97e0027aa201ccb6f99389046d3121892d1d"} err="failed to get container status \"9f6147df75681476a40ef555bbcd97e0027aa201ccb6f99389046d3121892d1d\": rpc error: code = NotFound desc = could not find container \"9f6147df75681476a40ef555bbcd97e0027aa201ccb6f99389046d3121892d1d\": container with ID starting with 9f6147df75681476a40ef555bbcd97e0027aa201ccb6f99389046d3121892d1d not found: ID does not exist" Mar 19 17:14:43 crc kubenswrapper[4918]: I0319 17:14:43.733610 4918 scope.go:117] "RemoveContainer" containerID="582136f7c59ea2ed8aff72e17cb7bdab97edb3bf6c0a3ede05ba3c1c4fcaaead" Mar 19 17:14:43 crc kubenswrapper[4918]: E0319 17:14:43.734038 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"582136f7c59ea2ed8aff72e17cb7bdab97edb3bf6c0a3ede05ba3c1c4fcaaead\": container with ID starting with 582136f7c59ea2ed8aff72e17cb7bdab97edb3bf6c0a3ede05ba3c1c4fcaaead not found: ID does not exist" containerID="582136f7c59ea2ed8aff72e17cb7bdab97edb3bf6c0a3ede05ba3c1c4fcaaead" Mar 19 17:14:43 crc 
kubenswrapper[4918]: I0319 17:14:43.734071 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"582136f7c59ea2ed8aff72e17cb7bdab97edb3bf6c0a3ede05ba3c1c4fcaaead"} err="failed to get container status \"582136f7c59ea2ed8aff72e17cb7bdab97edb3bf6c0a3ede05ba3c1c4fcaaead\": rpc error: code = NotFound desc = could not find container \"582136f7c59ea2ed8aff72e17cb7bdab97edb3bf6c0a3ede05ba3c1c4fcaaead\": container with ID starting with 582136f7c59ea2ed8aff72e17cb7bdab97edb3bf6c0a3ede05ba3c1c4fcaaead not found: ID does not exist" Mar 19 17:14:43 crc kubenswrapper[4918]: I0319 17:14:43.734092 4918 scope.go:117] "RemoveContainer" containerID="5b364890d6202a07ed4b95066154d12e0f5441ca74d56eaf435564f13a8f8d1e" Mar 19 17:14:43 crc kubenswrapper[4918]: E0319 17:14:43.734538 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b364890d6202a07ed4b95066154d12e0f5441ca74d56eaf435564f13a8f8d1e\": container with ID starting with 5b364890d6202a07ed4b95066154d12e0f5441ca74d56eaf435564f13a8f8d1e not found: ID does not exist" containerID="5b364890d6202a07ed4b95066154d12e0f5441ca74d56eaf435564f13a8f8d1e" Mar 19 17:14:43 crc kubenswrapper[4918]: I0319 17:14:43.734560 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b364890d6202a07ed4b95066154d12e0f5441ca74d56eaf435564f13a8f8d1e"} err="failed to get container status \"5b364890d6202a07ed4b95066154d12e0f5441ca74d56eaf435564f13a8f8d1e\": rpc error: code = NotFound desc = could not find container \"5b364890d6202a07ed4b95066154d12e0f5441ca74d56eaf435564f13a8f8d1e\": container with ID starting with 5b364890d6202a07ed4b95066154d12e0f5441ca74d56eaf435564f13a8f8d1e not found: ID does not exist" Mar 19 17:14:44 crc kubenswrapper[4918]: I0319 17:14:44.597174 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="744eeec3-bdd1-490d-ae28-836c0de3e295" 
path="/var/lib/kubelet/pods/744eeec3-bdd1-490d-ae28-836c0de3e295/volumes" Mar 19 17:14:44 crc kubenswrapper[4918]: I0319 17:14:44.600324 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw" event={"ID":"81bba2ec-df16-4e0c-8afd-8e3e872f17fa","Type":"ContainerStarted","Data":"52d505df517813ae531d35b81d829fdb03b81916b81fd7fc62b0d9928419b524"} Mar 19 17:14:44 crc kubenswrapper[4918]: I0319 17:14:44.600373 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw" event={"ID":"81bba2ec-df16-4e0c-8afd-8e3e872f17fa","Type":"ContainerStarted","Data":"3765aa026c2e15ecff4bdf65c207762d8e2ed06d32e296f07049857f8f613f1e"} Mar 19 17:14:44 crc kubenswrapper[4918]: I0319 17:14:44.629831 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw" podStartSLOduration=2.234056114 podStartE2EDuration="2.629809191s" podCreationTimestamp="2026-03-19 17:14:42 +0000 UTC" firstStartedPulling="2026-03-19 17:14:43.629926787 +0000 UTC m=+2095.752126035" lastFinishedPulling="2026-03-19 17:14:44.025679854 +0000 UTC m=+2096.147879112" observedRunningTime="2026-03-19 17:14:44.629194995 +0000 UTC m=+2096.751394243" watchObservedRunningTime="2026-03-19 17:14:44.629809191 +0000 UTC m=+2096.752008449" Mar 19 17:14:53 crc kubenswrapper[4918]: I0319 17:14:53.710970 4918 generic.go:334] "Generic (PLEG): container finished" podID="81bba2ec-df16-4e0c-8afd-8e3e872f17fa" containerID="52d505df517813ae531d35b81d829fdb03b81916b81fd7fc62b0d9928419b524" exitCode=0 Mar 19 17:14:53 crc kubenswrapper[4918]: I0319 17:14:53.711097 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw" event={"ID":"81bba2ec-df16-4e0c-8afd-8e3e872f17fa","Type":"ContainerDied","Data":"52d505df517813ae531d35b81d829fdb03b81916b81fd7fc62b0d9928419b524"} Mar 19 
17:14:55 crc kubenswrapper[4918]: I0319 17:14:55.191975 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw" Mar 19 17:14:55 crc kubenswrapper[4918]: I0319 17:14:55.385542 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/81bba2ec-df16-4e0c-8afd-8e3e872f17fa-ssh-key-openstack-edpm-ipam\") pod \"81bba2ec-df16-4e0c-8afd-8e3e872f17fa\" (UID: \"81bba2ec-df16-4e0c-8afd-8e3e872f17fa\") " Mar 19 17:14:55 crc kubenswrapper[4918]: I0319 17:14:55.385628 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65rmt\" (UniqueName: \"kubernetes.io/projected/81bba2ec-df16-4e0c-8afd-8e3e872f17fa-kube-api-access-65rmt\") pod \"81bba2ec-df16-4e0c-8afd-8e3e872f17fa\" (UID: \"81bba2ec-df16-4e0c-8afd-8e3e872f17fa\") " Mar 19 17:14:55 crc kubenswrapper[4918]: I0319 17:14:55.385762 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81bba2ec-df16-4e0c-8afd-8e3e872f17fa-inventory\") pod \"81bba2ec-df16-4e0c-8afd-8e3e872f17fa\" (UID: \"81bba2ec-df16-4e0c-8afd-8e3e872f17fa\") " Mar 19 17:14:55 crc kubenswrapper[4918]: I0319 17:14:55.391630 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81bba2ec-df16-4e0c-8afd-8e3e872f17fa-kube-api-access-65rmt" (OuterVolumeSpecName: "kube-api-access-65rmt") pod "81bba2ec-df16-4e0c-8afd-8e3e872f17fa" (UID: "81bba2ec-df16-4e0c-8afd-8e3e872f17fa"). InnerVolumeSpecName "kube-api-access-65rmt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:14:55 crc kubenswrapper[4918]: I0319 17:14:55.419260 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81bba2ec-df16-4e0c-8afd-8e3e872f17fa-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "81bba2ec-df16-4e0c-8afd-8e3e872f17fa" (UID: "81bba2ec-df16-4e0c-8afd-8e3e872f17fa"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:14:55 crc kubenswrapper[4918]: I0319 17:14:55.427147 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81bba2ec-df16-4e0c-8afd-8e3e872f17fa-inventory" (OuterVolumeSpecName: "inventory") pod "81bba2ec-df16-4e0c-8afd-8e3e872f17fa" (UID: "81bba2ec-df16-4e0c-8afd-8e3e872f17fa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:14:55 crc kubenswrapper[4918]: I0319 17:14:55.488727 4918 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81bba2ec-df16-4e0c-8afd-8e3e872f17fa-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 17:14:55 crc kubenswrapper[4918]: I0319 17:14:55.488765 4918 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/81bba2ec-df16-4e0c-8afd-8e3e872f17fa-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 17:14:55 crc kubenswrapper[4918]: I0319 17:14:55.488781 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65rmt\" (UniqueName: \"kubernetes.io/projected/81bba2ec-df16-4e0c-8afd-8e3e872f17fa-kube-api-access-65rmt\") on node \"crc\" DevicePath \"\"" Mar 19 17:14:55 crc kubenswrapper[4918]: I0319 17:14:55.729757 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw" 
event={"ID":"81bba2ec-df16-4e0c-8afd-8e3e872f17fa","Type":"ContainerDied","Data":"3765aa026c2e15ecff4bdf65c207762d8e2ed06d32e296f07049857f8f613f1e"} Mar 19 17:14:55 crc kubenswrapper[4918]: I0319 17:14:55.729816 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3765aa026c2e15ecff4bdf65c207762d8e2ed06d32e296f07049857f8f613f1e" Mar 19 17:14:55 crc kubenswrapper[4918]: I0319 17:14:55.729778 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.292826 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx"] Mar 19 17:14:56 crc kubenswrapper[4918]: E0319 17:14:56.293373 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="744eeec3-bdd1-490d-ae28-836c0de3e295" containerName="registry-server" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.293391 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="744eeec3-bdd1-490d-ae28-836c0de3e295" containerName="registry-server" Mar 19 17:14:56 crc kubenswrapper[4918]: E0319 17:14:56.293411 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="744eeec3-bdd1-490d-ae28-836c0de3e295" containerName="extract-content" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.293419 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="744eeec3-bdd1-490d-ae28-836c0de3e295" containerName="extract-content" Mar 19 17:14:56 crc kubenswrapper[4918]: E0319 17:14:56.293432 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81bba2ec-df16-4e0c-8afd-8e3e872f17fa" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.293442 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="81bba2ec-df16-4e0c-8afd-8e3e872f17fa" 
containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 19 17:14:56 crc kubenswrapper[4918]: E0319 17:14:56.293481 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="744eeec3-bdd1-490d-ae28-836c0de3e295" containerName="extract-utilities" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.293489 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="744eeec3-bdd1-490d-ae28-836c0de3e295" containerName="extract-utilities" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.293755 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="81bba2ec-df16-4e0c-8afd-8e3e872f17fa" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.293769 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="744eeec3-bdd1-490d-ae28-836c0de3e295" containerName="registry-server" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.294499 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.298076 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.298362 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.298986 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.299271 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.299728 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.301038 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.301303 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.301725 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4jldg" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.306969 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.307058 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.307094 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.307147 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.307175 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.307238 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.307271 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.307298 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.307327 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: 
\"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.307359 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.307400 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.307431 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.307475 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdqfk\" (UniqueName: \"kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-kube-api-access-bdqfk\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: 
\"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.307497 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.309234 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx"] Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.408887 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.409171 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.409357 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.409895 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.410060 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.410156 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.410242 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.410338 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.410460 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.410589 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdqfk\" (UniqueName: \"kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-kube-api-access-bdqfk\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.410674 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-inventory\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.410788 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.410943 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.411377 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.413883 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.413925 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.414506 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.414785 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.414859 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 
17:14:56.415163 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.415223 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.416993 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.418208 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.418413 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.418590 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.418890 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.419720 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.425996 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdqfk\" (UniqueName: \"kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-kube-api-access-bdqfk\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:56 crc kubenswrapper[4918]: I0319 17:14:56.643435 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:14:57 crc kubenswrapper[4918]: I0319 17:14:57.787942 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx"] Mar 19 17:14:58 crc kubenswrapper[4918]: I0319 17:14:58.765349 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" event={"ID":"10b0ee10-0449-4ca7-bece-7942e3bf9f86","Type":"ContainerStarted","Data":"970cabc4d7bc2032e3b6ad1ddefed5ec5c35fe8950851c9d7f8b446760214ec3"} Mar 19 17:14:58 crc kubenswrapper[4918]: I0319 17:14:58.765760 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" event={"ID":"10b0ee10-0449-4ca7-bece-7942e3bf9f86","Type":"ContainerStarted","Data":"78c9bb87d748c490fbb12a79bc79576fa52ea72d8dcc20660501b2da3f845290"} Mar 19 17:14:58 crc kubenswrapper[4918]: I0319 17:14:58.800571 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" podStartSLOduration=2.362045708 podStartE2EDuration="2.800547356s" podCreationTimestamp="2026-03-19 17:14:56 +0000 UTC" firstStartedPulling="2026-03-19 17:14:57.799745847 +0000 UTC m=+2109.921945105" lastFinishedPulling="2026-03-19 17:14:58.238247475 +0000 UTC m=+2110.360446753" observedRunningTime="2026-03-19 17:14:58.79191428 +0000 UTC m=+2110.914113548" watchObservedRunningTime="2026-03-19 17:14:58.800547356 +0000 UTC m=+2110.922746604" Mar 19 17:15:00 crc kubenswrapper[4918]: I0319 17:15:00.140339 4918 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565675-d2fpf"] Mar 19 17:15:00 crc kubenswrapper[4918]: I0319 17:15:00.142614 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565675-d2fpf" Mar 19 17:15:00 crc kubenswrapper[4918]: I0319 17:15:00.144962 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 17:15:00 crc kubenswrapper[4918]: I0319 17:15:00.145216 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 17:15:00 crc kubenswrapper[4918]: I0319 17:15:00.151857 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565675-d2fpf"] Mar 19 17:15:00 crc kubenswrapper[4918]: I0319 17:15:00.257379 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22f4bb2c-9cf8-4750-99ea-9d86d180343f-config-volume\") pod \"collect-profiles-29565675-d2fpf\" (UID: \"22f4bb2c-9cf8-4750-99ea-9d86d180343f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565675-d2fpf" Mar 19 17:15:00 crc kubenswrapper[4918]: I0319 17:15:00.257489 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22f4bb2c-9cf8-4750-99ea-9d86d180343f-secret-volume\") pod \"collect-profiles-29565675-d2fpf\" (UID: \"22f4bb2c-9cf8-4750-99ea-9d86d180343f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565675-d2fpf" Mar 19 17:15:00 crc kubenswrapper[4918]: I0319 17:15:00.257515 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7qgl\" 
(UniqueName: \"kubernetes.io/projected/22f4bb2c-9cf8-4750-99ea-9d86d180343f-kube-api-access-z7qgl\") pod \"collect-profiles-29565675-d2fpf\" (UID: \"22f4bb2c-9cf8-4750-99ea-9d86d180343f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565675-d2fpf" Mar 19 17:15:00 crc kubenswrapper[4918]: I0319 17:15:00.360585 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22f4bb2c-9cf8-4750-99ea-9d86d180343f-config-volume\") pod \"collect-profiles-29565675-d2fpf\" (UID: \"22f4bb2c-9cf8-4750-99ea-9d86d180343f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565675-d2fpf" Mar 19 17:15:00 crc kubenswrapper[4918]: I0319 17:15:00.360884 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22f4bb2c-9cf8-4750-99ea-9d86d180343f-secret-volume\") pod \"collect-profiles-29565675-d2fpf\" (UID: \"22f4bb2c-9cf8-4750-99ea-9d86d180343f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565675-d2fpf" Mar 19 17:15:00 crc kubenswrapper[4918]: I0319 17:15:00.360967 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7qgl\" (UniqueName: \"kubernetes.io/projected/22f4bb2c-9cf8-4750-99ea-9d86d180343f-kube-api-access-z7qgl\") pod \"collect-profiles-29565675-d2fpf\" (UID: \"22f4bb2c-9cf8-4750-99ea-9d86d180343f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565675-d2fpf" Mar 19 17:15:00 crc kubenswrapper[4918]: I0319 17:15:00.363035 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22f4bb2c-9cf8-4750-99ea-9d86d180343f-config-volume\") pod \"collect-profiles-29565675-d2fpf\" (UID: \"22f4bb2c-9cf8-4750-99ea-9d86d180343f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565675-d2fpf" Mar 19 17:15:00 crc kubenswrapper[4918]: I0319 
17:15:00.379690 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22f4bb2c-9cf8-4750-99ea-9d86d180343f-secret-volume\") pod \"collect-profiles-29565675-d2fpf\" (UID: \"22f4bb2c-9cf8-4750-99ea-9d86d180343f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565675-d2fpf" Mar 19 17:15:00 crc kubenswrapper[4918]: I0319 17:15:00.384415 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7qgl\" (UniqueName: \"kubernetes.io/projected/22f4bb2c-9cf8-4750-99ea-9d86d180343f-kube-api-access-z7qgl\") pod \"collect-profiles-29565675-d2fpf\" (UID: \"22f4bb2c-9cf8-4750-99ea-9d86d180343f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565675-d2fpf" Mar 19 17:15:00 crc kubenswrapper[4918]: I0319 17:15:00.469018 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565675-d2fpf" Mar 19 17:15:00 crc kubenswrapper[4918]: I0319 17:15:00.982884 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565675-d2fpf"] Mar 19 17:15:01 crc kubenswrapper[4918]: I0319 17:15:01.803163 4918 generic.go:334] "Generic (PLEG): container finished" podID="22f4bb2c-9cf8-4750-99ea-9d86d180343f" containerID="6da16dc3e579b80b8736e16a0fb2f1d544e642f1704274b7f45275a8008a45a9" exitCode=0 Mar 19 17:15:01 crc kubenswrapper[4918]: I0319 17:15:01.803240 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565675-d2fpf" event={"ID":"22f4bb2c-9cf8-4750-99ea-9d86d180343f","Type":"ContainerDied","Data":"6da16dc3e579b80b8736e16a0fb2f1d544e642f1704274b7f45275a8008a45a9"} Mar 19 17:15:01 crc kubenswrapper[4918]: I0319 17:15:01.803386 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565675-d2fpf" 
event={"ID":"22f4bb2c-9cf8-4750-99ea-9d86d180343f","Type":"ContainerStarted","Data":"612b3ac72443f37d03230102bae6904bad085480d095f3f0b1f560ae03a273c6"} Mar 19 17:15:03 crc kubenswrapper[4918]: I0319 17:15:03.319025 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565675-d2fpf" Mar 19 17:15:03 crc kubenswrapper[4918]: I0319 17:15:03.349748 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22f4bb2c-9cf8-4750-99ea-9d86d180343f-config-volume\") pod \"22f4bb2c-9cf8-4750-99ea-9d86d180343f\" (UID: \"22f4bb2c-9cf8-4750-99ea-9d86d180343f\") " Mar 19 17:15:03 crc kubenswrapper[4918]: I0319 17:15:03.349868 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7qgl\" (UniqueName: \"kubernetes.io/projected/22f4bb2c-9cf8-4750-99ea-9d86d180343f-kube-api-access-z7qgl\") pod \"22f4bb2c-9cf8-4750-99ea-9d86d180343f\" (UID: \"22f4bb2c-9cf8-4750-99ea-9d86d180343f\") " Mar 19 17:15:03 crc kubenswrapper[4918]: I0319 17:15:03.349990 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22f4bb2c-9cf8-4750-99ea-9d86d180343f-secret-volume\") pod \"22f4bb2c-9cf8-4750-99ea-9d86d180343f\" (UID: \"22f4bb2c-9cf8-4750-99ea-9d86d180343f\") " Mar 19 17:15:03 crc kubenswrapper[4918]: I0319 17:15:03.350941 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22f4bb2c-9cf8-4750-99ea-9d86d180343f-config-volume" (OuterVolumeSpecName: "config-volume") pod "22f4bb2c-9cf8-4750-99ea-9d86d180343f" (UID: "22f4bb2c-9cf8-4750-99ea-9d86d180343f"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:15:03 crc kubenswrapper[4918]: I0319 17:15:03.375977 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22f4bb2c-9cf8-4750-99ea-9d86d180343f-kube-api-access-z7qgl" (OuterVolumeSpecName: "kube-api-access-z7qgl") pod "22f4bb2c-9cf8-4750-99ea-9d86d180343f" (UID: "22f4bb2c-9cf8-4750-99ea-9d86d180343f"). InnerVolumeSpecName "kube-api-access-z7qgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:15:03 crc kubenswrapper[4918]: I0319 17:15:03.381209 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22f4bb2c-9cf8-4750-99ea-9d86d180343f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "22f4bb2c-9cf8-4750-99ea-9d86d180343f" (UID: "22f4bb2c-9cf8-4750-99ea-9d86d180343f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:15:03 crc kubenswrapper[4918]: I0319 17:15:03.452835 4918 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22f4bb2c-9cf8-4750-99ea-9d86d180343f-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 17:15:03 crc kubenswrapper[4918]: I0319 17:15:03.452870 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7qgl\" (UniqueName: \"kubernetes.io/projected/22f4bb2c-9cf8-4750-99ea-9d86d180343f-kube-api-access-z7qgl\") on node \"crc\" DevicePath \"\"" Mar 19 17:15:03 crc kubenswrapper[4918]: I0319 17:15:03.452886 4918 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22f4bb2c-9cf8-4750-99ea-9d86d180343f-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 17:15:03 crc kubenswrapper[4918]: I0319 17:15:03.820812 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565675-d2fpf" 
event={"ID":"22f4bb2c-9cf8-4750-99ea-9d86d180343f","Type":"ContainerDied","Data":"612b3ac72443f37d03230102bae6904bad085480d095f3f0b1f560ae03a273c6"} Mar 19 17:15:03 crc kubenswrapper[4918]: I0319 17:15:03.820853 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="612b3ac72443f37d03230102bae6904bad085480d095f3f0b1f560ae03a273c6" Mar 19 17:15:03 crc kubenswrapper[4918]: I0319 17:15:03.820885 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565675-d2fpf" Mar 19 17:15:04 crc kubenswrapper[4918]: I0319 17:15:04.398255 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565630-wz4vn"] Mar 19 17:15:04 crc kubenswrapper[4918]: I0319 17:15:04.411324 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565630-wz4vn"] Mar 19 17:15:04 crc kubenswrapper[4918]: I0319 17:15:04.598938 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0" path="/var/lib/kubelet/pods/fa71e1c7-4bbd-4a5d-b07c-c9f232bc4ae0/volumes" Mar 19 17:15:04 crc kubenswrapper[4918]: I0319 17:15:04.939422 4918 scope.go:117] "RemoveContainer" containerID="2a81b9f96e4ae5a11ec1acd858dedd355d5aa7e037998aff8c498cbb31ab0917" Mar 19 17:15:04 crc kubenswrapper[4918]: I0319 17:15:04.985834 4918 scope.go:117] "RemoveContainer" containerID="d050a608d4cfc3d08f3a991c152c00b6609359ff283e54770c428e3a203f6df1" Mar 19 17:15:11 crc kubenswrapper[4918]: I0319 17:15:11.039197 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-cm55x"] Mar 19 17:15:11 crc kubenswrapper[4918]: I0319 17:15:11.052762 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-cm55x"] Mar 19 17:15:12 crc kubenswrapper[4918]: I0319 17:15:12.607479 4918 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="79829597-ef66-4d6f-946f-adaa9ec3d227" path="/var/lib/kubelet/pods/79829597-ef66-4d6f-946f-adaa9ec3d227/volumes" Mar 19 17:15:18 crc kubenswrapper[4918]: I0319 17:15:18.036469 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-864ls"] Mar 19 17:15:18 crc kubenswrapper[4918]: I0319 17:15:18.048175 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-864ls"] Mar 19 17:15:18 crc kubenswrapper[4918]: I0319 17:15:18.608880 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe6642e7-ede8-4dd3-8248-648f77d558b3" path="/var/lib/kubelet/pods/fe6642e7-ede8-4dd3-8248-648f77d558b3/volumes" Mar 19 17:15:36 crc kubenswrapper[4918]: I0319 17:15:36.154156 4918 generic.go:334] "Generic (PLEG): container finished" podID="10b0ee10-0449-4ca7-bece-7942e3bf9f86" containerID="970cabc4d7bc2032e3b6ad1ddefed5ec5c35fe8950851c9d7f8b446760214ec3" exitCode=0 Mar 19 17:15:36 crc kubenswrapper[4918]: I0319 17:15:36.154244 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" event={"ID":"10b0ee10-0449-4ca7-bece-7942e3bf9f86","Type":"ContainerDied","Data":"970cabc4d7bc2032e3b6ad1ddefed5ec5c35fe8950851c9d7f8b446760214ec3"} Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.601789 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.723328 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-inventory\") pod \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.723372 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.723392 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-bootstrap-combined-ca-bundle\") pod \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.723444 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-telemetry-combined-ca-bundle\") pod \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.723483 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-ssh-key-openstack-edpm-ipam\") pod \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\" (UID: 
\"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.723552 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.723584 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-repo-setup-combined-ca-bundle\") pod \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.723621 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-libvirt-combined-ca-bundle\") pod \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.723667 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-ovn-combined-ca-bundle\") pod \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.723728 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdqfk\" (UniqueName: \"kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-kube-api-access-bdqfk\") pod \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 
17:15:37.723763 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.723792 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-openstack-edpm-ipam-ovn-default-certs-0\") pod \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.723807 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-neutron-metadata-combined-ca-bundle\") pod \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.725092 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-nova-combined-ca-bundle\") pod \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\" (UID: \"10b0ee10-0449-4ca7-bece-7942e3bf9f86\") " Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.730693 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "10b0ee10-0449-4ca7-bece-7942e3bf9f86" (UID: "10b0ee10-0449-4ca7-bece-7942e3bf9f86"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.732584 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "10b0ee10-0449-4ca7-bece-7942e3bf9f86" (UID: "10b0ee10-0449-4ca7-bece-7942e3bf9f86"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.732592 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "10b0ee10-0449-4ca7-bece-7942e3bf9f86" (UID: "10b0ee10-0449-4ca7-bece-7942e3bf9f86"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.732692 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "10b0ee10-0449-4ca7-bece-7942e3bf9f86" (UID: "10b0ee10-0449-4ca7-bece-7942e3bf9f86"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.733504 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "10b0ee10-0449-4ca7-bece-7942e3bf9f86" (UID: "10b0ee10-0449-4ca7-bece-7942e3bf9f86"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.734954 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "10b0ee10-0449-4ca7-bece-7942e3bf9f86" (UID: "10b0ee10-0449-4ca7-bece-7942e3bf9f86"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.736067 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "10b0ee10-0449-4ca7-bece-7942e3bf9f86" (UID: "10b0ee10-0449-4ca7-bece-7942e3bf9f86"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.738741 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "10b0ee10-0449-4ca7-bece-7942e3bf9f86" (UID: "10b0ee10-0449-4ca7-bece-7942e3bf9f86"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.739282 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "10b0ee10-0449-4ca7-bece-7942e3bf9f86" (UID: "10b0ee10-0449-4ca7-bece-7942e3bf9f86"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.741420 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "10b0ee10-0449-4ca7-bece-7942e3bf9f86" (UID: "10b0ee10-0449-4ca7-bece-7942e3bf9f86"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.742663 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-kube-api-access-bdqfk" (OuterVolumeSpecName: "kube-api-access-bdqfk") pod "10b0ee10-0449-4ca7-bece-7942e3bf9f86" (UID: "10b0ee10-0449-4ca7-bece-7942e3bf9f86"). InnerVolumeSpecName "kube-api-access-bdqfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.746767 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "10b0ee10-0449-4ca7-bece-7942e3bf9f86" (UID: "10b0ee10-0449-4ca7-bece-7942e3bf9f86"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.769908 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "10b0ee10-0449-4ca7-bece-7942e3bf9f86" (UID: "10b0ee10-0449-4ca7-bece-7942e3bf9f86"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.784909 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-inventory" (OuterVolumeSpecName: "inventory") pod "10b0ee10-0449-4ca7-bece-7942e3bf9f86" (UID: "10b0ee10-0449-4ca7-bece-7942e3bf9f86"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.828438 4918 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.828479 4918 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.828491 4918 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.828503 4918 reconciler_common.go:293] "Volume detached for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.828514 4918 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.828538 4918 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.828548 4918 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.828557 4918 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.828565 4918 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.828575 4918 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:15:37 crc 
kubenswrapper[4918]: I0319 17:15:37.828596 4918 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.828608 4918 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10b0ee10-0449-4ca7-bece-7942e3bf9f86-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.828617 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdqfk\" (UniqueName: \"kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-kube-api-access-bdqfk\") on node \"crc\" DevicePath \"\"" Mar 19 17:15:37 crc kubenswrapper[4918]: I0319 17:15:37.828625 4918 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/10b0ee10-0449-4ca7-bece-7942e3bf9f86-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.179890 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" event={"ID":"10b0ee10-0449-4ca7-bece-7942e3bf9f86","Type":"ContainerDied","Data":"78c9bb87d748c490fbb12a79bc79576fa52ea72d8dcc20660501b2da3f845290"} Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.179942 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78c9bb87d748c490fbb12a79bc79576fa52ea72d8dcc20660501b2da3f845290" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.180008 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.329564 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-4smlj"] Mar 19 17:15:38 crc kubenswrapper[4918]: E0319 17:15:38.330112 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22f4bb2c-9cf8-4750-99ea-9d86d180343f" containerName="collect-profiles" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.330137 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="22f4bb2c-9cf8-4750-99ea-9d86d180343f" containerName="collect-profiles" Mar 19 17:15:38 crc kubenswrapper[4918]: E0319 17:15:38.330180 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10b0ee10-0449-4ca7-bece-7942e3bf9f86" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.330190 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b0ee10-0449-4ca7-bece-7942e3bf9f86" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.330543 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="10b0ee10-0449-4ca7-bece-7942e3bf9f86" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.330573 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="22f4bb2c-9cf8-4750-99ea-9d86d180343f" containerName="collect-profiles" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.331626 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4smlj" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.334182 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.334315 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.334560 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.334191 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.339328 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4jldg" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.357697 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-4smlj"] Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.440482 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4smlj\" (UID: \"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4smlj" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.440856 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4smlj\" (UID: \"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4smlj" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.441012 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4smlj\" (UID: \"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4smlj" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.441161 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4smlj\" (UID: \"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4smlj" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.441370 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z582z\" (UniqueName: \"kubernetes.io/projected/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-kube-api-access-z582z\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4smlj\" (UID: \"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4smlj" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.544035 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z582z\" (UniqueName: \"kubernetes.io/projected/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-kube-api-access-z582z\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4smlj\" (UID: \"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4smlj" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.544145 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4smlj\" (UID: \"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4smlj" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.544183 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4smlj\" (UID: \"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4smlj" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.544241 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4smlj\" (UID: \"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4smlj" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.544343 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4smlj\" (UID: \"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4smlj" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.545754 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4smlj\" (UID: 
\"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4smlj" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.548429 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4smlj\" (UID: \"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4smlj" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.549945 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4smlj\" (UID: \"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4smlj" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.549968 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4smlj\" (UID: \"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4smlj" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.576860 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z582z\" (UniqueName: \"kubernetes.io/projected/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-kube-api-access-z582z\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4smlj\" (UID: \"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4smlj" Mar 19 17:15:38 crc kubenswrapper[4918]: I0319 17:15:38.656896 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4smlj" Mar 19 17:15:39 crc kubenswrapper[4918]: I0319 17:15:39.196599 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-4smlj"] Mar 19 17:15:40 crc kubenswrapper[4918]: I0319 17:15:40.203780 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4smlj" event={"ID":"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3","Type":"ContainerStarted","Data":"2d161e587946fb1f87c01d1a7b11c8207ecf3f9fb3fa0eb5d9f6a88ac31edaef"} Mar 19 17:15:40 crc kubenswrapper[4918]: I0319 17:15:40.204222 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4smlj" event={"ID":"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3","Type":"ContainerStarted","Data":"97dfd1955284524c224cd691d75e43211bf3116bfc150b62e5a097901fc673dd"} Mar 19 17:15:40 crc kubenswrapper[4918]: I0319 17:15:40.226861 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4smlj" podStartSLOduration=1.67648835 podStartE2EDuration="2.226844239s" podCreationTimestamp="2026-03-19 17:15:38 +0000 UTC" firstStartedPulling="2026-03-19 17:15:39.213670698 +0000 UTC m=+2151.335869946" lastFinishedPulling="2026-03-19 17:15:39.764026587 +0000 UTC m=+2151.886225835" observedRunningTime="2026-03-19 17:15:40.222924811 +0000 UTC m=+2152.345124059" watchObservedRunningTime="2026-03-19 17:15:40.226844239 +0000 UTC m=+2152.349043487" Mar 19 17:16:00 crc kubenswrapper[4918]: I0319 17:16:00.155068 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565676-c6kk8"] Mar 19 17:16:00 crc kubenswrapper[4918]: I0319 17:16:00.157253 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565676-c6kk8" Mar 19 17:16:00 crc kubenswrapper[4918]: I0319 17:16:00.159567 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:16:00 crc kubenswrapper[4918]: I0319 17:16:00.159591 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:16:00 crc kubenswrapper[4918]: I0319 17:16:00.159601 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:16:00 crc kubenswrapper[4918]: I0319 17:16:00.173851 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565676-c6kk8"] Mar 19 17:16:00 crc kubenswrapper[4918]: I0319 17:16:00.317054 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7wns\" (UniqueName: \"kubernetes.io/projected/10685f39-0b1e-4912-812f-39193409fdbd-kube-api-access-s7wns\") pod \"auto-csr-approver-29565676-c6kk8\" (UID: \"10685f39-0b1e-4912-812f-39193409fdbd\") " pod="openshift-infra/auto-csr-approver-29565676-c6kk8" Mar 19 17:16:00 crc kubenswrapper[4918]: I0319 17:16:00.418994 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7wns\" (UniqueName: \"kubernetes.io/projected/10685f39-0b1e-4912-812f-39193409fdbd-kube-api-access-s7wns\") pod \"auto-csr-approver-29565676-c6kk8\" (UID: \"10685f39-0b1e-4912-812f-39193409fdbd\") " pod="openshift-infra/auto-csr-approver-29565676-c6kk8" Mar 19 17:16:00 crc kubenswrapper[4918]: I0319 17:16:00.441409 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7wns\" (UniqueName: \"kubernetes.io/projected/10685f39-0b1e-4912-812f-39193409fdbd-kube-api-access-s7wns\") pod \"auto-csr-approver-29565676-c6kk8\" (UID: \"10685f39-0b1e-4912-812f-39193409fdbd\") " 
pod="openshift-infra/auto-csr-approver-29565676-c6kk8" Mar 19 17:16:00 crc kubenswrapper[4918]: I0319 17:16:00.478950 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565676-c6kk8" Mar 19 17:16:00 crc kubenswrapper[4918]: I0319 17:16:00.946386 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565676-c6kk8"] Mar 19 17:16:00 crc kubenswrapper[4918]: W0319 17:16:00.951839 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10685f39_0b1e_4912_812f_39193409fdbd.slice/crio-60e9295adda180238385c484713132641a97471bc3f2f94e97ed4de3fa89adf1 WatchSource:0}: Error finding container 60e9295adda180238385c484713132641a97471bc3f2f94e97ed4de3fa89adf1: Status 404 returned error can't find the container with id 60e9295adda180238385c484713132641a97471bc3f2f94e97ed4de3fa89adf1 Mar 19 17:16:01 crc kubenswrapper[4918]: I0319 17:16:01.444777 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565676-c6kk8" event={"ID":"10685f39-0b1e-4912-812f-39193409fdbd","Type":"ContainerStarted","Data":"60e9295adda180238385c484713132641a97471bc3f2f94e97ed4de3fa89adf1"} Mar 19 17:16:03 crc kubenswrapper[4918]: I0319 17:16:03.477624 4918 generic.go:334] "Generic (PLEG): container finished" podID="10685f39-0b1e-4912-812f-39193409fdbd" containerID="6674d7d5abca04ca2789cb15c2bb77ced4a522eb3572a3300465646d8837e94a" exitCode=0 Mar 19 17:16:03 crc kubenswrapper[4918]: I0319 17:16:03.477710 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565676-c6kk8" event={"ID":"10685f39-0b1e-4912-812f-39193409fdbd","Type":"ContainerDied","Data":"6674d7d5abca04ca2789cb15c2bb77ced4a522eb3572a3300465646d8837e94a"} Mar 19 17:16:04 crc kubenswrapper[4918]: I0319 17:16:04.893020 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565676-c6kk8" Mar 19 17:16:05 crc kubenswrapper[4918]: I0319 17:16:05.019220 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7wns\" (UniqueName: \"kubernetes.io/projected/10685f39-0b1e-4912-812f-39193409fdbd-kube-api-access-s7wns\") pod \"10685f39-0b1e-4912-812f-39193409fdbd\" (UID: \"10685f39-0b1e-4912-812f-39193409fdbd\") " Mar 19 17:16:05 crc kubenswrapper[4918]: I0319 17:16:05.028684 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10685f39-0b1e-4912-812f-39193409fdbd-kube-api-access-s7wns" (OuterVolumeSpecName: "kube-api-access-s7wns") pod "10685f39-0b1e-4912-812f-39193409fdbd" (UID: "10685f39-0b1e-4912-812f-39193409fdbd"). InnerVolumeSpecName "kube-api-access-s7wns". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:16:05 crc kubenswrapper[4918]: I0319 17:16:05.098330 4918 scope.go:117] "RemoveContainer" containerID="94a1f4ca00cd7a79f0f012ec819ea8ce40dee0795880662a9c6f9f681e236859" Mar 19 17:16:05 crc kubenswrapper[4918]: I0319 17:16:05.122325 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7wns\" (UniqueName: \"kubernetes.io/projected/10685f39-0b1e-4912-812f-39193409fdbd-kube-api-access-s7wns\") on node \"crc\" DevicePath \"\"" Mar 19 17:16:05 crc kubenswrapper[4918]: I0319 17:16:05.137087 4918 scope.go:117] "RemoveContainer" containerID="27a6cf600dc0f973764c2f57988e9993a1b4528b4d219b9a33889a3f9b07afc3" Mar 19 17:16:05 crc kubenswrapper[4918]: I0319 17:16:05.499906 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565676-c6kk8" event={"ID":"10685f39-0b1e-4912-812f-39193409fdbd","Type":"ContainerDied","Data":"60e9295adda180238385c484713132641a97471bc3f2f94e97ed4de3fa89adf1"} Mar 19 17:16:05 crc kubenswrapper[4918]: I0319 17:16:05.500255 4918 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="60e9295adda180238385c484713132641a97471bc3f2f94e97ed4de3fa89adf1" Mar 19 17:16:05 crc kubenswrapper[4918]: I0319 17:16:05.499972 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565676-c6kk8" Mar 19 17:16:05 crc kubenswrapper[4918]: I0319 17:16:05.969002 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565670-lx56g"] Mar 19 17:16:05 crc kubenswrapper[4918]: I0319 17:16:05.977580 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565670-lx56g"] Mar 19 17:16:06 crc kubenswrapper[4918]: I0319 17:16:06.599942 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acd12134-68b7-44a4-aa04-33af054e4f4f" path="/var/lib/kubelet/pods/acd12134-68b7-44a4-aa04-33af054e4f4f/volumes" Mar 19 17:16:28 crc kubenswrapper[4918]: I0319 17:16:28.211470 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:16:28 crc kubenswrapper[4918]: I0319 17:16:28.212074 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:16:42 crc kubenswrapper[4918]: I0319 17:16:42.880135 4918 generic.go:334] "Generic (PLEG): container finished" podID="28c0e0ec-e1b6-4937-89dc-f09d42d97bd3" containerID="2d161e587946fb1f87c01d1a7b11c8207ecf3f9fb3fa0eb5d9f6a88ac31edaef" exitCode=0 Mar 19 17:16:42 crc kubenswrapper[4918]: I0319 17:16:42.880232 4918 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4smlj" event={"ID":"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3","Type":"ContainerDied","Data":"2d161e587946fb1f87c01d1a7b11c8207ecf3f9fb3fa0eb5d9f6a88ac31edaef"} Mar 19 17:16:44 crc kubenswrapper[4918]: I0319 17:16:44.350757 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4smlj" Mar 19 17:16:44 crc kubenswrapper[4918]: I0319 17:16:44.403915 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-ssh-key-openstack-edpm-ipam\") pod \"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3\" (UID: \"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3\") " Mar 19 17:16:44 crc kubenswrapper[4918]: I0319 17:16:44.403974 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-ovn-combined-ca-bundle\") pod \"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3\" (UID: \"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3\") " Mar 19 17:16:44 crc kubenswrapper[4918]: I0319 17:16:44.404012 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-inventory\") pod \"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3\" (UID: \"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3\") " Mar 19 17:16:44 crc kubenswrapper[4918]: I0319 17:16:44.404139 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z582z\" (UniqueName: \"kubernetes.io/projected/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-kube-api-access-z582z\") pod \"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3\" (UID: \"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3\") " Mar 19 17:16:44 crc kubenswrapper[4918]: I0319 17:16:44.404207 4918 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-ovncontroller-config-0\") pod \"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3\" (UID: \"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3\") " Mar 19 17:16:44 crc kubenswrapper[4918]: I0319 17:16:44.409409 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "28c0e0ec-e1b6-4937-89dc-f09d42d97bd3" (UID: "28c0e0ec-e1b6-4937-89dc-f09d42d97bd3"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:16:44 crc kubenswrapper[4918]: I0319 17:16:44.409671 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-kube-api-access-z582z" (OuterVolumeSpecName: "kube-api-access-z582z") pod "28c0e0ec-e1b6-4937-89dc-f09d42d97bd3" (UID: "28c0e0ec-e1b6-4937-89dc-f09d42d97bd3"). InnerVolumeSpecName "kube-api-access-z582z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:16:44 crc kubenswrapper[4918]: I0319 17:16:44.434294 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "28c0e0ec-e1b6-4937-89dc-f09d42d97bd3" (UID: "28c0e0ec-e1b6-4937-89dc-f09d42d97bd3"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:16:44 crc kubenswrapper[4918]: I0319 17:16:44.436213 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-inventory" (OuterVolumeSpecName: "inventory") pod "28c0e0ec-e1b6-4937-89dc-f09d42d97bd3" (UID: "28c0e0ec-e1b6-4937-89dc-f09d42d97bd3"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:16:44 crc kubenswrapper[4918]: I0319 17:16:44.437685 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "28c0e0ec-e1b6-4937-89dc-f09d42d97bd3" (UID: "28c0e0ec-e1b6-4937-89dc-f09d42d97bd3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:16:44 crc kubenswrapper[4918]: I0319 17:16:44.506317 4918 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 17:16:44 crc kubenswrapper[4918]: I0319 17:16:44.506354 4918 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:16:44 crc kubenswrapper[4918]: I0319 17:16:44.506366 4918 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 17:16:44 crc kubenswrapper[4918]: I0319 17:16:44.506377 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z582z\" (UniqueName: \"kubernetes.io/projected/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-kube-api-access-z582z\") on node \"crc\" DevicePath \"\"" Mar 19 17:16:44 crc kubenswrapper[4918]: I0319 17:16:44.506390 4918 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/28c0e0ec-e1b6-4937-89dc-f09d42d97bd3-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 19 17:16:44 crc kubenswrapper[4918]: 
I0319 17:16:44.906823 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4smlj" event={"ID":"28c0e0ec-e1b6-4937-89dc-f09d42d97bd3","Type":"ContainerDied","Data":"97dfd1955284524c224cd691d75e43211bf3116bfc150b62e5a097901fc673dd"} Mar 19 17:16:44 crc kubenswrapper[4918]: I0319 17:16:44.906870 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97dfd1955284524c224cd691d75e43211bf3116bfc150b62e5a097901fc673dd" Mar 19 17:16:44 crc kubenswrapper[4918]: I0319 17:16:44.906967 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4smlj" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.004241 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt"] Mar 19 17:16:45 crc kubenswrapper[4918]: E0319 17:16:45.005015 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28c0e0ec-e1b6-4937-89dc-f09d42d97bd3" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.005036 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="28c0e0ec-e1b6-4937-89dc-f09d42d97bd3" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 19 17:16:45 crc kubenswrapper[4918]: E0319 17:16:45.005078 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10685f39-0b1e-4912-812f-39193409fdbd" containerName="oc" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.005087 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="10685f39-0b1e-4912-812f-39193409fdbd" containerName="oc" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.005283 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="28c0e0ec-e1b6-4937-89dc-f09d42d97bd3" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 
17:16:45.005300 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="10685f39-0b1e-4912-812f-39193409fdbd" containerName="oc" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.006163 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.015562 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.015647 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.016221 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt"] Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.016540 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4jldg" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.016826 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.018063 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.018115 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.124514 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt\" (UID: 
\"e2200bcd-af51-40ea-aef4-e601b73f6f78\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.124638 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt\" (UID: \"e2200bcd-af51-40ea-aef4-e601b73f6f78\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.124681 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt\" (UID: \"e2200bcd-af51-40ea-aef4-e601b73f6f78\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.124752 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt\" (UID: \"e2200bcd-af51-40ea-aef4-e601b73f6f78\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.124810 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-nova-metadata-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt\" (UID: \"e2200bcd-af51-40ea-aef4-e601b73f6f78\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.125000 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tfqm\" (UniqueName: \"kubernetes.io/projected/e2200bcd-af51-40ea-aef4-e601b73f6f78-kube-api-access-5tfqm\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt\" (UID: \"e2200bcd-af51-40ea-aef4-e601b73f6f78\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.226818 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt\" (UID: \"e2200bcd-af51-40ea-aef4-e601b73f6f78\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.227093 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tfqm\" (UniqueName: \"kubernetes.io/projected/e2200bcd-af51-40ea-aef4-e601b73f6f78-kube-api-access-5tfqm\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt\" (UID: \"e2200bcd-af51-40ea-aef4-e601b73f6f78\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.227187 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt\" (UID: \"e2200bcd-af51-40ea-aef4-e601b73f6f78\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.227246 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt\" (UID: \"e2200bcd-af51-40ea-aef4-e601b73f6f78\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.227288 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt\" (UID: \"e2200bcd-af51-40ea-aef4-e601b73f6f78\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.227350 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt\" (UID: \"e2200bcd-af51-40ea-aef4-e601b73f6f78\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.231664 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt\" (UID: \"e2200bcd-af51-40ea-aef4-e601b73f6f78\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.232124 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt\" (UID: \"e2200bcd-af51-40ea-aef4-e601b73f6f78\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.233035 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt\" (UID: \"e2200bcd-af51-40ea-aef4-e601b73f6f78\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.233171 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt\" (UID: \"e2200bcd-af51-40ea-aef4-e601b73f6f78\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.233359 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt\" (UID: \"e2200bcd-af51-40ea-aef4-e601b73f6f78\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 
17:16:45.246718 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tfqm\" (UniqueName: \"kubernetes.io/projected/e2200bcd-af51-40ea-aef4-e601b73f6f78-kube-api-access-5tfqm\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt\" (UID: \"e2200bcd-af51-40ea-aef4-e601b73f6f78\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt" Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.327758 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt" Mar 19 17:16:45 crc kubenswrapper[4918]: W0319 17:16:45.920899 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2200bcd_af51_40ea_aef4_e601b73f6f78.slice/crio-df1410a4b176c03fd7496fb989e8bc188a11e6327e057b821bd20107659068f2 WatchSource:0}: Error finding container df1410a4b176c03fd7496fb989e8bc188a11e6327e057b821bd20107659068f2: Status 404 returned error can't find the container with id df1410a4b176c03fd7496fb989e8bc188a11e6327e057b821bd20107659068f2 Mar 19 17:16:45 crc kubenswrapper[4918]: I0319 17:16:45.920927 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt"] Mar 19 17:16:46 crc kubenswrapper[4918]: I0319 17:16:46.927513 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt" event={"ID":"e2200bcd-af51-40ea-aef4-e601b73f6f78","Type":"ContainerStarted","Data":"2971a2e01c7742e804be82b8caa4ddd78eb61c1af0babbfe656e97e3e4781845"} Mar 19 17:16:46 crc kubenswrapper[4918]: I0319 17:16:46.927909 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt" 
event={"ID":"e2200bcd-af51-40ea-aef4-e601b73f6f78","Type":"ContainerStarted","Data":"df1410a4b176c03fd7496fb989e8bc188a11e6327e057b821bd20107659068f2"} Mar 19 17:16:46 crc kubenswrapper[4918]: I0319 17:16:46.957929 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt" podStartSLOduration=2.49640034 podStartE2EDuration="2.957909686s" podCreationTimestamp="2026-03-19 17:16:44 +0000 UTC" firstStartedPulling="2026-03-19 17:16:45.924862459 +0000 UTC m=+2218.047061707" lastFinishedPulling="2026-03-19 17:16:46.386371805 +0000 UTC m=+2218.508571053" observedRunningTime="2026-03-19 17:16:46.948086986 +0000 UTC m=+2219.070286244" watchObservedRunningTime="2026-03-19 17:16:46.957909686 +0000 UTC m=+2219.080108924" Mar 19 17:16:58 crc kubenswrapper[4918]: I0319 17:16:58.211851 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:16:58 crc kubenswrapper[4918]: I0319 17:16:58.212535 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:17:05 crc kubenswrapper[4918]: I0319 17:17:05.250043 4918 scope.go:117] "RemoveContainer" containerID="9b8265d9e7a37ca5f1baa12eb33fcd448229e149c9de8015409b9f4932627c3a" Mar 19 17:17:28 crc kubenswrapper[4918]: I0319 17:17:28.211618 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:17:28 crc kubenswrapper[4918]: I0319 17:17:28.212058 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:17:28 crc kubenswrapper[4918]: I0319 17:17:28.212138 4918 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 17:17:28 crc kubenswrapper[4918]: I0319 17:17:28.212863 4918 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74"} pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 17:17:28 crc kubenswrapper[4918]: I0319 17:17:28.212918 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" containerID="cri-o://21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74" gracePeriod=600 Mar 19 17:17:28 crc kubenswrapper[4918]: E0319 17:17:28.338158 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" 
podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:17:28 crc kubenswrapper[4918]: I0319 17:17:28.376640 4918 generic.go:334] "Generic (PLEG): container finished" podID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerID="21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74" exitCode=0 Mar 19 17:17:28 crc kubenswrapper[4918]: I0319 17:17:28.376707 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerDied","Data":"21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74"} Mar 19 17:17:28 crc kubenswrapper[4918]: I0319 17:17:28.376797 4918 scope.go:117] "RemoveContainer" containerID="fa72b2dda22c6725c1ba0de7800b932e612ef8adf06d0be5f45d4b0d25a364f6" Mar 19 17:17:28 crc kubenswrapper[4918]: I0319 17:17:28.377463 4918 scope.go:117] "RemoveContainer" containerID="21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74" Mar 19 17:17:28 crc kubenswrapper[4918]: E0319 17:17:28.377790 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:17:35 crc kubenswrapper[4918]: I0319 17:17:35.452249 4918 generic.go:334] "Generic (PLEG): container finished" podID="e2200bcd-af51-40ea-aef4-e601b73f6f78" containerID="2971a2e01c7742e804be82b8caa4ddd78eb61c1af0babbfe656e97e3e4781845" exitCode=0 Mar 19 17:17:35 crc kubenswrapper[4918]: I0319 17:17:35.452339 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt" 
event={"ID":"e2200bcd-af51-40ea-aef4-e601b73f6f78","Type":"ContainerDied","Data":"2971a2e01c7742e804be82b8caa4ddd78eb61c1af0babbfe656e97e3e4781845"} Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.079044 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.247652 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-ssh-key-openstack-edpm-ipam\") pod \"e2200bcd-af51-40ea-aef4-e601b73f6f78\" (UID: \"e2200bcd-af51-40ea-aef4-e601b73f6f78\") " Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.247762 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-neutron-metadata-combined-ca-bundle\") pod \"e2200bcd-af51-40ea-aef4-e601b73f6f78\" (UID: \"e2200bcd-af51-40ea-aef4-e601b73f6f78\") " Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.247881 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tfqm\" (UniqueName: \"kubernetes.io/projected/e2200bcd-af51-40ea-aef4-e601b73f6f78-kube-api-access-5tfqm\") pod \"e2200bcd-af51-40ea-aef4-e601b73f6f78\" (UID: \"e2200bcd-af51-40ea-aef4-e601b73f6f78\") " Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.247903 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-inventory\") pod \"e2200bcd-af51-40ea-aef4-e601b73f6f78\" (UID: \"e2200bcd-af51-40ea-aef4-e601b73f6f78\") " Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.247997 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-nova-metadata-neutron-config-0\") pod \"e2200bcd-af51-40ea-aef4-e601b73f6f78\" (UID: \"e2200bcd-af51-40ea-aef4-e601b73f6f78\") " Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.248098 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-neutron-ovn-metadata-agent-neutron-config-0\") pod \"e2200bcd-af51-40ea-aef4-e601b73f6f78\" (UID: \"e2200bcd-af51-40ea-aef4-e601b73f6f78\") " Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.258102 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2200bcd-af51-40ea-aef4-e601b73f6f78-kube-api-access-5tfqm" (OuterVolumeSpecName: "kube-api-access-5tfqm") pod "e2200bcd-af51-40ea-aef4-e601b73f6f78" (UID: "e2200bcd-af51-40ea-aef4-e601b73f6f78"). InnerVolumeSpecName "kube-api-access-5tfqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.258195 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e2200bcd-af51-40ea-aef4-e601b73f6f78" (UID: "e2200bcd-af51-40ea-aef4-e601b73f6f78"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.282846 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "e2200bcd-af51-40ea-aef4-e601b73f6f78" (UID: "e2200bcd-af51-40ea-aef4-e601b73f6f78"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.284498 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e2200bcd-af51-40ea-aef4-e601b73f6f78" (UID: "e2200bcd-af51-40ea-aef4-e601b73f6f78"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.289049 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "e2200bcd-af51-40ea-aef4-e601b73f6f78" (UID: "e2200bcd-af51-40ea-aef4-e601b73f6f78"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.292030 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-inventory" (OuterVolumeSpecName: "inventory") pod "e2200bcd-af51-40ea-aef4-e601b73f6f78" (UID: "e2200bcd-af51-40ea-aef4-e601b73f6f78"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.350442 4918 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.350698 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tfqm\" (UniqueName: \"kubernetes.io/projected/e2200bcd-af51-40ea-aef4-e601b73f6f78-kube-api-access-5tfqm\") on node \"crc\" DevicePath \"\"" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.350812 4918 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.350910 4918 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.350992 4918 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.351070 4918 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2200bcd-af51-40ea-aef4-e601b73f6f78-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.474319 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt" event={"ID":"e2200bcd-af51-40ea-aef4-e601b73f6f78","Type":"ContainerDied","Data":"df1410a4b176c03fd7496fb989e8bc188a11e6327e057b821bd20107659068f2"} Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.474370 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df1410a4b176c03fd7496fb989e8bc188a11e6327e057b821bd20107659068f2" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.474838 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.651881 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx"] Mar 19 17:17:37 crc kubenswrapper[4918]: E0319 17:17:37.652672 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2200bcd-af51-40ea-aef4-e601b73f6f78" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.652769 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2200bcd-af51-40ea-aef4-e601b73f6f78" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.653154 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2200bcd-af51-40ea-aef4-e601b73f6f78" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.654338 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.658890 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.658959 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4jldg" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.658969 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.659884 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.663927 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx"] Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.666066 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.757792 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cc96cf8-b975-4f49-8032-bb1d31580e7b-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx\" (UID: \"9cc96cf8-b975-4f49-8032-bb1d31580e7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.757848 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cc96cf8-b975-4f49-8032-bb1d31580e7b-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx\" (UID: \"9cc96cf8-b975-4f49-8032-bb1d31580e7b\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.758066 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cc96cf8-b975-4f49-8032-bb1d31580e7b-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx\" (UID: \"9cc96cf8-b975-4f49-8032-bb1d31580e7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.758131 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8828m\" (UniqueName: \"kubernetes.io/projected/9cc96cf8-b975-4f49-8032-bb1d31580e7b-kube-api-access-8828m\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx\" (UID: \"9cc96cf8-b975-4f49-8032-bb1d31580e7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.758171 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9cc96cf8-b975-4f49-8032-bb1d31580e7b-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx\" (UID: \"9cc96cf8-b975-4f49-8032-bb1d31580e7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx" Mar 19 17:17:37 crc kubenswrapper[4918]: E0319 17:17:37.762056 4918 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2200bcd_af51_40ea_aef4_e601b73f6f78.slice/crio-df1410a4b176c03fd7496fb989e8bc188a11e6327e057b821bd20107659068f2\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2200bcd_af51_40ea_aef4_e601b73f6f78.slice\": RecentStats: unable to 
find data in memory cache]" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.859972 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cc96cf8-b975-4f49-8032-bb1d31580e7b-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx\" (UID: \"9cc96cf8-b975-4f49-8032-bb1d31580e7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.860347 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8828m\" (UniqueName: \"kubernetes.io/projected/9cc96cf8-b975-4f49-8032-bb1d31580e7b-kube-api-access-8828m\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx\" (UID: \"9cc96cf8-b975-4f49-8032-bb1d31580e7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.860378 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9cc96cf8-b975-4f49-8032-bb1d31580e7b-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx\" (UID: \"9cc96cf8-b975-4f49-8032-bb1d31580e7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.860418 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cc96cf8-b975-4f49-8032-bb1d31580e7b-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx\" (UID: \"9cc96cf8-b975-4f49-8032-bb1d31580e7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.860440 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/9cc96cf8-b975-4f49-8032-bb1d31580e7b-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx\" (UID: \"9cc96cf8-b975-4f49-8032-bb1d31580e7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.867237 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cc96cf8-b975-4f49-8032-bb1d31580e7b-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx\" (UID: \"9cc96cf8-b975-4f49-8032-bb1d31580e7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.867237 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cc96cf8-b975-4f49-8032-bb1d31580e7b-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx\" (UID: \"9cc96cf8-b975-4f49-8032-bb1d31580e7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.867444 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9cc96cf8-b975-4f49-8032-bb1d31580e7b-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx\" (UID: \"9cc96cf8-b975-4f49-8032-bb1d31580e7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.873317 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cc96cf8-b975-4f49-8032-bb1d31580e7b-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx\" (UID: \"9cc96cf8-b975-4f49-8032-bb1d31580e7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx" Mar 19 
17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.877521 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8828m\" (UniqueName: \"kubernetes.io/projected/9cc96cf8-b975-4f49-8032-bb1d31580e7b-kube-api-access-8828m\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx\" (UID: \"9cc96cf8-b975-4f49-8032-bb1d31580e7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx" Mar 19 17:17:37 crc kubenswrapper[4918]: I0319 17:17:37.991356 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx" Mar 19 17:17:38 crc kubenswrapper[4918]: I0319 17:17:38.572594 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx"] Mar 19 17:17:39 crc kubenswrapper[4918]: I0319 17:17:39.498157 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx" event={"ID":"9cc96cf8-b975-4f49-8032-bb1d31580e7b","Type":"ContainerStarted","Data":"9f0e20457351871a003e0dff1b43a4b4883500d7cf766459355369e34e2e51ba"} Mar 19 17:17:39 crc kubenswrapper[4918]: I0319 17:17:39.498414 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx" event={"ID":"9cc96cf8-b975-4f49-8032-bb1d31580e7b","Type":"ContainerStarted","Data":"06bf81c5cb05d024a475287c48cddaf9e442d77fc2a3e045d748909c1f2eff2a"} Mar 19 17:17:39 crc kubenswrapper[4918]: I0319 17:17:39.520034 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx" podStartSLOduration=2.100936398 podStartE2EDuration="2.520003848s" podCreationTimestamp="2026-03-19 17:17:37 +0000 UTC" firstStartedPulling="2026-03-19 17:17:38.570381684 +0000 UTC m=+2270.692580922" lastFinishedPulling="2026-03-19 17:17:38.989449084 +0000 UTC m=+2271.111648372" 
observedRunningTime="2026-03-19 17:17:39.517814708 +0000 UTC m=+2271.640013956" watchObservedRunningTime="2026-03-19 17:17:39.520003848 +0000 UTC m=+2271.642203136" Mar 19 17:17:43 crc kubenswrapper[4918]: I0319 17:17:43.587425 4918 scope.go:117] "RemoveContainer" containerID="21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74" Mar 19 17:17:43 crc kubenswrapper[4918]: E0319 17:17:43.588494 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:17:57 crc kubenswrapper[4918]: I0319 17:17:57.586986 4918 scope.go:117] "RemoveContainer" containerID="21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74" Mar 19 17:17:57 crc kubenswrapper[4918]: E0319 17:17:57.587749 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:18:00 crc kubenswrapper[4918]: I0319 17:18:00.155541 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565678-zrq4r"] Mar 19 17:18:00 crc kubenswrapper[4918]: I0319 17:18:00.157319 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565678-zrq4r" Mar 19 17:18:00 crc kubenswrapper[4918]: I0319 17:18:00.159324 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:18:00 crc kubenswrapper[4918]: I0319 17:18:00.159588 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:18:00 crc kubenswrapper[4918]: I0319 17:18:00.159962 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:18:00 crc kubenswrapper[4918]: I0319 17:18:00.172076 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565678-zrq4r"] Mar 19 17:18:00 crc kubenswrapper[4918]: I0319 17:18:00.290113 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn8gw\" (UniqueName: \"kubernetes.io/projected/19e546ea-cea1-49e7-b212-797cd0679cd0-kube-api-access-rn8gw\") pod \"auto-csr-approver-29565678-zrq4r\" (UID: \"19e546ea-cea1-49e7-b212-797cd0679cd0\") " pod="openshift-infra/auto-csr-approver-29565678-zrq4r" Mar 19 17:18:00 crc kubenswrapper[4918]: I0319 17:18:00.392116 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn8gw\" (UniqueName: \"kubernetes.io/projected/19e546ea-cea1-49e7-b212-797cd0679cd0-kube-api-access-rn8gw\") pod \"auto-csr-approver-29565678-zrq4r\" (UID: \"19e546ea-cea1-49e7-b212-797cd0679cd0\") " pod="openshift-infra/auto-csr-approver-29565678-zrq4r" Mar 19 17:18:00 crc kubenswrapper[4918]: I0319 17:18:00.412492 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn8gw\" (UniqueName: \"kubernetes.io/projected/19e546ea-cea1-49e7-b212-797cd0679cd0-kube-api-access-rn8gw\") pod \"auto-csr-approver-29565678-zrq4r\" (UID: \"19e546ea-cea1-49e7-b212-797cd0679cd0\") " 
pod="openshift-infra/auto-csr-approver-29565678-zrq4r" Mar 19 17:18:00 crc kubenswrapper[4918]: I0319 17:18:00.478075 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565678-zrq4r" Mar 19 17:18:00 crc kubenswrapper[4918]: I0319 17:18:00.910792 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565678-zrq4r"] Mar 19 17:18:00 crc kubenswrapper[4918]: W0319 17:18:00.916736 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19e546ea_cea1_49e7_b212_797cd0679cd0.slice/crio-b3e4f0f44f650f4d4584175924dccb728be87d0959f4d19be7162c8f0bc337c0 WatchSource:0}: Error finding container b3e4f0f44f650f4d4584175924dccb728be87d0959f4d19be7162c8f0bc337c0: Status 404 returned error can't find the container with id b3e4f0f44f650f4d4584175924dccb728be87d0959f4d19be7162c8f0bc337c0 Mar 19 17:18:01 crc kubenswrapper[4918]: I0319 17:18:01.752316 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565678-zrq4r" event={"ID":"19e546ea-cea1-49e7-b212-797cd0679cd0","Type":"ContainerStarted","Data":"b3e4f0f44f650f4d4584175924dccb728be87d0959f4d19be7162c8f0bc337c0"} Mar 19 17:18:02 crc kubenswrapper[4918]: I0319 17:18:02.762097 4918 generic.go:334] "Generic (PLEG): container finished" podID="19e546ea-cea1-49e7-b212-797cd0679cd0" containerID="a1f374cdfb67f4ca19a98e2ec4b66533a71d2c39dd3858c747501c78e0683b2e" exitCode=0 Mar 19 17:18:02 crc kubenswrapper[4918]: I0319 17:18:02.762143 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565678-zrq4r" event={"ID":"19e546ea-cea1-49e7-b212-797cd0679cd0","Type":"ContainerDied","Data":"a1f374cdfb67f4ca19a98e2ec4b66533a71d2c39dd3858c747501c78e0683b2e"} Mar 19 17:18:04 crc kubenswrapper[4918]: I0319 17:18:04.239608 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565678-zrq4r" Mar 19 17:18:04 crc kubenswrapper[4918]: I0319 17:18:04.397038 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn8gw\" (UniqueName: \"kubernetes.io/projected/19e546ea-cea1-49e7-b212-797cd0679cd0-kube-api-access-rn8gw\") pod \"19e546ea-cea1-49e7-b212-797cd0679cd0\" (UID: \"19e546ea-cea1-49e7-b212-797cd0679cd0\") " Mar 19 17:18:04 crc kubenswrapper[4918]: I0319 17:18:04.403238 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19e546ea-cea1-49e7-b212-797cd0679cd0-kube-api-access-rn8gw" (OuterVolumeSpecName: "kube-api-access-rn8gw") pod "19e546ea-cea1-49e7-b212-797cd0679cd0" (UID: "19e546ea-cea1-49e7-b212-797cd0679cd0"). InnerVolumeSpecName "kube-api-access-rn8gw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:18:04 crc kubenswrapper[4918]: I0319 17:18:04.500192 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn8gw\" (UniqueName: \"kubernetes.io/projected/19e546ea-cea1-49e7-b212-797cd0679cd0-kube-api-access-rn8gw\") on node \"crc\" DevicePath \"\"" Mar 19 17:18:05 crc kubenswrapper[4918]: I0319 17:18:04.785267 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565678-zrq4r" event={"ID":"19e546ea-cea1-49e7-b212-797cd0679cd0","Type":"ContainerDied","Data":"b3e4f0f44f650f4d4584175924dccb728be87d0959f4d19be7162c8f0bc337c0"} Mar 19 17:18:05 crc kubenswrapper[4918]: I0319 17:18:04.785305 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3e4f0f44f650f4d4584175924dccb728be87d0959f4d19be7162c8f0bc337c0" Mar 19 17:18:05 crc kubenswrapper[4918]: I0319 17:18:04.785354 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565678-zrq4r" Mar 19 17:18:05 crc kubenswrapper[4918]: I0319 17:18:05.315683 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565672-r8jfj"] Mar 19 17:18:05 crc kubenswrapper[4918]: I0319 17:18:05.324439 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565672-r8jfj"] Mar 19 17:18:06 crc kubenswrapper[4918]: I0319 17:18:06.605754 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77605f10-6a56-4633-8433-3b6078dec967" path="/var/lib/kubelet/pods/77605f10-6a56-4633-8433-3b6078dec967/volumes" Mar 19 17:18:10 crc kubenswrapper[4918]: I0319 17:18:10.586427 4918 scope.go:117] "RemoveContainer" containerID="21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74" Mar 19 17:18:10 crc kubenswrapper[4918]: E0319 17:18:10.587015 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:18:24 crc kubenswrapper[4918]: I0319 17:18:24.586583 4918 scope.go:117] "RemoveContainer" containerID="21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74" Mar 19 17:18:24 crc kubenswrapper[4918]: E0319 17:18:24.587486 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" 
podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:18:38 crc kubenswrapper[4918]: I0319 17:18:38.587603 4918 scope.go:117] "RemoveContainer" containerID="21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74" Mar 19 17:18:38 crc kubenswrapper[4918]: E0319 17:18:38.589018 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:18:50 crc kubenswrapper[4918]: I0319 17:18:50.586770 4918 scope.go:117] "RemoveContainer" containerID="21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74" Mar 19 17:18:50 crc kubenswrapper[4918]: E0319 17:18:50.587493 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:19:05 crc kubenswrapper[4918]: I0319 17:19:05.371221 4918 scope.go:117] "RemoveContainer" containerID="865bd646348bd4ee3b7e00bdd02223d66d401a48c7d3b69c29648039005b4e50" Mar 19 17:19:05 crc kubenswrapper[4918]: I0319 17:19:05.587047 4918 scope.go:117] "RemoveContainer" containerID="21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74" Mar 19 17:19:05 crc kubenswrapper[4918]: E0319 17:19:05.587424 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:19:17 crc kubenswrapper[4918]: I0319 17:19:17.587353 4918 scope.go:117] "RemoveContainer" containerID="21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74" Mar 19 17:19:17 crc kubenswrapper[4918]: E0319 17:19:17.588640 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:19:28 crc kubenswrapper[4918]: I0319 17:19:28.721815 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rs5r6"] Mar 19 17:19:28 crc kubenswrapper[4918]: E0319 17:19:28.722936 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e546ea-cea1-49e7-b212-797cd0679cd0" containerName="oc" Mar 19 17:19:28 crc kubenswrapper[4918]: I0319 17:19:28.722953 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e546ea-cea1-49e7-b212-797cd0679cd0" containerName="oc" Mar 19 17:19:28 crc kubenswrapper[4918]: I0319 17:19:28.723232 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="19e546ea-cea1-49e7-b212-797cd0679cd0" containerName="oc" Mar 19 17:19:28 crc kubenswrapper[4918]: I0319 17:19:28.724956 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rs5r6" Mar 19 17:19:28 crc kubenswrapper[4918]: I0319 17:19:28.731619 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rs5r6"] Mar 19 17:19:28 crc kubenswrapper[4918]: I0319 17:19:28.801615 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frmfl\" (UniqueName: \"kubernetes.io/projected/409a1f91-aaba-48bc-85ab-af8da08dabb9-kube-api-access-frmfl\") pod \"redhat-operators-rs5r6\" (UID: \"409a1f91-aaba-48bc-85ab-af8da08dabb9\") " pod="openshift-marketplace/redhat-operators-rs5r6" Mar 19 17:19:28 crc kubenswrapper[4918]: I0319 17:19:28.801745 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/409a1f91-aaba-48bc-85ab-af8da08dabb9-utilities\") pod \"redhat-operators-rs5r6\" (UID: \"409a1f91-aaba-48bc-85ab-af8da08dabb9\") " pod="openshift-marketplace/redhat-operators-rs5r6" Mar 19 17:19:28 crc kubenswrapper[4918]: I0319 17:19:28.801786 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/409a1f91-aaba-48bc-85ab-af8da08dabb9-catalog-content\") pod \"redhat-operators-rs5r6\" (UID: \"409a1f91-aaba-48bc-85ab-af8da08dabb9\") " pod="openshift-marketplace/redhat-operators-rs5r6" Mar 19 17:19:28 crc kubenswrapper[4918]: I0319 17:19:28.904218 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/409a1f91-aaba-48bc-85ab-af8da08dabb9-utilities\") pod \"redhat-operators-rs5r6\" (UID: \"409a1f91-aaba-48bc-85ab-af8da08dabb9\") " pod="openshift-marketplace/redhat-operators-rs5r6" Mar 19 17:19:28 crc kubenswrapper[4918]: I0319 17:19:28.904287 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/409a1f91-aaba-48bc-85ab-af8da08dabb9-catalog-content\") pod \"redhat-operators-rs5r6\" (UID: \"409a1f91-aaba-48bc-85ab-af8da08dabb9\") " pod="openshift-marketplace/redhat-operators-rs5r6" Mar 19 17:19:28 crc kubenswrapper[4918]: I0319 17:19:28.904560 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frmfl\" (UniqueName: \"kubernetes.io/projected/409a1f91-aaba-48bc-85ab-af8da08dabb9-kube-api-access-frmfl\") pod \"redhat-operators-rs5r6\" (UID: \"409a1f91-aaba-48bc-85ab-af8da08dabb9\") " pod="openshift-marketplace/redhat-operators-rs5r6" Mar 19 17:19:28 crc kubenswrapper[4918]: I0319 17:19:28.904914 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/409a1f91-aaba-48bc-85ab-af8da08dabb9-catalog-content\") pod \"redhat-operators-rs5r6\" (UID: \"409a1f91-aaba-48bc-85ab-af8da08dabb9\") " pod="openshift-marketplace/redhat-operators-rs5r6" Mar 19 17:19:28 crc kubenswrapper[4918]: I0319 17:19:28.905112 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/409a1f91-aaba-48bc-85ab-af8da08dabb9-utilities\") pod \"redhat-operators-rs5r6\" (UID: \"409a1f91-aaba-48bc-85ab-af8da08dabb9\") " pod="openshift-marketplace/redhat-operators-rs5r6" Mar 19 17:19:28 crc kubenswrapper[4918]: I0319 17:19:28.927672 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frmfl\" (UniqueName: \"kubernetes.io/projected/409a1f91-aaba-48bc-85ab-af8da08dabb9-kube-api-access-frmfl\") pod \"redhat-operators-rs5r6\" (UID: \"409a1f91-aaba-48bc-85ab-af8da08dabb9\") " pod="openshift-marketplace/redhat-operators-rs5r6" Mar 19 17:19:29 crc kubenswrapper[4918]: I0319 17:19:29.049595 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rs5r6" Mar 19 17:19:29 crc kubenswrapper[4918]: I0319 17:19:29.527914 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rs5r6"] Mar 19 17:19:29 crc kubenswrapper[4918]: I0319 17:19:29.683743 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rs5r6" event={"ID":"409a1f91-aaba-48bc-85ab-af8da08dabb9","Type":"ContainerStarted","Data":"0e9206f6220a09b0d224d4757fdba8357203d249d0f92be48d75978b2a411d32"} Mar 19 17:19:30 crc kubenswrapper[4918]: I0319 17:19:30.696307 4918 generic.go:334] "Generic (PLEG): container finished" podID="409a1f91-aaba-48bc-85ab-af8da08dabb9" containerID="39e6bdf8f48ccd24f98e5d5ce1c350e97adcca4588e91ba31aaa33d6f5966c28" exitCode=0 Mar 19 17:19:30 crc kubenswrapper[4918]: I0319 17:19:30.696410 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rs5r6" event={"ID":"409a1f91-aaba-48bc-85ab-af8da08dabb9","Type":"ContainerDied","Data":"39e6bdf8f48ccd24f98e5d5ce1c350e97adcca4588e91ba31aaa33d6f5966c28"} Mar 19 17:19:30 crc kubenswrapper[4918]: I0319 17:19:30.698584 4918 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 17:19:32 crc kubenswrapper[4918]: I0319 17:19:32.586293 4918 scope.go:117] "RemoveContainer" containerID="21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74" Mar 19 17:19:32 crc kubenswrapper[4918]: E0319 17:19:32.587068 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:19:32 crc 
kubenswrapper[4918]: I0319 17:19:32.729339 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rs5r6" event={"ID":"409a1f91-aaba-48bc-85ab-af8da08dabb9","Type":"ContainerStarted","Data":"5c2f8768ede3323bbd7ede72ee621d2de56f7669bfe0490f6a4f94bd204f48bd"} Mar 19 17:19:37 crc kubenswrapper[4918]: I0319 17:19:37.792723 4918 generic.go:334] "Generic (PLEG): container finished" podID="409a1f91-aaba-48bc-85ab-af8da08dabb9" containerID="5c2f8768ede3323bbd7ede72ee621d2de56f7669bfe0490f6a4f94bd204f48bd" exitCode=0 Mar 19 17:19:37 crc kubenswrapper[4918]: I0319 17:19:37.792794 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rs5r6" event={"ID":"409a1f91-aaba-48bc-85ab-af8da08dabb9","Type":"ContainerDied","Data":"5c2f8768ede3323bbd7ede72ee621d2de56f7669bfe0490f6a4f94bd204f48bd"} Mar 19 17:19:38 crc kubenswrapper[4918]: I0319 17:19:38.810425 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rs5r6" event={"ID":"409a1f91-aaba-48bc-85ab-af8da08dabb9","Type":"ContainerStarted","Data":"c3d3bf6cff8f997df5f3d490dba5294582b891945748d6b10f50b8ef217abc0c"} Mar 19 17:19:38 crc kubenswrapper[4918]: I0319 17:19:38.843732 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rs5r6" podStartSLOduration=3.228114239 podStartE2EDuration="10.843703988s" podCreationTimestamp="2026-03-19 17:19:28 +0000 UTC" firstStartedPulling="2026-03-19 17:19:30.69810026 +0000 UTC m=+2382.820299518" lastFinishedPulling="2026-03-19 17:19:38.313690019 +0000 UTC m=+2390.435889267" observedRunningTime="2026-03-19 17:19:38.83431243 +0000 UTC m=+2390.956511688" watchObservedRunningTime="2026-03-19 17:19:38.843703988 +0000 UTC m=+2390.965903256" Mar 19 17:19:39 crc kubenswrapper[4918]: I0319 17:19:39.050592 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-rs5r6" Mar 19 17:19:39 crc kubenswrapper[4918]: I0319 17:19:39.050685 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rs5r6" Mar 19 17:19:40 crc kubenswrapper[4918]: I0319 17:19:40.107929 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rs5r6" podUID="409a1f91-aaba-48bc-85ab-af8da08dabb9" containerName="registry-server" probeResult="failure" output=< Mar 19 17:19:40 crc kubenswrapper[4918]: timeout: failed to connect service ":50051" within 1s Mar 19 17:19:40 crc kubenswrapper[4918]: > Mar 19 17:19:44 crc kubenswrapper[4918]: I0319 17:19:44.587349 4918 scope.go:117] "RemoveContainer" containerID="21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74" Mar 19 17:19:44 crc kubenswrapper[4918]: E0319 17:19:44.588200 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:19:49 crc kubenswrapper[4918]: I0319 17:19:49.128781 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rs5r6" Mar 19 17:19:49 crc kubenswrapper[4918]: I0319 17:19:49.189043 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rs5r6" Mar 19 17:19:49 crc kubenswrapper[4918]: I0319 17:19:49.371954 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rs5r6"] Mar 19 17:19:50 crc kubenswrapper[4918]: I0319 17:19:50.951701 4918 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/redhat-operators-rs5r6" podUID="409a1f91-aaba-48bc-85ab-af8da08dabb9" containerName="registry-server" containerID="cri-o://c3d3bf6cff8f997df5f3d490dba5294582b891945748d6b10f50b8ef217abc0c" gracePeriod=2 Mar 19 17:19:51 crc kubenswrapper[4918]: I0319 17:19:51.488150 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rs5r6" Mar 19 17:19:51 crc kubenswrapper[4918]: I0319 17:19:51.548687 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/409a1f91-aaba-48bc-85ab-af8da08dabb9-catalog-content\") pod \"409a1f91-aaba-48bc-85ab-af8da08dabb9\" (UID: \"409a1f91-aaba-48bc-85ab-af8da08dabb9\") " Mar 19 17:19:51 crc kubenswrapper[4918]: I0319 17:19:51.548810 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/409a1f91-aaba-48bc-85ab-af8da08dabb9-utilities\") pod \"409a1f91-aaba-48bc-85ab-af8da08dabb9\" (UID: \"409a1f91-aaba-48bc-85ab-af8da08dabb9\") " Mar 19 17:19:51 crc kubenswrapper[4918]: I0319 17:19:51.548931 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frmfl\" (UniqueName: \"kubernetes.io/projected/409a1f91-aaba-48bc-85ab-af8da08dabb9-kube-api-access-frmfl\") pod \"409a1f91-aaba-48bc-85ab-af8da08dabb9\" (UID: \"409a1f91-aaba-48bc-85ab-af8da08dabb9\") " Mar 19 17:19:51 crc kubenswrapper[4918]: I0319 17:19:51.549907 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/409a1f91-aaba-48bc-85ab-af8da08dabb9-utilities" (OuterVolumeSpecName: "utilities") pod "409a1f91-aaba-48bc-85ab-af8da08dabb9" (UID: "409a1f91-aaba-48bc-85ab-af8da08dabb9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:19:51 crc kubenswrapper[4918]: I0319 17:19:51.554961 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/409a1f91-aaba-48bc-85ab-af8da08dabb9-kube-api-access-frmfl" (OuterVolumeSpecName: "kube-api-access-frmfl") pod "409a1f91-aaba-48bc-85ab-af8da08dabb9" (UID: "409a1f91-aaba-48bc-85ab-af8da08dabb9"). InnerVolumeSpecName "kube-api-access-frmfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:19:51 crc kubenswrapper[4918]: I0319 17:19:51.651792 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/409a1f91-aaba-48bc-85ab-af8da08dabb9-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 17:19:51 crc kubenswrapper[4918]: I0319 17:19:51.651921 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frmfl\" (UniqueName: \"kubernetes.io/projected/409a1f91-aaba-48bc-85ab-af8da08dabb9-kube-api-access-frmfl\") on node \"crc\" DevicePath \"\"" Mar 19 17:19:51 crc kubenswrapper[4918]: I0319 17:19:51.718174 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/409a1f91-aaba-48bc-85ab-af8da08dabb9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "409a1f91-aaba-48bc-85ab-af8da08dabb9" (UID: "409a1f91-aaba-48bc-85ab-af8da08dabb9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:19:51 crc kubenswrapper[4918]: I0319 17:19:51.753609 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/409a1f91-aaba-48bc-85ab-af8da08dabb9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 17:19:51 crc kubenswrapper[4918]: I0319 17:19:51.964386 4918 generic.go:334] "Generic (PLEG): container finished" podID="409a1f91-aaba-48bc-85ab-af8da08dabb9" containerID="c3d3bf6cff8f997df5f3d490dba5294582b891945748d6b10f50b8ef217abc0c" exitCode=0 Mar 19 17:19:51 crc kubenswrapper[4918]: I0319 17:19:51.964470 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rs5r6" event={"ID":"409a1f91-aaba-48bc-85ab-af8da08dabb9","Type":"ContainerDied","Data":"c3d3bf6cff8f997df5f3d490dba5294582b891945748d6b10f50b8ef217abc0c"} Mar 19 17:19:51 crc kubenswrapper[4918]: I0319 17:19:51.964794 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rs5r6" event={"ID":"409a1f91-aaba-48bc-85ab-af8da08dabb9","Type":"ContainerDied","Data":"0e9206f6220a09b0d224d4757fdba8357203d249d0f92be48d75978b2a411d32"} Mar 19 17:19:51 crc kubenswrapper[4918]: I0319 17:19:51.964558 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rs5r6" Mar 19 17:19:51 crc kubenswrapper[4918]: I0319 17:19:51.964825 4918 scope.go:117] "RemoveContainer" containerID="c3d3bf6cff8f997df5f3d490dba5294582b891945748d6b10f50b8ef217abc0c" Mar 19 17:19:52 crc kubenswrapper[4918]: I0319 17:19:52.004096 4918 scope.go:117] "RemoveContainer" containerID="5c2f8768ede3323bbd7ede72ee621d2de56f7669bfe0490f6a4f94bd204f48bd" Mar 19 17:19:52 crc kubenswrapper[4918]: I0319 17:19:52.041201 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rs5r6"] Mar 19 17:19:52 crc kubenswrapper[4918]: I0319 17:19:52.055114 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rs5r6"] Mar 19 17:19:52 crc kubenswrapper[4918]: I0319 17:19:52.069855 4918 scope.go:117] "RemoveContainer" containerID="39e6bdf8f48ccd24f98e5d5ce1c350e97adcca4588e91ba31aaa33d6f5966c28" Mar 19 17:19:52 crc kubenswrapper[4918]: I0319 17:19:52.096695 4918 scope.go:117] "RemoveContainer" containerID="c3d3bf6cff8f997df5f3d490dba5294582b891945748d6b10f50b8ef217abc0c" Mar 19 17:19:52 crc kubenswrapper[4918]: E0319 17:19:52.097174 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3d3bf6cff8f997df5f3d490dba5294582b891945748d6b10f50b8ef217abc0c\": container with ID starting with c3d3bf6cff8f997df5f3d490dba5294582b891945748d6b10f50b8ef217abc0c not found: ID does not exist" containerID="c3d3bf6cff8f997df5f3d490dba5294582b891945748d6b10f50b8ef217abc0c" Mar 19 17:19:52 crc kubenswrapper[4918]: I0319 17:19:52.097233 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d3bf6cff8f997df5f3d490dba5294582b891945748d6b10f50b8ef217abc0c"} err="failed to get container status \"c3d3bf6cff8f997df5f3d490dba5294582b891945748d6b10f50b8ef217abc0c\": rpc error: code = NotFound desc = could not find container 
\"c3d3bf6cff8f997df5f3d490dba5294582b891945748d6b10f50b8ef217abc0c\": container with ID starting with c3d3bf6cff8f997df5f3d490dba5294582b891945748d6b10f50b8ef217abc0c not found: ID does not exist" Mar 19 17:19:52 crc kubenswrapper[4918]: I0319 17:19:52.097270 4918 scope.go:117] "RemoveContainer" containerID="5c2f8768ede3323bbd7ede72ee621d2de56f7669bfe0490f6a4f94bd204f48bd" Mar 19 17:19:52 crc kubenswrapper[4918]: E0319 17:19:52.097795 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c2f8768ede3323bbd7ede72ee621d2de56f7669bfe0490f6a4f94bd204f48bd\": container with ID starting with 5c2f8768ede3323bbd7ede72ee621d2de56f7669bfe0490f6a4f94bd204f48bd not found: ID does not exist" containerID="5c2f8768ede3323bbd7ede72ee621d2de56f7669bfe0490f6a4f94bd204f48bd" Mar 19 17:19:52 crc kubenswrapper[4918]: I0319 17:19:52.097832 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c2f8768ede3323bbd7ede72ee621d2de56f7669bfe0490f6a4f94bd204f48bd"} err="failed to get container status \"5c2f8768ede3323bbd7ede72ee621d2de56f7669bfe0490f6a4f94bd204f48bd\": rpc error: code = NotFound desc = could not find container \"5c2f8768ede3323bbd7ede72ee621d2de56f7669bfe0490f6a4f94bd204f48bd\": container with ID starting with 5c2f8768ede3323bbd7ede72ee621d2de56f7669bfe0490f6a4f94bd204f48bd not found: ID does not exist" Mar 19 17:19:52 crc kubenswrapper[4918]: I0319 17:19:52.097853 4918 scope.go:117] "RemoveContainer" containerID="39e6bdf8f48ccd24f98e5d5ce1c350e97adcca4588e91ba31aaa33d6f5966c28" Mar 19 17:19:52 crc kubenswrapper[4918]: E0319 17:19:52.098160 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39e6bdf8f48ccd24f98e5d5ce1c350e97adcca4588e91ba31aaa33d6f5966c28\": container with ID starting with 39e6bdf8f48ccd24f98e5d5ce1c350e97adcca4588e91ba31aaa33d6f5966c28 not found: ID does not exist" 
containerID="39e6bdf8f48ccd24f98e5d5ce1c350e97adcca4588e91ba31aaa33d6f5966c28" Mar 19 17:19:52 crc kubenswrapper[4918]: I0319 17:19:52.098184 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39e6bdf8f48ccd24f98e5d5ce1c350e97adcca4588e91ba31aaa33d6f5966c28"} err="failed to get container status \"39e6bdf8f48ccd24f98e5d5ce1c350e97adcca4588e91ba31aaa33d6f5966c28\": rpc error: code = NotFound desc = could not find container \"39e6bdf8f48ccd24f98e5d5ce1c350e97adcca4588e91ba31aaa33d6f5966c28\": container with ID starting with 39e6bdf8f48ccd24f98e5d5ce1c350e97adcca4588e91ba31aaa33d6f5966c28 not found: ID does not exist" Mar 19 17:19:52 crc kubenswrapper[4918]: I0319 17:19:52.610825 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="409a1f91-aaba-48bc-85ab-af8da08dabb9" path="/var/lib/kubelet/pods/409a1f91-aaba-48bc-85ab-af8da08dabb9/volumes" Mar 19 17:19:56 crc kubenswrapper[4918]: I0319 17:19:56.587450 4918 scope.go:117] "RemoveContainer" containerID="21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74" Mar 19 17:19:56 crc kubenswrapper[4918]: E0319 17:19:56.588610 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:20:00 crc kubenswrapper[4918]: I0319 17:20:00.151086 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565680-jffxr"] Mar 19 17:20:00 crc kubenswrapper[4918]: E0319 17:20:00.152205 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="409a1f91-aaba-48bc-85ab-af8da08dabb9" containerName="registry-server" Mar 19 17:20:00 crc 
kubenswrapper[4918]: I0319 17:20:00.152218 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="409a1f91-aaba-48bc-85ab-af8da08dabb9" containerName="registry-server" Mar 19 17:20:00 crc kubenswrapper[4918]: E0319 17:20:00.152236 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="409a1f91-aaba-48bc-85ab-af8da08dabb9" containerName="extract-utilities" Mar 19 17:20:00 crc kubenswrapper[4918]: I0319 17:20:00.152244 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="409a1f91-aaba-48bc-85ab-af8da08dabb9" containerName="extract-utilities" Mar 19 17:20:00 crc kubenswrapper[4918]: E0319 17:20:00.152276 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="409a1f91-aaba-48bc-85ab-af8da08dabb9" containerName="extract-content" Mar 19 17:20:00 crc kubenswrapper[4918]: I0319 17:20:00.152283 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="409a1f91-aaba-48bc-85ab-af8da08dabb9" containerName="extract-content" Mar 19 17:20:00 crc kubenswrapper[4918]: I0319 17:20:00.152490 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="409a1f91-aaba-48bc-85ab-af8da08dabb9" containerName="registry-server" Mar 19 17:20:00 crc kubenswrapper[4918]: I0319 17:20:00.153262 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565680-jffxr" Mar 19 17:20:00 crc kubenswrapper[4918]: I0319 17:20:00.155785 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:20:00 crc kubenswrapper[4918]: I0319 17:20:00.156798 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:20:00 crc kubenswrapper[4918]: I0319 17:20:00.157657 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:20:00 crc kubenswrapper[4918]: I0319 17:20:00.174796 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565680-jffxr"] Mar 19 17:20:00 crc kubenswrapper[4918]: I0319 17:20:00.248275 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcvkr\" (UniqueName: \"kubernetes.io/projected/669c77dd-b97a-4886-8706-6cdeab3e29d6-kube-api-access-mcvkr\") pod \"auto-csr-approver-29565680-jffxr\" (UID: \"669c77dd-b97a-4886-8706-6cdeab3e29d6\") " pod="openshift-infra/auto-csr-approver-29565680-jffxr" Mar 19 17:20:00 crc kubenswrapper[4918]: I0319 17:20:00.350896 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcvkr\" (UniqueName: \"kubernetes.io/projected/669c77dd-b97a-4886-8706-6cdeab3e29d6-kube-api-access-mcvkr\") pod \"auto-csr-approver-29565680-jffxr\" (UID: \"669c77dd-b97a-4886-8706-6cdeab3e29d6\") " pod="openshift-infra/auto-csr-approver-29565680-jffxr" Mar 19 17:20:00 crc kubenswrapper[4918]: I0319 17:20:00.372378 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcvkr\" (UniqueName: \"kubernetes.io/projected/669c77dd-b97a-4886-8706-6cdeab3e29d6-kube-api-access-mcvkr\") pod \"auto-csr-approver-29565680-jffxr\" (UID: \"669c77dd-b97a-4886-8706-6cdeab3e29d6\") " 
pod="openshift-infra/auto-csr-approver-29565680-jffxr" Mar 19 17:20:00 crc kubenswrapper[4918]: I0319 17:20:00.473807 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565680-jffxr" Mar 19 17:20:00 crc kubenswrapper[4918]: I0319 17:20:00.955776 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565680-jffxr"] Mar 19 17:20:00 crc kubenswrapper[4918]: W0319 17:20:00.959122 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod669c77dd_b97a_4886_8706_6cdeab3e29d6.slice/crio-c08f0626d3f450d0e1be5fffb34f3d537157b31050991aff3da3eb645458d619 WatchSource:0}: Error finding container c08f0626d3f450d0e1be5fffb34f3d537157b31050991aff3da3eb645458d619: Status 404 returned error can't find the container with id c08f0626d3f450d0e1be5fffb34f3d537157b31050991aff3da3eb645458d619 Mar 19 17:20:01 crc kubenswrapper[4918]: I0319 17:20:01.262856 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565680-jffxr" event={"ID":"669c77dd-b97a-4886-8706-6cdeab3e29d6","Type":"ContainerStarted","Data":"c08f0626d3f450d0e1be5fffb34f3d537157b31050991aff3da3eb645458d619"} Mar 19 17:20:03 crc kubenswrapper[4918]: I0319 17:20:03.284052 4918 generic.go:334] "Generic (PLEG): container finished" podID="669c77dd-b97a-4886-8706-6cdeab3e29d6" containerID="89922f68d6efe6770f0c523005156cc36a073de43077f58c0c9987a8e0246a7b" exitCode=0 Mar 19 17:20:03 crc kubenswrapper[4918]: I0319 17:20:03.284479 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565680-jffxr" event={"ID":"669c77dd-b97a-4886-8706-6cdeab3e29d6","Type":"ContainerDied","Data":"89922f68d6efe6770f0c523005156cc36a073de43077f58c0c9987a8e0246a7b"} Mar 19 17:20:04 crc kubenswrapper[4918]: I0319 17:20:04.724987 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565680-jffxr" Mar 19 17:20:04 crc kubenswrapper[4918]: I0319 17:20:04.845873 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcvkr\" (UniqueName: \"kubernetes.io/projected/669c77dd-b97a-4886-8706-6cdeab3e29d6-kube-api-access-mcvkr\") pod \"669c77dd-b97a-4886-8706-6cdeab3e29d6\" (UID: \"669c77dd-b97a-4886-8706-6cdeab3e29d6\") " Mar 19 17:20:04 crc kubenswrapper[4918]: I0319 17:20:04.851130 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/669c77dd-b97a-4886-8706-6cdeab3e29d6-kube-api-access-mcvkr" (OuterVolumeSpecName: "kube-api-access-mcvkr") pod "669c77dd-b97a-4886-8706-6cdeab3e29d6" (UID: "669c77dd-b97a-4886-8706-6cdeab3e29d6"). InnerVolumeSpecName "kube-api-access-mcvkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:20:04 crc kubenswrapper[4918]: I0319 17:20:04.948329 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcvkr\" (UniqueName: \"kubernetes.io/projected/669c77dd-b97a-4886-8706-6cdeab3e29d6-kube-api-access-mcvkr\") on node \"crc\" DevicePath \"\"" Mar 19 17:20:05 crc kubenswrapper[4918]: I0319 17:20:05.307245 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565680-jffxr" event={"ID":"669c77dd-b97a-4886-8706-6cdeab3e29d6","Type":"ContainerDied","Data":"c08f0626d3f450d0e1be5fffb34f3d537157b31050991aff3da3eb645458d619"} Mar 19 17:20:05 crc kubenswrapper[4918]: I0319 17:20:05.307287 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c08f0626d3f450d0e1be5fffb34f3d537157b31050991aff3da3eb645458d619" Mar 19 17:20:05 crc kubenswrapper[4918]: I0319 17:20:05.307378 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565680-jffxr" Mar 19 17:20:05 crc kubenswrapper[4918]: I0319 17:20:05.804638 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565674-kcgq5"] Mar 19 17:20:05 crc kubenswrapper[4918]: I0319 17:20:05.816061 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565674-kcgq5"] Mar 19 17:20:06 crc kubenswrapper[4918]: I0319 17:20:06.615573 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4012bc5-f224-4c07-aca2-334f6137ed4b" path="/var/lib/kubelet/pods/e4012bc5-f224-4c07-aca2-334f6137ed4b/volumes" Mar 19 17:20:11 crc kubenswrapper[4918]: I0319 17:20:11.587069 4918 scope.go:117] "RemoveContainer" containerID="21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74" Mar 19 17:20:11 crc kubenswrapper[4918]: E0319 17:20:11.587957 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:20:24 crc kubenswrapper[4918]: I0319 17:20:24.587011 4918 scope.go:117] "RemoveContainer" containerID="21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74" Mar 19 17:20:24 crc kubenswrapper[4918]: E0319 17:20:24.588254 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" 
podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:20:38 crc kubenswrapper[4918]: I0319 17:20:38.596468 4918 scope.go:117] "RemoveContainer" containerID="21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74" Mar 19 17:20:38 crc kubenswrapper[4918]: E0319 17:20:38.597709 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:20:52 crc kubenswrapper[4918]: I0319 17:20:52.587665 4918 scope.go:117] "RemoveContainer" containerID="21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74" Mar 19 17:20:52 crc kubenswrapper[4918]: E0319 17:20:52.589244 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:21:05 crc kubenswrapper[4918]: I0319 17:21:05.509966 4918 scope.go:117] "RemoveContainer" containerID="3aeef489e324ffeb7922cd7d0b4867896990f6941942a572d14c42e058b43290" Mar 19 17:21:07 crc kubenswrapper[4918]: I0319 17:21:07.587414 4918 scope.go:117] "RemoveContainer" containerID="21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74" Mar 19 17:21:07 crc kubenswrapper[4918]: E0319 17:21:07.588126 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:21:22 crc kubenswrapper[4918]: I0319 17:21:22.586853 4918 scope.go:117] "RemoveContainer" containerID="21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74" Mar 19 17:21:22 crc kubenswrapper[4918]: E0319 17:21:22.588017 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:21:33 crc kubenswrapper[4918]: I0319 17:21:33.587758 4918 scope.go:117] "RemoveContainer" containerID="21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74" Mar 19 17:21:33 crc kubenswrapper[4918]: E0319 17:21:33.588959 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:21:36 crc kubenswrapper[4918]: I0319 17:21:36.377587 4918 generic.go:334] "Generic (PLEG): container finished" podID="9cc96cf8-b975-4f49-8032-bb1d31580e7b" containerID="9f0e20457351871a003e0dff1b43a4b4883500d7cf766459355369e34e2e51ba" exitCode=0 Mar 19 17:21:36 crc kubenswrapper[4918]: I0319 17:21:36.377713 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx" event={"ID":"9cc96cf8-b975-4f49-8032-bb1d31580e7b","Type":"ContainerDied","Data":"9f0e20457351871a003e0dff1b43a4b4883500d7cf766459355369e34e2e51ba"} Mar 19 17:21:37 crc kubenswrapper[4918]: I0319 17:21:37.889085 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.009755 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9cc96cf8-b975-4f49-8032-bb1d31580e7b-libvirt-secret-0\") pod \"9cc96cf8-b975-4f49-8032-bb1d31580e7b\" (UID: \"9cc96cf8-b975-4f49-8032-bb1d31580e7b\") " Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.009818 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8828m\" (UniqueName: \"kubernetes.io/projected/9cc96cf8-b975-4f49-8032-bb1d31580e7b-kube-api-access-8828m\") pod \"9cc96cf8-b975-4f49-8032-bb1d31580e7b\" (UID: \"9cc96cf8-b975-4f49-8032-bb1d31580e7b\") " Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.009992 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cc96cf8-b975-4f49-8032-bb1d31580e7b-ssh-key-openstack-edpm-ipam\") pod \"9cc96cf8-b975-4f49-8032-bb1d31580e7b\" (UID: \"9cc96cf8-b975-4f49-8032-bb1d31580e7b\") " Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.010053 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cc96cf8-b975-4f49-8032-bb1d31580e7b-libvirt-combined-ca-bundle\") pod \"9cc96cf8-b975-4f49-8032-bb1d31580e7b\" (UID: \"9cc96cf8-b975-4f49-8032-bb1d31580e7b\") " Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.010107 4918 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cc96cf8-b975-4f49-8032-bb1d31580e7b-inventory\") pod \"9cc96cf8-b975-4f49-8032-bb1d31580e7b\" (UID: \"9cc96cf8-b975-4f49-8032-bb1d31580e7b\") " Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.015334 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cc96cf8-b975-4f49-8032-bb1d31580e7b-kube-api-access-8828m" (OuterVolumeSpecName: "kube-api-access-8828m") pod "9cc96cf8-b975-4f49-8032-bb1d31580e7b" (UID: "9cc96cf8-b975-4f49-8032-bb1d31580e7b"). InnerVolumeSpecName "kube-api-access-8828m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.015938 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cc96cf8-b975-4f49-8032-bb1d31580e7b-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "9cc96cf8-b975-4f49-8032-bb1d31580e7b" (UID: "9cc96cf8-b975-4f49-8032-bb1d31580e7b"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.044730 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cc96cf8-b975-4f49-8032-bb1d31580e7b-inventory" (OuterVolumeSpecName: "inventory") pod "9cc96cf8-b975-4f49-8032-bb1d31580e7b" (UID: "9cc96cf8-b975-4f49-8032-bb1d31580e7b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.048151 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cc96cf8-b975-4f49-8032-bb1d31580e7b-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "9cc96cf8-b975-4f49-8032-bb1d31580e7b" (UID: "9cc96cf8-b975-4f49-8032-bb1d31580e7b"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.074069 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cc96cf8-b975-4f49-8032-bb1d31580e7b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9cc96cf8-b975-4f49-8032-bb1d31580e7b" (UID: "9cc96cf8-b975-4f49-8032-bb1d31580e7b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.112997 4918 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cc96cf8-b975-4f49-8032-bb1d31580e7b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.113388 4918 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cc96cf8-b975-4f49-8032-bb1d31580e7b-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.113432 4918 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cc96cf8-b975-4f49-8032-bb1d31580e7b-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.113475 4918 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9cc96cf8-b975-4f49-8032-bb1d31580e7b-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.113487 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8828m\" (UniqueName: \"kubernetes.io/projected/9cc96cf8-b975-4f49-8032-bb1d31580e7b-kube-api-access-8828m\") on node \"crc\" DevicePath \"\"" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.399512 4918 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx" event={"ID":"9cc96cf8-b975-4f49-8032-bb1d31580e7b","Type":"ContainerDied","Data":"06bf81c5cb05d024a475287c48cddaf9e442d77fc2a3e045d748909c1f2eff2a"} Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.399593 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06bf81c5cb05d024a475287c48cddaf9e442d77fc2a3e045d748909c1f2eff2a" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.399999 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.523726 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5"] Mar 19 17:21:38 crc kubenswrapper[4918]: E0319 17:21:38.524306 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cc96cf8-b975-4f49-8032-bb1d31580e7b" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.524332 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc96cf8-b975-4f49-8032-bb1d31580e7b" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 19 17:21:38 crc kubenswrapper[4918]: E0319 17:21:38.524362 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="669c77dd-b97a-4886-8706-6cdeab3e29d6" containerName="oc" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.524371 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="669c77dd-b97a-4886-8706-6cdeab3e29d6" containerName="oc" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.524701 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cc96cf8-b975-4f49-8032-bb1d31580e7b" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.524729 4918 
memory_manager.go:354] "RemoveStaleState removing state" podUID="669c77dd-b97a-4886-8706-6cdeab3e29d6" containerName="oc" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.525692 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.529844 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.530207 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4jldg" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.530790 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.532234 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.532690 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.532942 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.532981 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.537159 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5"] Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.623236 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.623558 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.623662 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.623860 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.624000 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: 
\"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.624156 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndlm4\" (UniqueName: \"kubernetes.io/projected/e21d159f-fdc3-48bf-b40b-5bda64316b5e-kube-api-access-ndlm4\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.624257 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.624324 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.624410 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: 
I0319 17:21:38.624544 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.624627 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.726828 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.726933 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.726979 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.727000 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.727059 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.727125 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.727160 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndlm4\" (UniqueName: \"kubernetes.io/projected/e21d159f-fdc3-48bf-b40b-5bda64316b5e-kube-api-access-ndlm4\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.727192 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.727212 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.727230 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.727292 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.729223 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.731463 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.731707 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.732024 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.732445 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.732666 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.733850 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.734150 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.734462 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.735147 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.754032 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndlm4\" (UniqueName: \"kubernetes.io/projected/e21d159f-fdc3-48bf-b40b-5bda64316b5e-kube-api-access-ndlm4\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cbbl5\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:38 crc kubenswrapper[4918]: I0319 17:21:38.850037 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:21:39 crc kubenswrapper[4918]: I0319 17:21:39.461089 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5"] Mar 19 17:21:40 crc kubenswrapper[4918]: I0319 17:21:40.425803 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" event={"ID":"e21d159f-fdc3-48bf-b40b-5bda64316b5e","Type":"ContainerStarted","Data":"0a119c23924a731c2cfc52791a81ae7be39ca2d90d6d902bd4f272a9ffb2e592"} Mar 19 17:21:40 crc kubenswrapper[4918]: I0319 17:21:40.426168 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" event={"ID":"e21d159f-fdc3-48bf-b40b-5bda64316b5e","Type":"ContainerStarted","Data":"186b31e6a4f69b1c0cc3daae15790e160acc474d00ff04fb491bf4f656f955d4"} Mar 19 17:21:40 crc kubenswrapper[4918]: I0319 17:21:40.461642 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" podStartSLOduration=2.027730405 
podStartE2EDuration="2.461615801s" podCreationTimestamp="2026-03-19 17:21:38 +0000 UTC" firstStartedPulling="2026-03-19 17:21:39.457100038 +0000 UTC m=+2511.579299306" lastFinishedPulling="2026-03-19 17:21:39.890985444 +0000 UTC m=+2512.013184702" observedRunningTime="2026-03-19 17:21:40.448870866 +0000 UTC m=+2512.571070124" watchObservedRunningTime="2026-03-19 17:21:40.461615801 +0000 UTC m=+2512.583815069" Mar 19 17:21:46 crc kubenswrapper[4918]: I0319 17:21:46.587104 4918 scope.go:117] "RemoveContainer" containerID="21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74" Mar 19 17:21:46 crc kubenswrapper[4918]: E0319 17:21:46.587946 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:22:00 crc kubenswrapper[4918]: I0319 17:22:00.153153 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565682-mv97p"] Mar 19 17:22:00 crc kubenswrapper[4918]: I0319 17:22:00.155439 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565682-mv97p" Mar 19 17:22:00 crc kubenswrapper[4918]: I0319 17:22:00.158141 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:22:00 crc kubenswrapper[4918]: I0319 17:22:00.158502 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:22:00 crc kubenswrapper[4918]: I0319 17:22:00.158906 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:22:00 crc kubenswrapper[4918]: I0319 17:22:00.171180 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565682-mv97p"] Mar 19 17:22:00 crc kubenswrapper[4918]: I0319 17:22:00.280906 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6qvx\" (UniqueName: \"kubernetes.io/projected/fe05a8c7-498f-468f-9b97-535017a6c66c-kube-api-access-x6qvx\") pod \"auto-csr-approver-29565682-mv97p\" (UID: \"fe05a8c7-498f-468f-9b97-535017a6c66c\") " pod="openshift-infra/auto-csr-approver-29565682-mv97p" Mar 19 17:22:00 crc kubenswrapper[4918]: I0319 17:22:00.383018 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6qvx\" (UniqueName: \"kubernetes.io/projected/fe05a8c7-498f-468f-9b97-535017a6c66c-kube-api-access-x6qvx\") pod \"auto-csr-approver-29565682-mv97p\" (UID: \"fe05a8c7-498f-468f-9b97-535017a6c66c\") " pod="openshift-infra/auto-csr-approver-29565682-mv97p" Mar 19 17:22:00 crc kubenswrapper[4918]: I0319 17:22:00.410571 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6qvx\" (UniqueName: \"kubernetes.io/projected/fe05a8c7-498f-468f-9b97-535017a6c66c-kube-api-access-x6qvx\") pod \"auto-csr-approver-29565682-mv97p\" (UID: \"fe05a8c7-498f-468f-9b97-535017a6c66c\") " 
pod="openshift-infra/auto-csr-approver-29565682-mv97p" Mar 19 17:22:00 crc kubenswrapper[4918]: I0319 17:22:00.475402 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565682-mv97p" Mar 19 17:22:01 crc kubenswrapper[4918]: I0319 17:22:01.002006 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565682-mv97p"] Mar 19 17:22:01 crc kubenswrapper[4918]: I0319 17:22:01.586777 4918 scope.go:117] "RemoveContainer" containerID="21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74" Mar 19 17:22:01 crc kubenswrapper[4918]: E0319 17:22:01.588190 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:22:01 crc kubenswrapper[4918]: I0319 17:22:01.664395 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565682-mv97p" event={"ID":"fe05a8c7-498f-468f-9b97-535017a6c66c","Type":"ContainerStarted","Data":"b1622a71b08ed05ab6e2db0a57d350713a9e64591a1d8186f4afb441114ae346"} Mar 19 17:22:02 crc kubenswrapper[4918]: I0319 17:22:02.675614 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565682-mv97p" event={"ID":"fe05a8c7-498f-468f-9b97-535017a6c66c","Type":"ContainerStarted","Data":"2faf5aff0f250f6cef4aad2075ef1c275b08984a26fd3ae204422a23708c346f"} Mar 19 17:22:02 crc kubenswrapper[4918]: I0319 17:22:02.699045 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565682-mv97p" podStartSLOduration=1.522235931 
podStartE2EDuration="2.699021001s" podCreationTimestamp="2026-03-19 17:22:00 +0000 UTC" firstStartedPulling="2026-03-19 17:22:01.00637032 +0000 UTC m=+2533.128569568" lastFinishedPulling="2026-03-19 17:22:02.18315539 +0000 UTC m=+2534.305354638" observedRunningTime="2026-03-19 17:22:02.689155673 +0000 UTC m=+2534.811354921" watchObservedRunningTime="2026-03-19 17:22:02.699021001 +0000 UTC m=+2534.821220259" Mar 19 17:22:03 crc kubenswrapper[4918]: I0319 17:22:03.686142 4918 generic.go:334] "Generic (PLEG): container finished" podID="fe05a8c7-498f-468f-9b97-535017a6c66c" containerID="2faf5aff0f250f6cef4aad2075ef1c275b08984a26fd3ae204422a23708c346f" exitCode=0 Mar 19 17:22:03 crc kubenswrapper[4918]: I0319 17:22:03.686365 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565682-mv97p" event={"ID":"fe05a8c7-498f-468f-9b97-535017a6c66c","Type":"ContainerDied","Data":"2faf5aff0f250f6cef4aad2075ef1c275b08984a26fd3ae204422a23708c346f"} Mar 19 17:22:05 crc kubenswrapper[4918]: I0319 17:22:05.159315 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565682-mv97p" Mar 19 17:22:05 crc kubenswrapper[4918]: I0319 17:22:05.291007 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6qvx\" (UniqueName: \"kubernetes.io/projected/fe05a8c7-498f-468f-9b97-535017a6c66c-kube-api-access-x6qvx\") pod \"fe05a8c7-498f-468f-9b97-535017a6c66c\" (UID: \"fe05a8c7-498f-468f-9b97-535017a6c66c\") " Mar 19 17:22:05 crc kubenswrapper[4918]: I0319 17:22:05.297806 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe05a8c7-498f-468f-9b97-535017a6c66c-kube-api-access-x6qvx" (OuterVolumeSpecName: "kube-api-access-x6qvx") pod "fe05a8c7-498f-468f-9b97-535017a6c66c" (UID: "fe05a8c7-498f-468f-9b97-535017a6c66c"). InnerVolumeSpecName "kube-api-access-x6qvx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:22:05 crc kubenswrapper[4918]: I0319 17:22:05.393955 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6qvx\" (UniqueName: \"kubernetes.io/projected/fe05a8c7-498f-468f-9b97-535017a6c66c-kube-api-access-x6qvx\") on node \"crc\" DevicePath \"\"" Mar 19 17:22:05 crc kubenswrapper[4918]: I0319 17:22:05.709030 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565682-mv97p" event={"ID":"fe05a8c7-498f-468f-9b97-535017a6c66c","Type":"ContainerDied","Data":"b1622a71b08ed05ab6e2db0a57d350713a9e64591a1d8186f4afb441114ae346"} Mar 19 17:22:05 crc kubenswrapper[4918]: I0319 17:22:05.709082 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1622a71b08ed05ab6e2db0a57d350713a9e64591a1d8186f4afb441114ae346" Mar 19 17:22:05 crc kubenswrapper[4918]: I0319 17:22:05.709146 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565682-mv97p" Mar 19 17:22:05 crc kubenswrapper[4918]: I0319 17:22:05.774011 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565676-c6kk8"] Mar 19 17:22:05 crc kubenswrapper[4918]: I0319 17:22:05.785544 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565676-c6kk8"] Mar 19 17:22:06 crc kubenswrapper[4918]: I0319 17:22:06.602342 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10685f39-0b1e-4912-812f-39193409fdbd" path="/var/lib/kubelet/pods/10685f39-0b1e-4912-812f-39193409fdbd/volumes" Mar 19 17:22:13 crc kubenswrapper[4918]: I0319 17:22:13.586906 4918 scope.go:117] "RemoveContainer" containerID="21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74" Mar 19 17:22:13 crc kubenswrapper[4918]: E0319 17:22:13.588112 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:22:24 crc kubenswrapper[4918]: I0319 17:22:24.587744 4918 scope.go:117] "RemoveContainer" containerID="21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74" Mar 19 17:22:24 crc kubenswrapper[4918]: E0319 17:22:24.588778 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:22:39 crc kubenswrapper[4918]: I0319 17:22:39.586716 4918 scope.go:117] "RemoveContainer" containerID="21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74" Mar 19 17:22:40 crc kubenswrapper[4918]: I0319 17:22:40.126236 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerStarted","Data":"c98bb004b2c85bfe02cdd8cceb684417f8e23df7690967e86e647f83c8c1c57f"} Mar 19 17:23:05 crc kubenswrapper[4918]: I0319 17:23:05.671329 4918 scope.go:117] "RemoveContainer" containerID="6674d7d5abca04ca2789cb15c2bb77ced4a522eb3572a3300465646d8837e94a" Mar 19 17:23:11 crc kubenswrapper[4918]: I0319 17:23:11.868181 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f8r4g"] Mar 19 17:23:11 crc kubenswrapper[4918]: E0319 17:23:11.869138 4918 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="fe05a8c7-498f-468f-9b97-535017a6c66c" containerName="oc" Mar 19 17:23:11 crc kubenswrapper[4918]: I0319 17:23:11.869149 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe05a8c7-498f-468f-9b97-535017a6c66c" containerName="oc" Mar 19 17:23:11 crc kubenswrapper[4918]: I0319 17:23:11.869356 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe05a8c7-498f-468f-9b97-535017a6c66c" containerName="oc" Mar 19 17:23:11 crc kubenswrapper[4918]: I0319 17:23:11.870935 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f8r4g" Mar 19 17:23:11 crc kubenswrapper[4918]: I0319 17:23:11.887153 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f8r4g"] Mar 19 17:23:11 crc kubenswrapper[4918]: I0319 17:23:11.973572 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb35908d-225b-4d53-9545-fbae9b327616-utilities\") pod \"community-operators-f8r4g\" (UID: \"eb35908d-225b-4d53-9545-fbae9b327616\") " pod="openshift-marketplace/community-operators-f8r4g" Mar 19 17:23:11 crc kubenswrapper[4918]: I0319 17:23:11.973629 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb35908d-225b-4d53-9545-fbae9b327616-catalog-content\") pod \"community-operators-f8r4g\" (UID: \"eb35908d-225b-4d53-9545-fbae9b327616\") " pod="openshift-marketplace/community-operators-f8r4g" Mar 19 17:23:11 crc kubenswrapper[4918]: I0319 17:23:11.973662 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wgzx\" (UniqueName: \"kubernetes.io/projected/eb35908d-225b-4d53-9545-fbae9b327616-kube-api-access-8wgzx\") pod \"community-operators-f8r4g\" (UID: 
\"eb35908d-225b-4d53-9545-fbae9b327616\") " pod="openshift-marketplace/community-operators-f8r4g" Mar 19 17:23:12 crc kubenswrapper[4918]: I0319 17:23:12.075659 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb35908d-225b-4d53-9545-fbae9b327616-utilities\") pod \"community-operators-f8r4g\" (UID: \"eb35908d-225b-4d53-9545-fbae9b327616\") " pod="openshift-marketplace/community-operators-f8r4g" Mar 19 17:23:12 crc kubenswrapper[4918]: I0319 17:23:12.075717 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb35908d-225b-4d53-9545-fbae9b327616-catalog-content\") pod \"community-operators-f8r4g\" (UID: \"eb35908d-225b-4d53-9545-fbae9b327616\") " pod="openshift-marketplace/community-operators-f8r4g" Mar 19 17:23:12 crc kubenswrapper[4918]: I0319 17:23:12.075751 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wgzx\" (UniqueName: \"kubernetes.io/projected/eb35908d-225b-4d53-9545-fbae9b327616-kube-api-access-8wgzx\") pod \"community-operators-f8r4g\" (UID: \"eb35908d-225b-4d53-9545-fbae9b327616\") " pod="openshift-marketplace/community-operators-f8r4g" Mar 19 17:23:12 crc kubenswrapper[4918]: I0319 17:23:12.076297 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb35908d-225b-4d53-9545-fbae9b327616-utilities\") pod \"community-operators-f8r4g\" (UID: \"eb35908d-225b-4d53-9545-fbae9b327616\") " pod="openshift-marketplace/community-operators-f8r4g" Mar 19 17:23:12 crc kubenswrapper[4918]: I0319 17:23:12.076309 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb35908d-225b-4d53-9545-fbae9b327616-catalog-content\") pod \"community-operators-f8r4g\" (UID: \"eb35908d-225b-4d53-9545-fbae9b327616\") 
" pod="openshift-marketplace/community-operators-f8r4g" Mar 19 17:23:12 crc kubenswrapper[4918]: I0319 17:23:12.098811 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wgzx\" (UniqueName: \"kubernetes.io/projected/eb35908d-225b-4d53-9545-fbae9b327616-kube-api-access-8wgzx\") pod \"community-operators-f8r4g\" (UID: \"eb35908d-225b-4d53-9545-fbae9b327616\") " pod="openshift-marketplace/community-operators-f8r4g" Mar 19 17:23:12 crc kubenswrapper[4918]: I0319 17:23:12.205589 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f8r4g" Mar 19 17:23:12 crc kubenswrapper[4918]: I0319 17:23:12.761848 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f8r4g"] Mar 19 17:23:13 crc kubenswrapper[4918]: I0319 17:23:13.508535 4918 generic.go:334] "Generic (PLEG): container finished" podID="eb35908d-225b-4d53-9545-fbae9b327616" containerID="9e2b74237437c105f98ace1fac6a1e1b5d32669cf19de648d02bda86c1409e42" exitCode=0 Mar 19 17:23:13 crc kubenswrapper[4918]: I0319 17:23:13.508635 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8r4g" event={"ID":"eb35908d-225b-4d53-9545-fbae9b327616","Type":"ContainerDied","Data":"9e2b74237437c105f98ace1fac6a1e1b5d32669cf19de648d02bda86c1409e42"} Mar 19 17:23:13 crc kubenswrapper[4918]: I0319 17:23:13.508852 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8r4g" event={"ID":"eb35908d-225b-4d53-9545-fbae9b327616","Type":"ContainerStarted","Data":"50cf3a75a157a4d196489d20d31d48dfa50b86d5d7a275ab805a5d9de7d09e87"} Mar 19 17:23:14 crc kubenswrapper[4918]: I0319 17:23:14.522410 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8r4g" 
event={"ID":"eb35908d-225b-4d53-9545-fbae9b327616","Type":"ContainerStarted","Data":"e9fb9e353348d5b1141e3d2aa47fa1c4daf0cb6a33015542386b6f73fd5c7430"} Mar 19 17:23:16 crc kubenswrapper[4918]: I0319 17:23:16.542164 4918 generic.go:334] "Generic (PLEG): container finished" podID="eb35908d-225b-4d53-9545-fbae9b327616" containerID="e9fb9e353348d5b1141e3d2aa47fa1c4daf0cb6a33015542386b6f73fd5c7430" exitCode=0 Mar 19 17:23:16 crc kubenswrapper[4918]: I0319 17:23:16.542287 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8r4g" event={"ID":"eb35908d-225b-4d53-9545-fbae9b327616","Type":"ContainerDied","Data":"e9fb9e353348d5b1141e3d2aa47fa1c4daf0cb6a33015542386b6f73fd5c7430"} Mar 19 17:23:17 crc kubenswrapper[4918]: I0319 17:23:17.559368 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8r4g" event={"ID":"eb35908d-225b-4d53-9545-fbae9b327616","Type":"ContainerStarted","Data":"0e631bb0ab67025433aecc13c1b9becb3d0c49f67f2aaf8ad6562573b0114996"} Mar 19 17:23:17 crc kubenswrapper[4918]: I0319 17:23:17.584088 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f8r4g" podStartSLOduration=3.164403249 podStartE2EDuration="6.584066934s" podCreationTimestamp="2026-03-19 17:23:11 +0000 UTC" firstStartedPulling="2026-03-19 17:23:13.511185991 +0000 UTC m=+2605.633385239" lastFinishedPulling="2026-03-19 17:23:16.930849676 +0000 UTC m=+2609.053048924" observedRunningTime="2026-03-19 17:23:17.580496208 +0000 UTC m=+2609.702695446" watchObservedRunningTime="2026-03-19 17:23:17.584066934 +0000 UTC m=+2609.706266202" Mar 19 17:23:22 crc kubenswrapper[4918]: I0319 17:23:22.205794 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f8r4g" Mar 19 17:23:22 crc kubenswrapper[4918]: I0319 17:23:22.206648 4918 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-f8r4g" Mar 19 17:23:22 crc kubenswrapper[4918]: I0319 17:23:22.285207 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f8r4g" Mar 19 17:23:22 crc kubenswrapper[4918]: I0319 17:23:22.673991 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f8r4g" Mar 19 17:23:22 crc kubenswrapper[4918]: I0319 17:23:22.731681 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f8r4g"] Mar 19 17:23:24 crc kubenswrapper[4918]: I0319 17:23:24.645077 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f8r4g" podUID="eb35908d-225b-4d53-9545-fbae9b327616" containerName="registry-server" containerID="cri-o://0e631bb0ab67025433aecc13c1b9becb3d0c49f67f2aaf8ad6562573b0114996" gracePeriod=2 Mar 19 17:23:25 crc kubenswrapper[4918]: I0319 17:23:25.655900 4918 generic.go:334] "Generic (PLEG): container finished" podID="eb35908d-225b-4d53-9545-fbae9b327616" containerID="0e631bb0ab67025433aecc13c1b9becb3d0c49f67f2aaf8ad6562573b0114996" exitCode=0 Mar 19 17:23:25 crc kubenswrapper[4918]: I0319 17:23:25.655955 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8r4g" event={"ID":"eb35908d-225b-4d53-9545-fbae9b327616","Type":"ContainerDied","Data":"0e631bb0ab67025433aecc13c1b9becb3d0c49f67f2aaf8ad6562573b0114996"} Mar 19 17:23:25 crc kubenswrapper[4918]: I0319 17:23:25.656270 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8r4g" event={"ID":"eb35908d-225b-4d53-9545-fbae9b327616","Type":"ContainerDied","Data":"50cf3a75a157a4d196489d20d31d48dfa50b86d5d7a275ab805a5d9de7d09e87"} Mar 19 17:23:25 crc kubenswrapper[4918]: I0319 17:23:25.656290 4918 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50cf3a75a157a4d196489d20d31d48dfa50b86d5d7a275ab805a5d9de7d09e87" Mar 19 17:23:25 crc kubenswrapper[4918]: I0319 17:23:25.685782 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f8r4g" Mar 19 17:23:25 crc kubenswrapper[4918]: I0319 17:23:25.725722 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb35908d-225b-4d53-9545-fbae9b327616-utilities\") pod \"eb35908d-225b-4d53-9545-fbae9b327616\" (UID: \"eb35908d-225b-4d53-9545-fbae9b327616\") " Mar 19 17:23:25 crc kubenswrapper[4918]: I0319 17:23:25.725921 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wgzx\" (UniqueName: \"kubernetes.io/projected/eb35908d-225b-4d53-9545-fbae9b327616-kube-api-access-8wgzx\") pod \"eb35908d-225b-4d53-9545-fbae9b327616\" (UID: \"eb35908d-225b-4d53-9545-fbae9b327616\") " Mar 19 17:23:25 crc kubenswrapper[4918]: I0319 17:23:25.728852 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb35908d-225b-4d53-9545-fbae9b327616-utilities" (OuterVolumeSpecName: "utilities") pod "eb35908d-225b-4d53-9545-fbae9b327616" (UID: "eb35908d-225b-4d53-9545-fbae9b327616"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:23:25 crc kubenswrapper[4918]: I0319 17:23:25.737051 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb35908d-225b-4d53-9545-fbae9b327616-catalog-content\") pod \"eb35908d-225b-4d53-9545-fbae9b327616\" (UID: \"eb35908d-225b-4d53-9545-fbae9b327616\") " Mar 19 17:23:25 crc kubenswrapper[4918]: I0319 17:23:25.738653 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb35908d-225b-4d53-9545-fbae9b327616-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 17:23:25 crc kubenswrapper[4918]: I0319 17:23:25.739801 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb35908d-225b-4d53-9545-fbae9b327616-kube-api-access-8wgzx" (OuterVolumeSpecName: "kube-api-access-8wgzx") pod "eb35908d-225b-4d53-9545-fbae9b327616" (UID: "eb35908d-225b-4d53-9545-fbae9b327616"). InnerVolumeSpecName "kube-api-access-8wgzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:23:25 crc kubenswrapper[4918]: I0319 17:23:25.800005 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb35908d-225b-4d53-9545-fbae9b327616-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb35908d-225b-4d53-9545-fbae9b327616" (UID: "eb35908d-225b-4d53-9545-fbae9b327616"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:23:25 crc kubenswrapper[4918]: I0319 17:23:25.840888 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wgzx\" (UniqueName: \"kubernetes.io/projected/eb35908d-225b-4d53-9545-fbae9b327616-kube-api-access-8wgzx\") on node \"crc\" DevicePath \"\"" Mar 19 17:23:25 crc kubenswrapper[4918]: I0319 17:23:25.841143 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb35908d-225b-4d53-9545-fbae9b327616-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 17:23:26 crc kubenswrapper[4918]: I0319 17:23:26.670966 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f8r4g" Mar 19 17:23:26 crc kubenswrapper[4918]: I0319 17:23:26.701931 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f8r4g"] Mar 19 17:23:26 crc kubenswrapper[4918]: I0319 17:23:26.719021 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f8r4g"] Mar 19 17:23:28 crc kubenswrapper[4918]: I0319 17:23:28.603632 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb35908d-225b-4d53-9545-fbae9b327616" path="/var/lib/kubelet/pods/eb35908d-225b-4d53-9545-fbae9b327616/volumes" Mar 19 17:24:00 crc kubenswrapper[4918]: I0319 17:24:00.152873 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565684-fm8j8"] Mar 19 17:24:00 crc kubenswrapper[4918]: E0319 17:24:00.155131 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb35908d-225b-4d53-9545-fbae9b327616" containerName="registry-server" Mar 19 17:24:00 crc kubenswrapper[4918]: I0319 17:24:00.155255 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb35908d-225b-4d53-9545-fbae9b327616" containerName="registry-server" Mar 19 17:24:00 crc 
kubenswrapper[4918]: E0319 17:24:00.155361 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb35908d-225b-4d53-9545-fbae9b327616" containerName="extract-content" Mar 19 17:24:00 crc kubenswrapper[4918]: I0319 17:24:00.155435 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb35908d-225b-4d53-9545-fbae9b327616" containerName="extract-content" Mar 19 17:24:00 crc kubenswrapper[4918]: E0319 17:24:00.155556 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb35908d-225b-4d53-9545-fbae9b327616" containerName="extract-utilities" Mar 19 17:24:00 crc kubenswrapper[4918]: I0319 17:24:00.155635 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb35908d-225b-4d53-9545-fbae9b327616" containerName="extract-utilities" Mar 19 17:24:00 crc kubenswrapper[4918]: I0319 17:24:00.155999 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb35908d-225b-4d53-9545-fbae9b327616" containerName="registry-server" Mar 19 17:24:00 crc kubenswrapper[4918]: I0319 17:24:00.157163 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565684-fm8j8" Mar 19 17:24:00 crc kubenswrapper[4918]: I0319 17:24:00.159584 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:24:00 crc kubenswrapper[4918]: I0319 17:24:00.159674 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:24:00 crc kubenswrapper[4918]: I0319 17:24:00.162817 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565684-fm8j8"] Mar 19 17:24:00 crc kubenswrapper[4918]: I0319 17:24:00.165820 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:24:00 crc kubenswrapper[4918]: I0319 17:24:00.314068 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cjz8\" (UniqueName: \"kubernetes.io/projected/083dfc61-7262-4798-8446-6539ad70e6c7-kube-api-access-7cjz8\") pod \"auto-csr-approver-29565684-fm8j8\" (UID: \"083dfc61-7262-4798-8446-6539ad70e6c7\") " pod="openshift-infra/auto-csr-approver-29565684-fm8j8" Mar 19 17:24:00 crc kubenswrapper[4918]: I0319 17:24:00.416029 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cjz8\" (UniqueName: \"kubernetes.io/projected/083dfc61-7262-4798-8446-6539ad70e6c7-kube-api-access-7cjz8\") pod \"auto-csr-approver-29565684-fm8j8\" (UID: \"083dfc61-7262-4798-8446-6539ad70e6c7\") " pod="openshift-infra/auto-csr-approver-29565684-fm8j8" Mar 19 17:24:00 crc kubenswrapper[4918]: I0319 17:24:00.434173 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cjz8\" (UniqueName: \"kubernetes.io/projected/083dfc61-7262-4798-8446-6539ad70e6c7-kube-api-access-7cjz8\") pod \"auto-csr-approver-29565684-fm8j8\" (UID: \"083dfc61-7262-4798-8446-6539ad70e6c7\") " 
pod="openshift-infra/auto-csr-approver-29565684-fm8j8" Mar 19 17:24:00 crc kubenswrapper[4918]: I0319 17:24:00.480281 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565684-fm8j8" Mar 19 17:24:00 crc kubenswrapper[4918]: I0319 17:24:00.912004 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565684-fm8j8"] Mar 19 17:24:00 crc kubenswrapper[4918]: W0319 17:24:00.920632 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod083dfc61_7262_4798_8446_6539ad70e6c7.slice/crio-26c053e73638c7c6b941ee166a8a75ecb9faa825502d412b3381569a8cfa1776 WatchSource:0}: Error finding container 26c053e73638c7c6b941ee166a8a75ecb9faa825502d412b3381569a8cfa1776: Status 404 returned error can't find the container with id 26c053e73638c7c6b941ee166a8a75ecb9faa825502d412b3381569a8cfa1776 Mar 19 17:24:01 crc kubenswrapper[4918]: I0319 17:24:01.022862 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565684-fm8j8" event={"ID":"083dfc61-7262-4798-8446-6539ad70e6c7","Type":"ContainerStarted","Data":"26c053e73638c7c6b941ee166a8a75ecb9faa825502d412b3381569a8cfa1776"} Mar 19 17:24:03 crc kubenswrapper[4918]: I0319 17:24:03.086605 4918 generic.go:334] "Generic (PLEG): container finished" podID="083dfc61-7262-4798-8446-6539ad70e6c7" containerID="3a36dde2dafa342ff58e60b15d7a78f74b20e7fc485b991ead124dda1d032a2c" exitCode=0 Mar 19 17:24:03 crc kubenswrapper[4918]: I0319 17:24:03.086714 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565684-fm8j8" event={"ID":"083dfc61-7262-4798-8446-6539ad70e6c7","Type":"ContainerDied","Data":"3a36dde2dafa342ff58e60b15d7a78f74b20e7fc485b991ead124dda1d032a2c"} Mar 19 17:24:04 crc kubenswrapper[4918]: I0319 17:24:04.568332 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565684-fm8j8" Mar 19 17:24:04 crc kubenswrapper[4918]: I0319 17:24:04.723674 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cjz8\" (UniqueName: \"kubernetes.io/projected/083dfc61-7262-4798-8446-6539ad70e6c7-kube-api-access-7cjz8\") pod \"083dfc61-7262-4798-8446-6539ad70e6c7\" (UID: \"083dfc61-7262-4798-8446-6539ad70e6c7\") " Mar 19 17:24:04 crc kubenswrapper[4918]: I0319 17:24:04.730079 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/083dfc61-7262-4798-8446-6539ad70e6c7-kube-api-access-7cjz8" (OuterVolumeSpecName: "kube-api-access-7cjz8") pod "083dfc61-7262-4798-8446-6539ad70e6c7" (UID: "083dfc61-7262-4798-8446-6539ad70e6c7"). InnerVolumeSpecName "kube-api-access-7cjz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:24:04 crc kubenswrapper[4918]: I0319 17:24:04.826974 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cjz8\" (UniqueName: \"kubernetes.io/projected/083dfc61-7262-4798-8446-6539ad70e6c7-kube-api-access-7cjz8\") on node \"crc\" DevicePath \"\"" Mar 19 17:24:05 crc kubenswrapper[4918]: I0319 17:24:05.129270 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565684-fm8j8" event={"ID":"083dfc61-7262-4798-8446-6539ad70e6c7","Type":"ContainerDied","Data":"26c053e73638c7c6b941ee166a8a75ecb9faa825502d412b3381569a8cfa1776"} Mar 19 17:24:05 crc kubenswrapper[4918]: I0319 17:24:05.129511 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26c053e73638c7c6b941ee166a8a75ecb9faa825502d412b3381569a8cfa1776" Mar 19 17:24:05 crc kubenswrapper[4918]: I0319 17:24:05.129360 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565684-fm8j8" Mar 19 17:24:05 crc kubenswrapper[4918]: I0319 17:24:05.665386 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565678-zrq4r"] Mar 19 17:24:05 crc kubenswrapper[4918]: I0319 17:24:05.678923 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565678-zrq4r"] Mar 19 17:24:06 crc kubenswrapper[4918]: I0319 17:24:06.606883 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19e546ea-cea1-49e7-b212-797cd0679cd0" path="/var/lib/kubelet/pods/19e546ea-cea1-49e7-b212-797cd0679cd0/volumes" Mar 19 17:24:08 crc kubenswrapper[4918]: E0319 17:24:08.497145 4918 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode21d159f_fdc3_48bf_b40b_5bda64316b5e.slice/crio-0a119c23924a731c2cfc52791a81ae7be39ca2d90d6d902bd4f272a9ffb2e592.scope\": RecentStats: unable to find data in memory cache]" Mar 19 17:24:09 crc kubenswrapper[4918]: I0319 17:24:09.178515 4918 generic.go:334] "Generic (PLEG): container finished" podID="e21d159f-fdc3-48bf-b40b-5bda64316b5e" containerID="0a119c23924a731c2cfc52791a81ae7be39ca2d90d6d902bd4f272a9ffb2e592" exitCode=0 Mar 19 17:24:09 crc kubenswrapper[4918]: I0319 17:24:09.178588 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" event={"ID":"e21d159f-fdc3-48bf-b40b-5bda64316b5e","Type":"ContainerDied","Data":"0a119c23924a731c2cfc52791a81ae7be39ca2d90d6d902bd4f272a9ffb2e592"} Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.702589 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.862561 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-combined-ca-bundle\") pod \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.862605 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-cell1-compute-config-2\") pod \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.863245 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-cell1-compute-config-1\") pod \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.863364 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-inventory\") pod \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.863490 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-ssh-key-openstack-edpm-ipam\") pod \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.863567 4918 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndlm4\" (UniqueName: \"kubernetes.io/projected/e21d159f-fdc3-48bf-b40b-5bda64316b5e-kube-api-access-ndlm4\") pod \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.863640 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-extra-config-0\") pod \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.863686 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-cell1-compute-config-0\") pod \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.863757 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-migration-ssh-key-1\") pod \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.863894 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-cell1-compute-config-3\") pod \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.864041 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-migration-ssh-key-0\") pod \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\" (UID: \"e21d159f-fdc3-48bf-b40b-5bda64316b5e\") " Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.868782 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "e21d159f-fdc3-48bf-b40b-5bda64316b5e" (UID: "e21d159f-fdc3-48bf-b40b-5bda64316b5e"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.873766 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e21d159f-fdc3-48bf-b40b-5bda64316b5e-kube-api-access-ndlm4" (OuterVolumeSpecName: "kube-api-access-ndlm4") pod "e21d159f-fdc3-48bf-b40b-5bda64316b5e" (UID: "e21d159f-fdc3-48bf-b40b-5bda64316b5e"). InnerVolumeSpecName "kube-api-access-ndlm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.891840 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "e21d159f-fdc3-48bf-b40b-5bda64316b5e" (UID: "e21d159f-fdc3-48bf-b40b-5bda64316b5e"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.895810 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e21d159f-fdc3-48bf-b40b-5bda64316b5e" (UID: "e21d159f-fdc3-48bf-b40b-5bda64316b5e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.898432 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "e21d159f-fdc3-48bf-b40b-5bda64316b5e" (UID: "e21d159f-fdc3-48bf-b40b-5bda64316b5e"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.901169 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "e21d159f-fdc3-48bf-b40b-5bda64316b5e" (UID: "e21d159f-fdc3-48bf-b40b-5bda64316b5e"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.906688 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-inventory" (OuterVolumeSpecName: "inventory") pod "e21d159f-fdc3-48bf-b40b-5bda64316b5e" (UID: "e21d159f-fdc3-48bf-b40b-5bda64316b5e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.910319 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "e21d159f-fdc3-48bf-b40b-5bda64316b5e" (UID: "e21d159f-fdc3-48bf-b40b-5bda64316b5e"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.914713 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "e21d159f-fdc3-48bf-b40b-5bda64316b5e" (UID: "e21d159f-fdc3-48bf-b40b-5bda64316b5e"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.915598 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "e21d159f-fdc3-48bf-b40b-5bda64316b5e" (UID: "e21d159f-fdc3-48bf-b40b-5bda64316b5e"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.921973 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "e21d159f-fdc3-48bf-b40b-5bda64316b5e" (UID: "e21d159f-fdc3-48bf-b40b-5bda64316b5e"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.966761 4918 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.966795 4918 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.966809 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndlm4\" (UniqueName: \"kubernetes.io/projected/e21d159f-fdc3-48bf-b40b-5bda64316b5e-kube-api-access-ndlm4\") on node \"crc\" DevicePath \"\"" Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.966822 4918 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.966834 4918 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.966845 4918 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.966856 4918 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-cell1-compute-config-3\") on 
node \"crc\" DevicePath \"\"" Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.966867 4918 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.966877 4918 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.966889 4918 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 19 17:24:10 crc kubenswrapper[4918]: I0319 17:24:10.966901 4918 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e21d159f-fdc3-48bf-b40b-5bda64316b5e-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.201943 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" event={"ID":"e21d159f-fdc3-48bf-b40b-5bda64316b5e","Type":"ContainerDied","Data":"186b31e6a4f69b1c0cc3daae15790e160acc474d00ff04fb491bf4f656f955d4"} Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.201986 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="186b31e6a4f69b1c0cc3daae15790e160acc474d00ff04fb491bf4f656f955d4" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.202039 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cbbl5" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.300574 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc"] Mar 19 17:24:11 crc kubenswrapper[4918]: E0319 17:24:11.301077 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e21d159f-fdc3-48bf-b40b-5bda64316b5e" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.301100 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="e21d159f-fdc3-48bf-b40b-5bda64316b5e" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 19 17:24:11 crc kubenswrapper[4918]: E0319 17:24:11.301122 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="083dfc61-7262-4798-8446-6539ad70e6c7" containerName="oc" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.301132 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="083dfc61-7262-4798-8446-6539ad70e6c7" containerName="oc" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.301384 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="083dfc61-7262-4798-8446-6539ad70e6c7" containerName="oc" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.301418 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="e21d159f-fdc3-48bf-b40b-5bda64316b5e" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.302387 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.308858 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.308857 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4jldg" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.309030 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.309809 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.311736 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.321860 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc"] Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.477225 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mpczc\" (UID: \"7aa831f4-171a-406e-b49f-eb422fb34edc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.477343 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-ssh-key-openstack-edpm-ipam\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-mpczc\" (UID: \"7aa831f4-171a-406e-b49f-eb422fb34edc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.477373 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mpczc\" (UID: \"7aa831f4-171a-406e-b49f-eb422fb34edc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.477443 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mpczc\" (UID: \"7aa831f4-171a-406e-b49f-eb422fb34edc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.477502 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mpczc\" (UID: \"7aa831f4-171a-406e-b49f-eb422fb34edc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.477549 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mpczc\" (UID: \"7aa831f4-171a-406e-b49f-eb422fb34edc\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.477604 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqgsj\" (UniqueName: \"kubernetes.io/projected/7aa831f4-171a-406e-b49f-eb422fb34edc-kube-api-access-nqgsj\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mpczc\" (UID: \"7aa831f4-171a-406e-b49f-eb422fb34edc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.579132 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mpczc\" (UID: \"7aa831f4-171a-406e-b49f-eb422fb34edc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.579392 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mpczc\" (UID: \"7aa831f4-171a-406e-b49f-eb422fb34edc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.579457 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mpczc\" (UID: \"7aa831f4-171a-406e-b49f-eb422fb34edc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.579505 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mpczc\" (UID: \"7aa831f4-171a-406e-b49f-eb422fb34edc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.579544 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mpczc\" (UID: \"7aa831f4-171a-406e-b49f-eb422fb34edc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.579598 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqgsj\" (UniqueName: \"kubernetes.io/projected/7aa831f4-171a-406e-b49f-eb422fb34edc-kube-api-access-nqgsj\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mpczc\" (UID: \"7aa831f4-171a-406e-b49f-eb422fb34edc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.579654 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mpczc\" (UID: \"7aa831f4-171a-406e-b49f-eb422fb34edc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.583855 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mpczc\" (UID: \"7aa831f4-171a-406e-b49f-eb422fb34edc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.585061 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mpczc\" (UID: \"7aa831f4-171a-406e-b49f-eb422fb34edc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.585239 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mpczc\" (UID: \"7aa831f4-171a-406e-b49f-eb422fb34edc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.585972 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mpczc\" (UID: \"7aa831f4-171a-406e-b49f-eb422fb34edc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.587683 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mpczc\" (UID: \"7aa831f4-171a-406e-b49f-eb422fb34edc\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.590204 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mpczc\" (UID: \"7aa831f4-171a-406e-b49f-eb422fb34edc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.597614 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqgsj\" (UniqueName: \"kubernetes.io/projected/7aa831f4-171a-406e-b49f-eb422fb34edc-kube-api-access-nqgsj\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mpczc\" (UID: \"7aa831f4-171a-406e-b49f-eb422fb34edc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" Mar 19 17:24:11 crc kubenswrapper[4918]: I0319 17:24:11.622468 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" Mar 19 17:24:12 crc kubenswrapper[4918]: I0319 17:24:12.166239 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc"] Mar 19 17:24:12 crc kubenswrapper[4918]: W0319 17:24:12.167735 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7aa831f4_171a_406e_b49f_eb422fb34edc.slice/crio-621f9db7c010258cd34f30396fe94a3a01957de100684b7c7f2ca7cb39e5ed7f WatchSource:0}: Error finding container 621f9db7c010258cd34f30396fe94a3a01957de100684b7c7f2ca7cb39e5ed7f: Status 404 returned error can't find the container with id 621f9db7c010258cd34f30396fe94a3a01957de100684b7c7f2ca7cb39e5ed7f Mar 19 17:24:12 crc kubenswrapper[4918]: I0319 17:24:12.213168 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" event={"ID":"7aa831f4-171a-406e-b49f-eb422fb34edc","Type":"ContainerStarted","Data":"621f9db7c010258cd34f30396fe94a3a01957de100684b7c7f2ca7cb39e5ed7f"} Mar 19 17:24:13 crc kubenswrapper[4918]: I0319 17:24:13.226475 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" event={"ID":"7aa831f4-171a-406e-b49f-eb422fb34edc","Type":"ContainerStarted","Data":"5b8378266f2e5e8df26ef6997ca07ab31092fa923997b956834c5413c20fdd22"} Mar 19 17:24:13 crc kubenswrapper[4918]: I0319 17:24:13.249036 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" podStartSLOduration=1.7718471409999998 podStartE2EDuration="2.249018251s" podCreationTimestamp="2026-03-19 17:24:11 +0000 UTC" firstStartedPulling="2026-03-19 17:24:12.170729205 +0000 UTC m=+2664.292928453" lastFinishedPulling="2026-03-19 17:24:12.647900305 +0000 UTC m=+2664.770099563" 
observedRunningTime="2026-03-19 17:24:13.241059044 +0000 UTC m=+2665.363258292" watchObservedRunningTime="2026-03-19 17:24:13.249018251 +0000 UTC m=+2665.371217499" Mar 19 17:24:34 crc kubenswrapper[4918]: I0319 17:24:34.927709 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vz9hn"] Mar 19 17:24:34 crc kubenswrapper[4918]: I0319 17:24:34.936156 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vz9hn" Mar 19 17:24:34 crc kubenswrapper[4918]: I0319 17:24:34.964917 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vz9hn"] Mar 19 17:24:35 crc kubenswrapper[4918]: I0319 17:24:35.036635 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e-catalog-content\") pod \"redhat-marketplace-vz9hn\" (UID: \"01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e\") " pod="openshift-marketplace/redhat-marketplace-vz9hn" Mar 19 17:24:35 crc kubenswrapper[4918]: I0319 17:24:35.036754 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e-utilities\") pod \"redhat-marketplace-vz9hn\" (UID: \"01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e\") " pod="openshift-marketplace/redhat-marketplace-vz9hn" Mar 19 17:24:35 crc kubenswrapper[4918]: I0319 17:24:35.037019 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnnxx\" (UniqueName: \"kubernetes.io/projected/01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e-kube-api-access-dnnxx\") pod \"redhat-marketplace-vz9hn\" (UID: \"01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e\") " pod="openshift-marketplace/redhat-marketplace-vz9hn" Mar 19 17:24:35 crc kubenswrapper[4918]: I0319 
17:24:35.138940 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e-utilities\") pod \"redhat-marketplace-vz9hn\" (UID: \"01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e\") " pod="openshift-marketplace/redhat-marketplace-vz9hn" Mar 19 17:24:35 crc kubenswrapper[4918]: I0319 17:24:35.139135 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnnxx\" (UniqueName: \"kubernetes.io/projected/01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e-kube-api-access-dnnxx\") pod \"redhat-marketplace-vz9hn\" (UID: \"01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e\") " pod="openshift-marketplace/redhat-marketplace-vz9hn" Mar 19 17:24:35 crc kubenswrapper[4918]: I0319 17:24:35.139179 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e-catalog-content\") pod \"redhat-marketplace-vz9hn\" (UID: \"01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e\") " pod="openshift-marketplace/redhat-marketplace-vz9hn" Mar 19 17:24:35 crc kubenswrapper[4918]: I0319 17:24:35.139698 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e-utilities\") pod \"redhat-marketplace-vz9hn\" (UID: \"01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e\") " pod="openshift-marketplace/redhat-marketplace-vz9hn" Mar 19 17:24:35 crc kubenswrapper[4918]: I0319 17:24:35.139826 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e-catalog-content\") pod \"redhat-marketplace-vz9hn\" (UID: \"01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e\") " pod="openshift-marketplace/redhat-marketplace-vz9hn" Mar 19 17:24:35 crc kubenswrapper[4918]: I0319 17:24:35.160451 4918 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dnnxx\" (UniqueName: \"kubernetes.io/projected/01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e-kube-api-access-dnnxx\") pod \"redhat-marketplace-vz9hn\" (UID: \"01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e\") " pod="openshift-marketplace/redhat-marketplace-vz9hn" Mar 19 17:24:35 crc kubenswrapper[4918]: I0319 17:24:35.276543 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vz9hn" Mar 19 17:24:35 crc kubenswrapper[4918]: I0319 17:24:35.762003 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vz9hn"] Mar 19 17:24:35 crc kubenswrapper[4918]: W0319 17:24:35.774625 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01bddafc_3f3a_4ab3_b1fb_80f9e4f4032e.slice/crio-b2687c1fe799e3239998367b09fee290d311d5071842d64f2ffa1f277635d8dd WatchSource:0}: Error finding container b2687c1fe799e3239998367b09fee290d311d5071842d64f2ffa1f277635d8dd: Status 404 returned error can't find the container with id b2687c1fe799e3239998367b09fee290d311d5071842d64f2ffa1f277635d8dd Mar 19 17:24:36 crc kubenswrapper[4918]: I0319 17:24:36.488250 4918 generic.go:334] "Generic (PLEG): container finished" podID="01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e" containerID="99d364127d20dc39b7218b65e1f2dd34a77fa48e13028063852978aa36e7262a" exitCode=0 Mar 19 17:24:36 crc kubenswrapper[4918]: I0319 17:24:36.488304 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vz9hn" event={"ID":"01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e","Type":"ContainerDied","Data":"99d364127d20dc39b7218b65e1f2dd34a77fa48e13028063852978aa36e7262a"} Mar 19 17:24:36 crc kubenswrapper[4918]: I0319 17:24:36.488805 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vz9hn" 
event={"ID":"01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e","Type":"ContainerStarted","Data":"b2687c1fe799e3239998367b09fee290d311d5071842d64f2ffa1f277635d8dd"} Mar 19 17:24:36 crc kubenswrapper[4918]: I0319 17:24:36.490939 4918 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 17:24:38 crc kubenswrapper[4918]: I0319 17:24:38.525859 4918 generic.go:334] "Generic (PLEG): container finished" podID="01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e" containerID="220da9a7219cdcd4e860f045541a58391b26fed5a0337dec304d1532a4057d84" exitCode=0 Mar 19 17:24:38 crc kubenswrapper[4918]: I0319 17:24:38.525950 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vz9hn" event={"ID":"01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e","Type":"ContainerDied","Data":"220da9a7219cdcd4e860f045541a58391b26fed5a0337dec304d1532a4057d84"} Mar 19 17:24:39 crc kubenswrapper[4918]: I0319 17:24:39.537768 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vz9hn" event={"ID":"01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e","Type":"ContainerStarted","Data":"b046f53d776359870e94f04f95ca8e7072c6e042e2383e011971fe8596ac0e7c"} Mar 19 17:24:39 crc kubenswrapper[4918]: I0319 17:24:39.573000 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vz9hn" podStartSLOduration=3.030460068 podStartE2EDuration="5.572981446s" podCreationTimestamp="2026-03-19 17:24:34 +0000 UTC" firstStartedPulling="2026-03-19 17:24:36.490675317 +0000 UTC m=+2688.612874565" lastFinishedPulling="2026-03-19 17:24:39.033196665 +0000 UTC m=+2691.155395943" observedRunningTime="2026-03-19 17:24:39.565276916 +0000 UTC m=+2691.687476164" watchObservedRunningTime="2026-03-19 17:24:39.572981446 +0000 UTC m=+2691.695180694" Mar 19 17:24:45 crc kubenswrapper[4918]: I0319 17:24:45.277024 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-vz9hn" Mar 19 17:24:45 crc kubenswrapper[4918]: I0319 17:24:45.277933 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vz9hn" Mar 19 17:24:45 crc kubenswrapper[4918]: I0319 17:24:45.349104 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vz9hn" Mar 19 17:24:45 crc kubenswrapper[4918]: I0319 17:24:45.646731 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vz9hn" Mar 19 17:24:45 crc kubenswrapper[4918]: I0319 17:24:45.697422 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vz9hn"] Mar 19 17:24:47 crc kubenswrapper[4918]: I0319 17:24:47.627871 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vz9hn" podUID="01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e" containerName="registry-server" containerID="cri-o://b046f53d776359870e94f04f95ca8e7072c6e042e2383e011971fe8596ac0e7c" gracePeriod=2 Mar 19 17:24:48 crc kubenswrapper[4918]: I0319 17:24:48.074626 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vz9hn" Mar 19 17:24:48 crc kubenswrapper[4918]: I0319 17:24:48.216317 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e-catalog-content\") pod \"01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e\" (UID: \"01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e\") " Mar 19 17:24:48 crc kubenswrapper[4918]: I0319 17:24:48.216449 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e-utilities\") pod \"01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e\" (UID: \"01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e\") " Mar 19 17:24:48 crc kubenswrapper[4918]: I0319 17:24:48.216559 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnnxx\" (UniqueName: \"kubernetes.io/projected/01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e-kube-api-access-dnnxx\") pod \"01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e\" (UID: \"01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e\") " Mar 19 17:24:48 crc kubenswrapper[4918]: I0319 17:24:48.218596 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e-utilities" (OuterVolumeSpecName: "utilities") pod "01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e" (UID: "01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:24:48 crc kubenswrapper[4918]: I0319 17:24:48.223019 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e-kube-api-access-dnnxx" (OuterVolumeSpecName: "kube-api-access-dnnxx") pod "01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e" (UID: "01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e"). InnerVolumeSpecName "kube-api-access-dnnxx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:24:48 crc kubenswrapper[4918]: I0319 17:24:48.319277 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 17:24:48 crc kubenswrapper[4918]: I0319 17:24:48.319482 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnnxx\" (UniqueName: \"kubernetes.io/projected/01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e-kube-api-access-dnnxx\") on node \"crc\" DevicePath \"\"" Mar 19 17:24:48 crc kubenswrapper[4918]: I0319 17:24:48.363972 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e" (UID: "01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:24:48 crc kubenswrapper[4918]: I0319 17:24:48.422443 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 17:24:48 crc kubenswrapper[4918]: I0319 17:24:48.649361 4918 generic.go:334] "Generic (PLEG): container finished" podID="01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e" containerID="b046f53d776359870e94f04f95ca8e7072c6e042e2383e011971fe8596ac0e7c" exitCode=0 Mar 19 17:24:48 crc kubenswrapper[4918]: I0319 17:24:48.649402 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vz9hn" event={"ID":"01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e","Type":"ContainerDied","Data":"b046f53d776359870e94f04f95ca8e7072c6e042e2383e011971fe8596ac0e7c"} Mar 19 17:24:48 crc kubenswrapper[4918]: I0319 17:24:48.649432 4918 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-vz9hn" event={"ID":"01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e","Type":"ContainerDied","Data":"b2687c1fe799e3239998367b09fee290d311d5071842d64f2ffa1f277635d8dd"} Mar 19 17:24:48 crc kubenswrapper[4918]: I0319 17:24:48.649449 4918 scope.go:117] "RemoveContainer" containerID="b046f53d776359870e94f04f95ca8e7072c6e042e2383e011971fe8596ac0e7c" Mar 19 17:24:48 crc kubenswrapper[4918]: I0319 17:24:48.649737 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vz9hn" Mar 19 17:24:48 crc kubenswrapper[4918]: I0319 17:24:48.686756 4918 scope.go:117] "RemoveContainer" containerID="220da9a7219cdcd4e860f045541a58391b26fed5a0337dec304d1532a4057d84" Mar 19 17:24:48 crc kubenswrapper[4918]: I0319 17:24:48.691158 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vz9hn"] Mar 19 17:24:48 crc kubenswrapper[4918]: I0319 17:24:48.703244 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vz9hn"] Mar 19 17:24:48 crc kubenswrapper[4918]: I0319 17:24:48.718007 4918 scope.go:117] "RemoveContainer" containerID="99d364127d20dc39b7218b65e1f2dd34a77fa48e13028063852978aa36e7262a" Mar 19 17:24:48 crc kubenswrapper[4918]: I0319 17:24:48.758051 4918 scope.go:117] "RemoveContainer" containerID="b046f53d776359870e94f04f95ca8e7072c6e042e2383e011971fe8596ac0e7c" Mar 19 17:24:48 crc kubenswrapper[4918]: E0319 17:24:48.758621 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b046f53d776359870e94f04f95ca8e7072c6e042e2383e011971fe8596ac0e7c\": container with ID starting with b046f53d776359870e94f04f95ca8e7072c6e042e2383e011971fe8596ac0e7c not found: ID does not exist" containerID="b046f53d776359870e94f04f95ca8e7072c6e042e2383e011971fe8596ac0e7c" Mar 19 17:24:48 crc kubenswrapper[4918]: I0319 17:24:48.758664 4918 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b046f53d776359870e94f04f95ca8e7072c6e042e2383e011971fe8596ac0e7c"} err="failed to get container status \"b046f53d776359870e94f04f95ca8e7072c6e042e2383e011971fe8596ac0e7c\": rpc error: code = NotFound desc = could not find container \"b046f53d776359870e94f04f95ca8e7072c6e042e2383e011971fe8596ac0e7c\": container with ID starting with b046f53d776359870e94f04f95ca8e7072c6e042e2383e011971fe8596ac0e7c not found: ID does not exist" Mar 19 17:24:48 crc kubenswrapper[4918]: I0319 17:24:48.758692 4918 scope.go:117] "RemoveContainer" containerID="220da9a7219cdcd4e860f045541a58391b26fed5a0337dec304d1532a4057d84" Mar 19 17:24:48 crc kubenswrapper[4918]: E0319 17:24:48.759026 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"220da9a7219cdcd4e860f045541a58391b26fed5a0337dec304d1532a4057d84\": container with ID starting with 220da9a7219cdcd4e860f045541a58391b26fed5a0337dec304d1532a4057d84 not found: ID does not exist" containerID="220da9a7219cdcd4e860f045541a58391b26fed5a0337dec304d1532a4057d84" Mar 19 17:24:48 crc kubenswrapper[4918]: I0319 17:24:48.759055 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"220da9a7219cdcd4e860f045541a58391b26fed5a0337dec304d1532a4057d84"} err="failed to get container status \"220da9a7219cdcd4e860f045541a58391b26fed5a0337dec304d1532a4057d84\": rpc error: code = NotFound desc = could not find container \"220da9a7219cdcd4e860f045541a58391b26fed5a0337dec304d1532a4057d84\": container with ID starting with 220da9a7219cdcd4e860f045541a58391b26fed5a0337dec304d1532a4057d84 not found: ID does not exist" Mar 19 17:24:48 crc kubenswrapper[4918]: I0319 17:24:48.759076 4918 scope.go:117] "RemoveContainer" containerID="99d364127d20dc39b7218b65e1f2dd34a77fa48e13028063852978aa36e7262a" Mar 19 17:24:48 crc kubenswrapper[4918]: E0319 
17:24:48.759489 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99d364127d20dc39b7218b65e1f2dd34a77fa48e13028063852978aa36e7262a\": container with ID starting with 99d364127d20dc39b7218b65e1f2dd34a77fa48e13028063852978aa36e7262a not found: ID does not exist" containerID="99d364127d20dc39b7218b65e1f2dd34a77fa48e13028063852978aa36e7262a" Mar 19 17:24:48 crc kubenswrapper[4918]: I0319 17:24:48.759571 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99d364127d20dc39b7218b65e1f2dd34a77fa48e13028063852978aa36e7262a"} err="failed to get container status \"99d364127d20dc39b7218b65e1f2dd34a77fa48e13028063852978aa36e7262a\": rpc error: code = NotFound desc = could not find container \"99d364127d20dc39b7218b65e1f2dd34a77fa48e13028063852978aa36e7262a\": container with ID starting with 99d364127d20dc39b7218b65e1f2dd34a77fa48e13028063852978aa36e7262a not found: ID does not exist" Mar 19 17:24:49 crc kubenswrapper[4918]: E0319 17:24:49.623153 4918 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01bddafc_3f3a_4ab3_b1fb_80f9e4f4032e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01bddafc_3f3a_4ab3_b1fb_80f9e4f4032e.slice/crio-b2687c1fe799e3239998367b09fee290d311d5071842d64f2ffa1f277635d8dd\": RecentStats: unable to find data in memory cache]" Mar 19 17:24:50 crc kubenswrapper[4918]: I0319 17:24:50.600286 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e" path="/var/lib/kubelet/pods/01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e/volumes" Mar 19 17:24:58 crc kubenswrapper[4918]: I0319 17:24:58.212260 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:24:58 crc kubenswrapper[4918]: I0319 17:24:58.212886 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:24:59 crc kubenswrapper[4918]: E0319 17:24:59.893670 4918 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01bddafc_3f3a_4ab3_b1fb_80f9e4f4032e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01bddafc_3f3a_4ab3_b1fb_80f9e4f4032e.slice/crio-b2687c1fe799e3239998367b09fee290d311d5071842d64f2ffa1f277635d8dd\": RecentStats: unable to find data in memory cache]" Mar 19 17:25:05 crc kubenswrapper[4918]: I0319 17:25:05.800888 4918 scope.go:117] "RemoveContainer" containerID="a1f374cdfb67f4ca19a98e2ec4b66533a71d2c39dd3858c747501c78e0683b2e" Mar 19 17:25:10 crc kubenswrapper[4918]: E0319 17:25:10.148065 4918 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01bddafc_3f3a_4ab3_b1fb_80f9e4f4032e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01bddafc_3f3a_4ab3_b1fb_80f9e4f4032e.slice/crio-b2687c1fe799e3239998367b09fee290d311d5071842d64f2ffa1f277635d8dd\": RecentStats: unable to find data in memory cache]" Mar 19 17:25:20 crc kubenswrapper[4918]: E0319 17:25:20.417292 4918 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01bddafc_3f3a_4ab3_b1fb_80f9e4f4032e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01bddafc_3f3a_4ab3_b1fb_80f9e4f4032e.slice/crio-b2687c1fe799e3239998367b09fee290d311d5071842d64f2ffa1f277635d8dd\": RecentStats: unable to find data in memory cache]" Mar 19 17:25:28 crc kubenswrapper[4918]: I0319 17:25:28.211828 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:25:28 crc kubenswrapper[4918]: I0319 17:25:28.212459 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:25:30 crc kubenswrapper[4918]: E0319 17:25:30.695482 4918 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01bddafc_3f3a_4ab3_b1fb_80f9e4f4032e.slice/crio-b2687c1fe799e3239998367b09fee290d311d5071842d64f2ffa1f277635d8dd\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01bddafc_3f3a_4ab3_b1fb_80f9e4f4032e.slice\": RecentStats: unable to find data in memory cache]" Mar 19 17:25:40 crc kubenswrapper[4918]: E0319 17:25:40.978016 4918 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01bddafc_3f3a_4ab3_b1fb_80f9e4f4032e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01bddafc_3f3a_4ab3_b1fb_80f9e4f4032e.slice/crio-b2687c1fe799e3239998367b09fee290d311d5071842d64f2ffa1f277635d8dd\": RecentStats: unable to find data in memory cache]" Mar 19 17:25:58 crc kubenswrapper[4918]: I0319 17:25:58.211786 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:25:58 crc kubenswrapper[4918]: I0319 17:25:58.212368 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:25:58 crc kubenswrapper[4918]: I0319 17:25:58.212424 4918 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 17:25:58 crc kubenswrapper[4918]: I0319 17:25:58.213285 4918 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c98bb004b2c85bfe02cdd8cceb684417f8e23df7690967e86e647f83c8c1c57f"} pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 17:25:58 crc kubenswrapper[4918]: I0319 17:25:58.213345 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" 
podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" containerID="cri-o://c98bb004b2c85bfe02cdd8cceb684417f8e23df7690967e86e647f83c8c1c57f" gracePeriod=600 Mar 19 17:25:58 crc kubenswrapper[4918]: I0319 17:25:58.387595 4918 generic.go:334] "Generic (PLEG): container finished" podID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerID="c98bb004b2c85bfe02cdd8cceb684417f8e23df7690967e86e647f83c8c1c57f" exitCode=0 Mar 19 17:25:58 crc kubenswrapper[4918]: I0319 17:25:58.387637 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerDied","Data":"c98bb004b2c85bfe02cdd8cceb684417f8e23df7690967e86e647f83c8c1c57f"} Mar 19 17:25:58 crc kubenswrapper[4918]: I0319 17:25:58.387668 4918 scope.go:117] "RemoveContainer" containerID="21dd667e32ab340b63c9be4a12c88658c86b73fcad793da59709aed921e44a74" Mar 19 17:25:59 crc kubenswrapper[4918]: I0319 17:25:59.399833 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerStarted","Data":"63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783"} Mar 19 17:26:00 crc kubenswrapper[4918]: I0319 17:26:00.162163 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565686-d66sh"] Mar 19 17:26:00 crc kubenswrapper[4918]: E0319 17:26:00.162660 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e" containerName="extract-content" Mar 19 17:26:00 crc kubenswrapper[4918]: I0319 17:26:00.162682 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e" containerName="extract-content" Mar 19 17:26:00 crc kubenswrapper[4918]: E0319 17:26:00.162711 4918 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e" containerName="extract-utilities" Mar 19 17:26:00 crc kubenswrapper[4918]: I0319 17:26:00.162719 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e" containerName="extract-utilities" Mar 19 17:26:00 crc kubenswrapper[4918]: E0319 17:26:00.162745 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e" containerName="registry-server" Mar 19 17:26:00 crc kubenswrapper[4918]: I0319 17:26:00.162753 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e" containerName="registry-server" Mar 19 17:26:00 crc kubenswrapper[4918]: I0319 17:26:00.163003 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="01bddafc-3f3a-4ab3-b1fb-80f9e4f4032e" containerName="registry-server" Mar 19 17:26:00 crc kubenswrapper[4918]: I0319 17:26:00.163881 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565686-d66sh" Mar 19 17:26:00 crc kubenswrapper[4918]: I0319 17:26:00.166051 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:26:00 crc kubenswrapper[4918]: I0319 17:26:00.166204 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:26:00 crc kubenswrapper[4918]: I0319 17:26:00.166865 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:26:00 crc kubenswrapper[4918]: I0319 17:26:00.178766 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565686-d66sh"] Mar 19 17:26:00 crc kubenswrapper[4918]: I0319 17:26:00.267427 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gntqd\" (UniqueName: 
\"kubernetes.io/projected/28f6939f-8031-432c-8407-277c5d8ff9a8-kube-api-access-gntqd\") pod \"auto-csr-approver-29565686-d66sh\" (UID: \"28f6939f-8031-432c-8407-277c5d8ff9a8\") " pod="openshift-infra/auto-csr-approver-29565686-d66sh" Mar 19 17:26:00 crc kubenswrapper[4918]: I0319 17:26:00.370044 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gntqd\" (UniqueName: \"kubernetes.io/projected/28f6939f-8031-432c-8407-277c5d8ff9a8-kube-api-access-gntqd\") pod \"auto-csr-approver-29565686-d66sh\" (UID: \"28f6939f-8031-432c-8407-277c5d8ff9a8\") " pod="openshift-infra/auto-csr-approver-29565686-d66sh" Mar 19 17:26:00 crc kubenswrapper[4918]: I0319 17:26:00.408188 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gntqd\" (UniqueName: \"kubernetes.io/projected/28f6939f-8031-432c-8407-277c5d8ff9a8-kube-api-access-gntqd\") pod \"auto-csr-approver-29565686-d66sh\" (UID: \"28f6939f-8031-432c-8407-277c5d8ff9a8\") " pod="openshift-infra/auto-csr-approver-29565686-d66sh" Mar 19 17:26:00 crc kubenswrapper[4918]: I0319 17:26:00.502710 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565686-d66sh" Mar 19 17:26:00 crc kubenswrapper[4918]: I0319 17:26:00.984560 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565686-d66sh"] Mar 19 17:26:01 crc kubenswrapper[4918]: I0319 17:26:01.437740 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565686-d66sh" event={"ID":"28f6939f-8031-432c-8407-277c5d8ff9a8","Type":"ContainerStarted","Data":"4af10a5a7ef38db1927dd58994140d5cca25f7d616382f15a6360b662fb3642d"} Mar 19 17:26:02 crc kubenswrapper[4918]: I0319 17:26:02.449631 4918 generic.go:334] "Generic (PLEG): container finished" podID="28f6939f-8031-432c-8407-277c5d8ff9a8" containerID="cc97be739660cb40f7a2f1ea0e6e88d23f8949604149b0e30c9f6f7878a6addf" exitCode=0 Mar 19 17:26:02 crc kubenswrapper[4918]: I0319 17:26:02.449770 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565686-d66sh" event={"ID":"28f6939f-8031-432c-8407-277c5d8ff9a8","Type":"ContainerDied","Data":"cc97be739660cb40f7a2f1ea0e6e88d23f8949604149b0e30c9f6f7878a6addf"} Mar 19 17:26:03 crc kubenswrapper[4918]: I0319 17:26:03.881441 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565686-d66sh" Mar 19 17:26:03 crc kubenswrapper[4918]: I0319 17:26:03.898012 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gntqd\" (UniqueName: \"kubernetes.io/projected/28f6939f-8031-432c-8407-277c5d8ff9a8-kube-api-access-gntqd\") pod \"28f6939f-8031-432c-8407-277c5d8ff9a8\" (UID: \"28f6939f-8031-432c-8407-277c5d8ff9a8\") " Mar 19 17:26:03 crc kubenswrapper[4918]: I0319 17:26:03.907436 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28f6939f-8031-432c-8407-277c5d8ff9a8-kube-api-access-gntqd" (OuterVolumeSpecName: "kube-api-access-gntqd") pod "28f6939f-8031-432c-8407-277c5d8ff9a8" (UID: "28f6939f-8031-432c-8407-277c5d8ff9a8"). InnerVolumeSpecName "kube-api-access-gntqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:26:04 crc kubenswrapper[4918]: I0319 17:26:04.000201 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gntqd\" (UniqueName: \"kubernetes.io/projected/28f6939f-8031-432c-8407-277c5d8ff9a8-kube-api-access-gntqd\") on node \"crc\" DevicePath \"\"" Mar 19 17:26:04 crc kubenswrapper[4918]: I0319 17:26:04.475222 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565686-d66sh" event={"ID":"28f6939f-8031-432c-8407-277c5d8ff9a8","Type":"ContainerDied","Data":"4af10a5a7ef38db1927dd58994140d5cca25f7d616382f15a6360b662fb3642d"} Mar 19 17:26:04 crc kubenswrapper[4918]: I0319 17:26:04.475620 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4af10a5a7ef38db1927dd58994140d5cca25f7d616382f15a6360b662fb3642d" Mar 19 17:26:04 crc kubenswrapper[4918]: I0319 17:26:04.475400 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565686-d66sh" Mar 19 17:26:04 crc kubenswrapper[4918]: I0319 17:26:04.968098 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565680-jffxr"] Mar 19 17:26:04 crc kubenswrapper[4918]: I0319 17:26:04.977694 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565680-jffxr"] Mar 19 17:26:06 crc kubenswrapper[4918]: I0319 17:26:06.608628 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="669c77dd-b97a-4886-8706-6cdeab3e29d6" path="/var/lib/kubelet/pods/669c77dd-b97a-4886-8706-6cdeab3e29d6/volumes" Mar 19 17:26:12 crc kubenswrapper[4918]: I0319 17:26:12.691431 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rqgjg"] Mar 19 17:26:12 crc kubenswrapper[4918]: E0319 17:26:12.694560 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28f6939f-8031-432c-8407-277c5d8ff9a8" containerName="oc" Mar 19 17:26:12 crc kubenswrapper[4918]: I0319 17:26:12.694576 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="28f6939f-8031-432c-8407-277c5d8ff9a8" containerName="oc" Mar 19 17:26:12 crc kubenswrapper[4918]: I0319 17:26:12.694785 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="28f6939f-8031-432c-8407-277c5d8ff9a8" containerName="oc" Mar 19 17:26:12 crc kubenswrapper[4918]: I0319 17:26:12.696523 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rqgjg" Mar 19 17:26:12 crc kubenswrapper[4918]: I0319 17:26:12.720277 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rqgjg"] Mar 19 17:26:12 crc kubenswrapper[4918]: I0319 17:26:12.795116 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/822b6e34-51a2-44ff-a270-5199d2e052f7-catalog-content\") pod \"certified-operators-rqgjg\" (UID: \"822b6e34-51a2-44ff-a270-5199d2e052f7\") " pod="openshift-marketplace/certified-operators-rqgjg" Mar 19 17:26:12 crc kubenswrapper[4918]: I0319 17:26:12.795174 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd6b2\" (UniqueName: \"kubernetes.io/projected/822b6e34-51a2-44ff-a270-5199d2e052f7-kube-api-access-zd6b2\") pod \"certified-operators-rqgjg\" (UID: \"822b6e34-51a2-44ff-a270-5199d2e052f7\") " pod="openshift-marketplace/certified-operators-rqgjg" Mar 19 17:26:12 crc kubenswrapper[4918]: I0319 17:26:12.795674 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/822b6e34-51a2-44ff-a270-5199d2e052f7-utilities\") pod \"certified-operators-rqgjg\" (UID: \"822b6e34-51a2-44ff-a270-5199d2e052f7\") " pod="openshift-marketplace/certified-operators-rqgjg" Mar 19 17:26:12 crc kubenswrapper[4918]: I0319 17:26:12.898262 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/822b6e34-51a2-44ff-a270-5199d2e052f7-catalog-content\") pod \"certified-operators-rqgjg\" (UID: \"822b6e34-51a2-44ff-a270-5199d2e052f7\") " pod="openshift-marketplace/certified-operators-rqgjg" Mar 19 17:26:12 crc kubenswrapper[4918]: I0319 17:26:12.898313 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zd6b2\" (UniqueName: \"kubernetes.io/projected/822b6e34-51a2-44ff-a270-5199d2e052f7-kube-api-access-zd6b2\") pod \"certified-operators-rqgjg\" (UID: \"822b6e34-51a2-44ff-a270-5199d2e052f7\") " pod="openshift-marketplace/certified-operators-rqgjg" Mar 19 17:26:12 crc kubenswrapper[4918]: I0319 17:26:12.898467 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/822b6e34-51a2-44ff-a270-5199d2e052f7-utilities\") pod \"certified-operators-rqgjg\" (UID: \"822b6e34-51a2-44ff-a270-5199d2e052f7\") " pod="openshift-marketplace/certified-operators-rqgjg" Mar 19 17:26:12 crc kubenswrapper[4918]: I0319 17:26:12.898773 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/822b6e34-51a2-44ff-a270-5199d2e052f7-catalog-content\") pod \"certified-operators-rqgjg\" (UID: \"822b6e34-51a2-44ff-a270-5199d2e052f7\") " pod="openshift-marketplace/certified-operators-rqgjg" Mar 19 17:26:12 crc kubenswrapper[4918]: I0319 17:26:12.899276 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/822b6e34-51a2-44ff-a270-5199d2e052f7-utilities\") pod \"certified-operators-rqgjg\" (UID: \"822b6e34-51a2-44ff-a270-5199d2e052f7\") " pod="openshift-marketplace/certified-operators-rqgjg" Mar 19 17:26:12 crc kubenswrapper[4918]: I0319 17:26:12.920250 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd6b2\" (UniqueName: \"kubernetes.io/projected/822b6e34-51a2-44ff-a270-5199d2e052f7-kube-api-access-zd6b2\") pod \"certified-operators-rqgjg\" (UID: \"822b6e34-51a2-44ff-a270-5199d2e052f7\") " pod="openshift-marketplace/certified-operators-rqgjg" Mar 19 17:26:13 crc kubenswrapper[4918]: I0319 17:26:13.038301 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rqgjg" Mar 19 17:26:13 crc kubenswrapper[4918]: I0319 17:26:13.556163 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rqgjg"] Mar 19 17:26:13 crc kubenswrapper[4918]: I0319 17:26:13.575412 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqgjg" event={"ID":"822b6e34-51a2-44ff-a270-5199d2e052f7","Type":"ContainerStarted","Data":"f1c06e233497e87b4d4ae8b7705bf97fbdab3fd665290a7b492abdcb64c66568"} Mar 19 17:26:14 crc kubenswrapper[4918]: I0319 17:26:14.591893 4918 generic.go:334] "Generic (PLEG): container finished" podID="822b6e34-51a2-44ff-a270-5199d2e052f7" containerID="fdb8eb9c0f79428416684ca23ae3947aaef4602a81e47cef59b86667ddbea494" exitCode=0 Mar 19 17:26:14 crc kubenswrapper[4918]: I0319 17:26:14.610373 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqgjg" event={"ID":"822b6e34-51a2-44ff-a270-5199d2e052f7","Type":"ContainerDied","Data":"fdb8eb9c0f79428416684ca23ae3947aaef4602a81e47cef59b86667ddbea494"} Mar 19 17:26:15 crc kubenswrapper[4918]: I0319 17:26:15.606652 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqgjg" event={"ID":"822b6e34-51a2-44ff-a270-5199d2e052f7","Type":"ContainerStarted","Data":"75d217b894974a4afb44a9bc678f272c7d24570439e7b8a552564aca55cb020c"} Mar 19 17:26:17 crc kubenswrapper[4918]: I0319 17:26:17.636826 4918 generic.go:334] "Generic (PLEG): container finished" podID="822b6e34-51a2-44ff-a270-5199d2e052f7" containerID="75d217b894974a4afb44a9bc678f272c7d24570439e7b8a552564aca55cb020c" exitCode=0 Mar 19 17:26:17 crc kubenswrapper[4918]: I0319 17:26:17.637015 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqgjg" 
event={"ID":"822b6e34-51a2-44ff-a270-5199d2e052f7","Type":"ContainerDied","Data":"75d217b894974a4afb44a9bc678f272c7d24570439e7b8a552564aca55cb020c"} Mar 19 17:26:18 crc kubenswrapper[4918]: I0319 17:26:18.649254 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqgjg" event={"ID":"822b6e34-51a2-44ff-a270-5199d2e052f7","Type":"ContainerStarted","Data":"90e4d96f54b9dd67399e6f743f0108bd62f42803bdd0186d195f1d604fa5d13b"} Mar 19 17:26:18 crc kubenswrapper[4918]: I0319 17:26:18.669757 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rqgjg" podStartSLOduration=3.188705721 podStartE2EDuration="6.66973904s" podCreationTimestamp="2026-03-19 17:26:12 +0000 UTC" firstStartedPulling="2026-03-19 17:26:14.595803218 +0000 UTC m=+2786.718002486" lastFinishedPulling="2026-03-19 17:26:18.076836527 +0000 UTC m=+2790.199035805" observedRunningTime="2026-03-19 17:26:18.668195038 +0000 UTC m=+2790.790394286" watchObservedRunningTime="2026-03-19 17:26:18.66973904 +0000 UTC m=+2790.791938288" Mar 19 17:26:23 crc kubenswrapper[4918]: I0319 17:26:23.039630 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rqgjg" Mar 19 17:26:23 crc kubenswrapper[4918]: I0319 17:26:23.040296 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rqgjg" Mar 19 17:26:23 crc kubenswrapper[4918]: I0319 17:26:23.096274 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rqgjg" Mar 19 17:26:23 crc kubenswrapper[4918]: I0319 17:26:23.781949 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rqgjg" Mar 19 17:26:23 crc kubenswrapper[4918]: I0319 17:26:23.864325 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-rqgjg"] Mar 19 17:26:25 crc kubenswrapper[4918]: I0319 17:26:25.725230 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rqgjg" podUID="822b6e34-51a2-44ff-a270-5199d2e052f7" containerName="registry-server" containerID="cri-o://90e4d96f54b9dd67399e6f743f0108bd62f42803bdd0186d195f1d604fa5d13b" gracePeriod=2 Mar 19 17:26:26 crc kubenswrapper[4918]: I0319 17:26:26.293436 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rqgjg" Mar 19 17:26:26 crc kubenswrapper[4918]: I0319 17:26:26.397381 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd6b2\" (UniqueName: \"kubernetes.io/projected/822b6e34-51a2-44ff-a270-5199d2e052f7-kube-api-access-zd6b2\") pod \"822b6e34-51a2-44ff-a270-5199d2e052f7\" (UID: \"822b6e34-51a2-44ff-a270-5199d2e052f7\") " Mar 19 17:26:26 crc kubenswrapper[4918]: I0319 17:26:26.397421 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/822b6e34-51a2-44ff-a270-5199d2e052f7-utilities\") pod \"822b6e34-51a2-44ff-a270-5199d2e052f7\" (UID: \"822b6e34-51a2-44ff-a270-5199d2e052f7\") " Mar 19 17:26:26 crc kubenswrapper[4918]: I0319 17:26:26.397578 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/822b6e34-51a2-44ff-a270-5199d2e052f7-catalog-content\") pod \"822b6e34-51a2-44ff-a270-5199d2e052f7\" (UID: \"822b6e34-51a2-44ff-a270-5199d2e052f7\") " Mar 19 17:26:26 crc kubenswrapper[4918]: I0319 17:26:26.398596 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/822b6e34-51a2-44ff-a270-5199d2e052f7-utilities" (OuterVolumeSpecName: "utilities") pod "822b6e34-51a2-44ff-a270-5199d2e052f7" (UID: 
"822b6e34-51a2-44ff-a270-5199d2e052f7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:26:26 crc kubenswrapper[4918]: I0319 17:26:26.405902 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/822b6e34-51a2-44ff-a270-5199d2e052f7-kube-api-access-zd6b2" (OuterVolumeSpecName: "kube-api-access-zd6b2") pod "822b6e34-51a2-44ff-a270-5199d2e052f7" (UID: "822b6e34-51a2-44ff-a270-5199d2e052f7"). InnerVolumeSpecName "kube-api-access-zd6b2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:26:26 crc kubenswrapper[4918]: I0319 17:26:26.462139 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/822b6e34-51a2-44ff-a270-5199d2e052f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "822b6e34-51a2-44ff-a270-5199d2e052f7" (UID: "822b6e34-51a2-44ff-a270-5199d2e052f7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:26:26 crc kubenswrapper[4918]: I0319 17:26:26.502125 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd6b2\" (UniqueName: \"kubernetes.io/projected/822b6e34-51a2-44ff-a270-5199d2e052f7-kube-api-access-zd6b2\") on node \"crc\" DevicePath \"\"" Mar 19 17:26:26 crc kubenswrapper[4918]: I0319 17:26:26.502181 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/822b6e34-51a2-44ff-a270-5199d2e052f7-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 17:26:26 crc kubenswrapper[4918]: I0319 17:26:26.502196 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/822b6e34-51a2-44ff-a270-5199d2e052f7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 17:26:26 crc kubenswrapper[4918]: I0319 17:26:26.744171 4918 generic.go:334] "Generic (PLEG): container finished" 
podID="822b6e34-51a2-44ff-a270-5199d2e052f7" containerID="90e4d96f54b9dd67399e6f743f0108bd62f42803bdd0186d195f1d604fa5d13b" exitCode=0 Mar 19 17:26:26 crc kubenswrapper[4918]: I0319 17:26:26.744209 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqgjg" event={"ID":"822b6e34-51a2-44ff-a270-5199d2e052f7","Type":"ContainerDied","Data":"90e4d96f54b9dd67399e6f743f0108bd62f42803bdd0186d195f1d604fa5d13b"} Mar 19 17:26:26 crc kubenswrapper[4918]: I0319 17:26:26.744236 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqgjg" event={"ID":"822b6e34-51a2-44ff-a270-5199d2e052f7","Type":"ContainerDied","Data":"f1c06e233497e87b4d4ae8b7705bf97fbdab3fd665290a7b492abdcb64c66568"} Mar 19 17:26:26 crc kubenswrapper[4918]: I0319 17:26:26.744252 4918 scope.go:117] "RemoveContainer" containerID="90e4d96f54b9dd67399e6f743f0108bd62f42803bdd0186d195f1d604fa5d13b" Mar 19 17:26:26 crc kubenswrapper[4918]: I0319 17:26:26.744279 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rqgjg" Mar 19 17:26:26 crc kubenswrapper[4918]: I0319 17:26:26.774125 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rqgjg"] Mar 19 17:26:26 crc kubenswrapper[4918]: I0319 17:26:26.779445 4918 scope.go:117] "RemoveContainer" containerID="75d217b894974a4afb44a9bc678f272c7d24570439e7b8a552564aca55cb020c" Mar 19 17:26:26 crc kubenswrapper[4918]: I0319 17:26:26.783659 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rqgjg"] Mar 19 17:26:26 crc kubenswrapper[4918]: I0319 17:26:26.798681 4918 scope.go:117] "RemoveContainer" containerID="fdb8eb9c0f79428416684ca23ae3947aaef4602a81e47cef59b86667ddbea494" Mar 19 17:26:26 crc kubenswrapper[4918]: I0319 17:26:26.842382 4918 scope.go:117] "RemoveContainer" containerID="90e4d96f54b9dd67399e6f743f0108bd62f42803bdd0186d195f1d604fa5d13b" Mar 19 17:26:26 crc kubenswrapper[4918]: E0319 17:26:26.842997 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90e4d96f54b9dd67399e6f743f0108bd62f42803bdd0186d195f1d604fa5d13b\": container with ID starting with 90e4d96f54b9dd67399e6f743f0108bd62f42803bdd0186d195f1d604fa5d13b not found: ID does not exist" containerID="90e4d96f54b9dd67399e6f743f0108bd62f42803bdd0186d195f1d604fa5d13b" Mar 19 17:26:26 crc kubenswrapper[4918]: I0319 17:26:26.843042 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90e4d96f54b9dd67399e6f743f0108bd62f42803bdd0186d195f1d604fa5d13b"} err="failed to get container status \"90e4d96f54b9dd67399e6f743f0108bd62f42803bdd0186d195f1d604fa5d13b\": rpc error: code = NotFound desc = could not find container \"90e4d96f54b9dd67399e6f743f0108bd62f42803bdd0186d195f1d604fa5d13b\": container with ID starting with 90e4d96f54b9dd67399e6f743f0108bd62f42803bdd0186d195f1d604fa5d13b not 
found: ID does not exist" Mar 19 17:26:26 crc kubenswrapper[4918]: I0319 17:26:26.843073 4918 scope.go:117] "RemoveContainer" containerID="75d217b894974a4afb44a9bc678f272c7d24570439e7b8a552564aca55cb020c" Mar 19 17:26:26 crc kubenswrapper[4918]: E0319 17:26:26.843438 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75d217b894974a4afb44a9bc678f272c7d24570439e7b8a552564aca55cb020c\": container with ID starting with 75d217b894974a4afb44a9bc678f272c7d24570439e7b8a552564aca55cb020c not found: ID does not exist" containerID="75d217b894974a4afb44a9bc678f272c7d24570439e7b8a552564aca55cb020c" Mar 19 17:26:26 crc kubenswrapper[4918]: I0319 17:26:26.843487 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75d217b894974a4afb44a9bc678f272c7d24570439e7b8a552564aca55cb020c"} err="failed to get container status \"75d217b894974a4afb44a9bc678f272c7d24570439e7b8a552564aca55cb020c\": rpc error: code = NotFound desc = could not find container \"75d217b894974a4afb44a9bc678f272c7d24570439e7b8a552564aca55cb020c\": container with ID starting with 75d217b894974a4afb44a9bc678f272c7d24570439e7b8a552564aca55cb020c not found: ID does not exist" Mar 19 17:26:26 crc kubenswrapper[4918]: I0319 17:26:26.843547 4918 scope.go:117] "RemoveContainer" containerID="fdb8eb9c0f79428416684ca23ae3947aaef4602a81e47cef59b86667ddbea494" Mar 19 17:26:26 crc kubenswrapper[4918]: E0319 17:26:26.843892 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdb8eb9c0f79428416684ca23ae3947aaef4602a81e47cef59b86667ddbea494\": container with ID starting with fdb8eb9c0f79428416684ca23ae3947aaef4602a81e47cef59b86667ddbea494 not found: ID does not exist" containerID="fdb8eb9c0f79428416684ca23ae3947aaef4602a81e47cef59b86667ddbea494" Mar 19 17:26:26 crc kubenswrapper[4918]: I0319 17:26:26.843918 4918 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdb8eb9c0f79428416684ca23ae3947aaef4602a81e47cef59b86667ddbea494"} err="failed to get container status \"fdb8eb9c0f79428416684ca23ae3947aaef4602a81e47cef59b86667ddbea494\": rpc error: code = NotFound desc = could not find container \"fdb8eb9c0f79428416684ca23ae3947aaef4602a81e47cef59b86667ddbea494\": container with ID starting with fdb8eb9c0f79428416684ca23ae3947aaef4602a81e47cef59b86667ddbea494 not found: ID does not exist" Mar 19 17:26:28 crc kubenswrapper[4918]: I0319 17:26:28.610373 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="822b6e34-51a2-44ff-a270-5199d2e052f7" path="/var/lib/kubelet/pods/822b6e34-51a2-44ff-a270-5199d2e052f7/volumes" Mar 19 17:26:39 crc kubenswrapper[4918]: I0319 17:26:39.905448 4918 generic.go:334] "Generic (PLEG): container finished" podID="7aa831f4-171a-406e-b49f-eb422fb34edc" containerID="5b8378266f2e5e8df26ef6997ca07ab31092fa923997b956834c5413c20fdd22" exitCode=0 Mar 19 17:26:39 crc kubenswrapper[4918]: I0319 17:26:39.905603 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" event={"ID":"7aa831f4-171a-406e-b49f-eb422fb34edc","Type":"ContainerDied","Data":"5b8378266f2e5e8df26ef6997ca07ab31092fa923997b956834c5413c20fdd22"} Mar 19 17:26:41 crc kubenswrapper[4918]: I0319 17:26:41.384971 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" Mar 19 17:26:41 crc kubenswrapper[4918]: I0319 17:26:41.470703 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-ceilometer-compute-config-data-1\") pod \"7aa831f4-171a-406e-b49f-eb422fb34edc\" (UID: \"7aa831f4-171a-406e-b49f-eb422fb34edc\") " Mar 19 17:26:41 crc kubenswrapper[4918]: I0319 17:26:41.470797 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqgsj\" (UniqueName: \"kubernetes.io/projected/7aa831f4-171a-406e-b49f-eb422fb34edc-kube-api-access-nqgsj\") pod \"7aa831f4-171a-406e-b49f-eb422fb34edc\" (UID: \"7aa831f4-171a-406e-b49f-eb422fb34edc\") " Mar 19 17:26:41 crc kubenswrapper[4918]: I0319 17:26:41.470829 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-ssh-key-openstack-edpm-ipam\") pod \"7aa831f4-171a-406e-b49f-eb422fb34edc\" (UID: \"7aa831f4-171a-406e-b49f-eb422fb34edc\") " Mar 19 17:26:41 crc kubenswrapper[4918]: I0319 17:26:41.470867 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-ceilometer-compute-config-data-0\") pod \"7aa831f4-171a-406e-b49f-eb422fb34edc\" (UID: \"7aa831f4-171a-406e-b49f-eb422fb34edc\") " Mar 19 17:26:41 crc kubenswrapper[4918]: I0319 17:26:41.471478 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-ceilometer-compute-config-data-2\") pod \"7aa831f4-171a-406e-b49f-eb422fb34edc\" (UID: \"7aa831f4-171a-406e-b49f-eb422fb34edc\") " 
Mar 19 17:26:41 crc kubenswrapper[4918]: I0319 17:26:41.471563 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-inventory\") pod \"7aa831f4-171a-406e-b49f-eb422fb34edc\" (UID: \"7aa831f4-171a-406e-b49f-eb422fb34edc\") " Mar 19 17:26:41 crc kubenswrapper[4918]: I0319 17:26:41.471687 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-telemetry-combined-ca-bundle\") pod \"7aa831f4-171a-406e-b49f-eb422fb34edc\" (UID: \"7aa831f4-171a-406e-b49f-eb422fb34edc\") " Mar 19 17:26:41 crc kubenswrapper[4918]: I0319 17:26:41.476808 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "7aa831f4-171a-406e-b49f-eb422fb34edc" (UID: "7aa831f4-171a-406e-b49f-eb422fb34edc"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:26:41 crc kubenswrapper[4918]: I0319 17:26:41.477217 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aa831f4-171a-406e-b49f-eb422fb34edc-kube-api-access-nqgsj" (OuterVolumeSpecName: "kube-api-access-nqgsj") pod "7aa831f4-171a-406e-b49f-eb422fb34edc" (UID: "7aa831f4-171a-406e-b49f-eb422fb34edc"). InnerVolumeSpecName "kube-api-access-nqgsj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:26:41 crc kubenswrapper[4918]: I0319 17:26:41.499940 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7aa831f4-171a-406e-b49f-eb422fb34edc" (UID: "7aa831f4-171a-406e-b49f-eb422fb34edc"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:26:41 crc kubenswrapper[4918]: I0319 17:26:41.500335 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "7aa831f4-171a-406e-b49f-eb422fb34edc" (UID: "7aa831f4-171a-406e-b49f-eb422fb34edc"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:26:41 crc kubenswrapper[4918]: I0319 17:26:41.504957 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-inventory" (OuterVolumeSpecName: "inventory") pod "7aa831f4-171a-406e-b49f-eb422fb34edc" (UID: "7aa831f4-171a-406e-b49f-eb422fb34edc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:26:41 crc kubenswrapper[4918]: I0319 17:26:41.506636 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "7aa831f4-171a-406e-b49f-eb422fb34edc" (UID: "7aa831f4-171a-406e-b49f-eb422fb34edc"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:26:41 crc kubenswrapper[4918]: I0319 17:26:41.515214 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "7aa831f4-171a-406e-b49f-eb422fb34edc" (UID: "7aa831f4-171a-406e-b49f-eb422fb34edc"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:26:41 crc kubenswrapper[4918]: I0319 17:26:41.574031 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqgsj\" (UniqueName: \"kubernetes.io/projected/7aa831f4-171a-406e-b49f-eb422fb34edc-kube-api-access-nqgsj\") on node \"crc\" DevicePath \"\"" Mar 19 17:26:41 crc kubenswrapper[4918]: I0319 17:26:41.574067 4918 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 17:26:41 crc kubenswrapper[4918]: I0319 17:26:41.574077 4918 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 19 17:26:41 crc kubenswrapper[4918]: I0319 17:26:41.574087 4918 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 19 17:26:41 crc kubenswrapper[4918]: I0319 17:26:41.574100 4918 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 17:26:41 crc 
kubenswrapper[4918]: I0319 17:26:41.574111 4918 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 17:26:41 crc kubenswrapper[4918]: I0319 17:26:41.574122 4918 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7aa831f4-171a-406e-b49f-eb422fb34edc-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 19 17:26:41 crc kubenswrapper[4918]: I0319 17:26:41.928803 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" event={"ID":"7aa831f4-171a-406e-b49f-eb422fb34edc","Type":"ContainerDied","Data":"621f9db7c010258cd34f30396fe94a3a01957de100684b7c7f2ca7cb39e5ed7f"} Mar 19 17:26:41 crc kubenswrapper[4918]: I0319 17:26:41.928861 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="621f9db7c010258cd34f30396fe94a3a01957de100684b7c7f2ca7cb39e5ed7f" Mar 19 17:26:41 crc kubenswrapper[4918]: I0319 17:26:41.928882 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mpczc" Mar 19 17:27:05 crc kubenswrapper[4918]: I0319 17:27:05.948836 4918 scope.go:117] "RemoveContainer" containerID="89922f68d6efe6770f0c523005156cc36a073de43077f58c0c9987a8e0246a7b" Mar 19 17:28:00 crc kubenswrapper[4918]: I0319 17:28:00.153421 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565688-nwq7n"] Mar 19 17:28:00 crc kubenswrapper[4918]: E0319 17:28:00.154486 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822b6e34-51a2-44ff-a270-5199d2e052f7" containerName="extract-utilities" Mar 19 17:28:00 crc kubenswrapper[4918]: I0319 17:28:00.154503 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="822b6e34-51a2-44ff-a270-5199d2e052f7" containerName="extract-utilities" Mar 19 17:28:00 crc kubenswrapper[4918]: E0319 17:28:00.154571 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa831f4-171a-406e-b49f-eb422fb34edc" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 19 17:28:00 crc kubenswrapper[4918]: I0319 17:28:00.154583 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa831f4-171a-406e-b49f-eb422fb34edc" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 19 17:28:00 crc kubenswrapper[4918]: E0319 17:28:00.154606 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822b6e34-51a2-44ff-a270-5199d2e052f7" containerName="extract-content" Mar 19 17:28:00 crc kubenswrapper[4918]: I0319 17:28:00.154616 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="822b6e34-51a2-44ff-a270-5199d2e052f7" containerName="extract-content" Mar 19 17:28:00 crc kubenswrapper[4918]: E0319 17:28:00.154636 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822b6e34-51a2-44ff-a270-5199d2e052f7" containerName="registry-server" Mar 19 17:28:00 crc kubenswrapper[4918]: I0319 17:28:00.154643 4918 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="822b6e34-51a2-44ff-a270-5199d2e052f7" containerName="registry-server" Mar 19 17:28:00 crc kubenswrapper[4918]: I0319 17:28:00.154871 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aa831f4-171a-406e-b49f-eb422fb34edc" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 19 17:28:00 crc kubenswrapper[4918]: I0319 17:28:00.154906 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="822b6e34-51a2-44ff-a270-5199d2e052f7" containerName="registry-server" Mar 19 17:28:00 crc kubenswrapper[4918]: I0319 17:28:00.155907 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565688-nwq7n" Mar 19 17:28:00 crc kubenswrapper[4918]: I0319 17:28:00.160686 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:28:00 crc kubenswrapper[4918]: I0319 17:28:00.161167 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:28:00 crc kubenswrapper[4918]: I0319 17:28:00.161501 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:28:00 crc kubenswrapper[4918]: I0319 17:28:00.169859 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565688-nwq7n"] Mar 19 17:28:00 crc kubenswrapper[4918]: I0319 17:28:00.259394 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p6x4\" (UniqueName: \"kubernetes.io/projected/6a44fd43-7d87-4a43-af64-a1395c883e33-kube-api-access-4p6x4\") pod \"auto-csr-approver-29565688-nwq7n\" (UID: \"6a44fd43-7d87-4a43-af64-a1395c883e33\") " pod="openshift-infra/auto-csr-approver-29565688-nwq7n" Mar 19 17:28:00 crc kubenswrapper[4918]: I0319 17:28:00.361659 4918 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-4p6x4\" (UniqueName: \"kubernetes.io/projected/6a44fd43-7d87-4a43-af64-a1395c883e33-kube-api-access-4p6x4\") pod \"auto-csr-approver-29565688-nwq7n\" (UID: \"6a44fd43-7d87-4a43-af64-a1395c883e33\") " pod="openshift-infra/auto-csr-approver-29565688-nwq7n" Mar 19 17:28:00 crc kubenswrapper[4918]: I0319 17:28:00.395317 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p6x4\" (UniqueName: \"kubernetes.io/projected/6a44fd43-7d87-4a43-af64-a1395c883e33-kube-api-access-4p6x4\") pod \"auto-csr-approver-29565688-nwq7n\" (UID: \"6a44fd43-7d87-4a43-af64-a1395c883e33\") " pod="openshift-infra/auto-csr-approver-29565688-nwq7n" Mar 19 17:28:00 crc kubenswrapper[4918]: I0319 17:28:00.474399 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565688-nwq7n" Mar 19 17:28:00 crc kubenswrapper[4918]: I0319 17:28:00.972699 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565688-nwq7n"] Mar 19 17:28:01 crc kubenswrapper[4918]: I0319 17:28:01.053820 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565688-nwq7n" event={"ID":"6a44fd43-7d87-4a43-af64-a1395c883e33","Type":"ContainerStarted","Data":"f00b85648e8618701632559a5d55055332778e0891bdffe0883ce013d9eb28e2"} Mar 19 17:28:06 crc kubenswrapper[4918]: I0319 17:28:06.114818 4918 generic.go:334] "Generic (PLEG): container finished" podID="6a44fd43-7d87-4a43-af64-a1395c883e33" containerID="1c545dd37513c59a679492fcafdb48210519e043e6f091078003aa42d906f76e" exitCode=0 Mar 19 17:28:06 crc kubenswrapper[4918]: I0319 17:28:06.114923 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565688-nwq7n" event={"ID":"6a44fd43-7d87-4a43-af64-a1395c883e33","Type":"ContainerDied","Data":"1c545dd37513c59a679492fcafdb48210519e043e6f091078003aa42d906f76e"} Mar 19 17:28:07 crc 
kubenswrapper[4918]: I0319 17:28:07.528241 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565688-nwq7n" Mar 19 17:28:07 crc kubenswrapper[4918]: I0319 17:28:07.531941 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p6x4\" (UniqueName: \"kubernetes.io/projected/6a44fd43-7d87-4a43-af64-a1395c883e33-kube-api-access-4p6x4\") pod \"6a44fd43-7d87-4a43-af64-a1395c883e33\" (UID: \"6a44fd43-7d87-4a43-af64-a1395c883e33\") " Mar 19 17:28:07 crc kubenswrapper[4918]: I0319 17:28:07.538488 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a44fd43-7d87-4a43-af64-a1395c883e33-kube-api-access-4p6x4" (OuterVolumeSpecName: "kube-api-access-4p6x4") pod "6a44fd43-7d87-4a43-af64-a1395c883e33" (UID: "6a44fd43-7d87-4a43-af64-a1395c883e33"). InnerVolumeSpecName "kube-api-access-4p6x4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:28:07 crc kubenswrapper[4918]: I0319 17:28:07.634850 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p6x4\" (UniqueName: \"kubernetes.io/projected/6a44fd43-7d87-4a43-af64-a1395c883e33-kube-api-access-4p6x4\") on node \"crc\" DevicePath \"\"" Mar 19 17:28:08 crc kubenswrapper[4918]: I0319 17:28:08.137997 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565688-nwq7n" event={"ID":"6a44fd43-7d87-4a43-af64-a1395c883e33","Type":"ContainerDied","Data":"f00b85648e8618701632559a5d55055332778e0891bdffe0883ce013d9eb28e2"} Mar 19 17:28:08 crc kubenswrapper[4918]: I0319 17:28:08.138246 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f00b85648e8618701632559a5d55055332778e0891bdffe0883ce013d9eb28e2" Mar 19 17:28:08 crc kubenswrapper[4918]: I0319 17:28:08.138111 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565688-nwq7n" Mar 19 17:28:08 crc kubenswrapper[4918]: I0319 17:28:08.629148 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565682-mv97p"] Mar 19 17:28:08 crc kubenswrapper[4918]: I0319 17:28:08.637305 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565682-mv97p"] Mar 19 17:28:10 crc kubenswrapper[4918]: I0319 17:28:10.602183 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe05a8c7-498f-468f-9b97-535017a6c66c" path="/var/lib/kubelet/pods/fe05a8c7-498f-468f-9b97-535017a6c66c/volumes" Mar 19 17:28:28 crc kubenswrapper[4918]: I0319 17:28:28.212347 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:28:28 crc kubenswrapper[4918]: I0319 17:28:28.213816 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:28:58 crc kubenswrapper[4918]: I0319 17:28:58.211799 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:28:58 crc kubenswrapper[4918]: I0319 17:28:58.212419 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" 
podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:29:06 crc kubenswrapper[4918]: I0319 17:29:06.092314 4918 scope.go:117] "RemoveContainer" containerID="2faf5aff0f250f6cef4aad2075ef1c275b08984a26fd3ae204422a23708c346f" Mar 19 17:29:28 crc kubenswrapper[4918]: I0319 17:29:28.211892 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:29:28 crc kubenswrapper[4918]: I0319 17:29:28.212595 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:29:28 crc kubenswrapper[4918]: I0319 17:29:28.212662 4918 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 17:29:28 crc kubenswrapper[4918]: I0319 17:29:28.213630 4918 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783"} pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 17:29:28 crc kubenswrapper[4918]: I0319 17:29:28.213705 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" 
podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" containerID="cri-o://63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783" gracePeriod=600 Mar 19 17:29:28 crc kubenswrapper[4918]: E0319 17:29:28.346640 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:29:29 crc kubenswrapper[4918]: I0319 17:29:29.028111 4918 generic.go:334] "Generic (PLEG): container finished" podID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerID="63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783" exitCode=0 Mar 19 17:29:29 crc kubenswrapper[4918]: I0319 17:29:29.028156 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerDied","Data":"63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783"} Mar 19 17:29:29 crc kubenswrapper[4918]: I0319 17:29:29.028195 4918 scope.go:117] "RemoveContainer" containerID="c98bb004b2c85bfe02cdd8cceb684417f8e23df7690967e86e647f83c8c1c57f" Mar 19 17:29:29 crc kubenswrapper[4918]: I0319 17:29:29.028884 4918 scope.go:117] "RemoveContainer" containerID="63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783" Mar 19 17:29:29 crc kubenswrapper[4918]: E0319 17:29:29.029140 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:29:42 crc kubenswrapper[4918]: I0319 17:29:42.586442 4918 scope.go:117] "RemoveContainer" containerID="63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783" Mar 19 17:29:42 crc kubenswrapper[4918]: E0319 17:29:42.587150 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:29:53 crc kubenswrapper[4918]: I0319 17:29:53.588336 4918 scope.go:117] "RemoveContainer" containerID="63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783" Mar 19 17:29:53 crc kubenswrapper[4918]: E0319 17:29:53.589032 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:30:00 crc kubenswrapper[4918]: I0319 17:30:00.177033 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565690-2w68q"] Mar 19 17:30:00 crc kubenswrapper[4918]: E0319 17:30:00.178486 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a44fd43-7d87-4a43-af64-a1395c883e33" containerName="oc" Mar 19 17:30:00 crc kubenswrapper[4918]: I0319 17:30:00.178507 4918 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6a44fd43-7d87-4a43-af64-a1395c883e33" containerName="oc" Mar 19 17:30:00 crc kubenswrapper[4918]: I0319 17:30:00.178938 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a44fd43-7d87-4a43-af64-a1395c883e33" containerName="oc" Mar 19 17:30:00 crc kubenswrapper[4918]: I0319 17:30:00.180216 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565690-2w68q" Mar 19 17:30:00 crc kubenswrapper[4918]: I0319 17:30:00.183005 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:30:00 crc kubenswrapper[4918]: I0319 17:30:00.183694 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:30:00 crc kubenswrapper[4918]: I0319 17:30:00.183792 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:30:00 crc kubenswrapper[4918]: I0319 17:30:00.190761 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565690-z6ght"] Mar 19 17:30:00 crc kubenswrapper[4918]: I0319 17:30:00.193838 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565690-z6ght" Mar 19 17:30:00 crc kubenswrapper[4918]: I0319 17:30:00.196393 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 17:30:00 crc kubenswrapper[4918]: I0319 17:30:00.197125 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 17:30:00 crc kubenswrapper[4918]: I0319 17:30:00.206124 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565690-2w68q"] Mar 19 17:30:00 crc kubenswrapper[4918]: I0319 17:30:00.226615 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565690-z6ght"] Mar 19 17:30:00 crc kubenswrapper[4918]: I0319 17:30:00.308419 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d3ea0181-35f4-4f23-a3cd-0e06013541b8-secret-volume\") pod \"collect-profiles-29565690-z6ght\" (UID: \"d3ea0181-35f4-4f23-a3cd-0e06013541b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565690-z6ght" Mar 19 17:30:00 crc kubenswrapper[4918]: I0319 17:30:00.308703 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2kwc\" (UniqueName: \"kubernetes.io/projected/d626d9fc-ecbb-46ff-93ec-39cc893260a7-kube-api-access-d2kwc\") pod \"auto-csr-approver-29565690-2w68q\" (UID: \"d626d9fc-ecbb-46ff-93ec-39cc893260a7\") " pod="openshift-infra/auto-csr-approver-29565690-2w68q" Mar 19 17:30:00 crc kubenswrapper[4918]: I0319 17:30:00.308829 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/d3ea0181-35f4-4f23-a3cd-0e06013541b8-config-volume\") pod \"collect-profiles-29565690-z6ght\" (UID: \"d3ea0181-35f4-4f23-a3cd-0e06013541b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565690-z6ght" Mar 19 17:30:00 crc kubenswrapper[4918]: I0319 17:30:00.308921 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzx8g\" (UniqueName: \"kubernetes.io/projected/d3ea0181-35f4-4f23-a3cd-0e06013541b8-kube-api-access-rzx8g\") pod \"collect-profiles-29565690-z6ght\" (UID: \"d3ea0181-35f4-4f23-a3cd-0e06013541b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565690-z6ght" Mar 19 17:30:00 crc kubenswrapper[4918]: I0319 17:30:00.411192 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d3ea0181-35f4-4f23-a3cd-0e06013541b8-secret-volume\") pod \"collect-profiles-29565690-z6ght\" (UID: \"d3ea0181-35f4-4f23-a3cd-0e06013541b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565690-z6ght" Mar 19 17:30:00 crc kubenswrapper[4918]: I0319 17:30:00.411546 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2kwc\" (UniqueName: \"kubernetes.io/projected/d626d9fc-ecbb-46ff-93ec-39cc893260a7-kube-api-access-d2kwc\") pod \"auto-csr-approver-29565690-2w68q\" (UID: \"d626d9fc-ecbb-46ff-93ec-39cc893260a7\") " pod="openshift-infra/auto-csr-approver-29565690-2w68q" Mar 19 17:30:00 crc kubenswrapper[4918]: I0319 17:30:00.411652 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d3ea0181-35f4-4f23-a3cd-0e06013541b8-config-volume\") pod \"collect-profiles-29565690-z6ght\" (UID: \"d3ea0181-35f4-4f23-a3cd-0e06013541b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565690-z6ght" Mar 19 17:30:00 crc kubenswrapper[4918]: 
I0319 17:30:00.411677 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzx8g\" (UniqueName: \"kubernetes.io/projected/d3ea0181-35f4-4f23-a3cd-0e06013541b8-kube-api-access-rzx8g\") pod \"collect-profiles-29565690-z6ght\" (UID: \"d3ea0181-35f4-4f23-a3cd-0e06013541b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565690-z6ght" Mar 19 17:30:00 crc kubenswrapper[4918]: I0319 17:30:00.412965 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d3ea0181-35f4-4f23-a3cd-0e06013541b8-config-volume\") pod \"collect-profiles-29565690-z6ght\" (UID: \"d3ea0181-35f4-4f23-a3cd-0e06013541b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565690-z6ght" Mar 19 17:30:00 crc kubenswrapper[4918]: I0319 17:30:00.418191 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d3ea0181-35f4-4f23-a3cd-0e06013541b8-secret-volume\") pod \"collect-profiles-29565690-z6ght\" (UID: \"d3ea0181-35f4-4f23-a3cd-0e06013541b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565690-z6ght" Mar 19 17:30:00 crc kubenswrapper[4918]: I0319 17:30:00.428413 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzx8g\" (UniqueName: \"kubernetes.io/projected/d3ea0181-35f4-4f23-a3cd-0e06013541b8-kube-api-access-rzx8g\") pod \"collect-profiles-29565690-z6ght\" (UID: \"d3ea0181-35f4-4f23-a3cd-0e06013541b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565690-z6ght" Mar 19 17:30:00 crc kubenswrapper[4918]: I0319 17:30:00.428657 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2kwc\" (UniqueName: \"kubernetes.io/projected/d626d9fc-ecbb-46ff-93ec-39cc893260a7-kube-api-access-d2kwc\") pod \"auto-csr-approver-29565690-2w68q\" (UID: \"d626d9fc-ecbb-46ff-93ec-39cc893260a7\") " 
pod="openshift-infra/auto-csr-approver-29565690-2w68q" Mar 19 17:30:00 crc kubenswrapper[4918]: I0319 17:30:00.516211 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565690-2w68q" Mar 19 17:30:00 crc kubenswrapper[4918]: I0319 17:30:00.527557 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565690-z6ght" Mar 19 17:30:01 crc kubenswrapper[4918]: I0319 17:30:01.070238 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565690-2w68q"] Mar 19 17:30:01 crc kubenswrapper[4918]: I0319 17:30:01.081802 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565690-z6ght"] Mar 19 17:30:01 crc kubenswrapper[4918]: I0319 17:30:01.083757 4918 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 17:30:01 crc kubenswrapper[4918]: I0319 17:30:01.382002 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565690-z6ght" event={"ID":"d3ea0181-35f4-4f23-a3cd-0e06013541b8","Type":"ContainerStarted","Data":"54b39f4c468d2f3d6e22742c2414e446a575146c1d8ce0be316dd3e4ba69a534"} Mar 19 17:30:01 crc kubenswrapper[4918]: I0319 17:30:01.383212 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565690-z6ght" event={"ID":"d3ea0181-35f4-4f23-a3cd-0e06013541b8","Type":"ContainerStarted","Data":"46fead982bf24926151eb6daf37275ab05414bf9b2ccb2ae47e363d3cb74c06e"} Mar 19 17:30:01 crc kubenswrapper[4918]: I0319 17:30:01.384009 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565690-2w68q" 
event={"ID":"d626d9fc-ecbb-46ff-93ec-39cc893260a7","Type":"ContainerStarted","Data":"003348c88f8995a739d091664a812c67005502c1d4bf39144bfa9b29ebb388fd"} Mar 19 17:30:01 crc kubenswrapper[4918]: I0319 17:30:01.402888 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29565690-z6ght" podStartSLOduration=1.402866903 podStartE2EDuration="1.402866903s" podCreationTimestamp="2026-03-19 17:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:30:01.398674538 +0000 UTC m=+3013.520873786" watchObservedRunningTime="2026-03-19 17:30:01.402866903 +0000 UTC m=+3013.525066151" Mar 19 17:30:02 crc kubenswrapper[4918]: I0319 17:30:02.395559 4918 generic.go:334] "Generic (PLEG): container finished" podID="d3ea0181-35f4-4f23-a3cd-0e06013541b8" containerID="54b39f4c468d2f3d6e22742c2414e446a575146c1d8ce0be316dd3e4ba69a534" exitCode=0 Mar 19 17:30:02 crc kubenswrapper[4918]: I0319 17:30:02.395652 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565690-z6ght" event={"ID":"d3ea0181-35f4-4f23-a3cd-0e06013541b8","Type":"ContainerDied","Data":"54b39f4c468d2f3d6e22742c2414e446a575146c1d8ce0be316dd3e4ba69a534"} Mar 19 17:30:03 crc kubenswrapper[4918]: I0319 17:30:03.406423 4918 generic.go:334] "Generic (PLEG): container finished" podID="d626d9fc-ecbb-46ff-93ec-39cc893260a7" containerID="a8def4f6d570b60f5b868869a17a9ad8e1b9ddf330c3f171ee7c92b1c9e7445e" exitCode=0 Mar 19 17:30:03 crc kubenswrapper[4918]: I0319 17:30:03.406502 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565690-2w68q" event={"ID":"d626d9fc-ecbb-46ff-93ec-39cc893260a7","Type":"ContainerDied","Data":"a8def4f6d570b60f5b868869a17a9ad8e1b9ddf330c3f171ee7c92b1c9e7445e"} Mar 19 17:30:03 crc kubenswrapper[4918]: I0319 17:30:03.817167 4918 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565690-z6ght" Mar 19 17:30:03 crc kubenswrapper[4918]: I0319 17:30:03.991232 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d3ea0181-35f4-4f23-a3cd-0e06013541b8-config-volume\") pod \"d3ea0181-35f4-4f23-a3cd-0e06013541b8\" (UID: \"d3ea0181-35f4-4f23-a3cd-0e06013541b8\") " Mar 19 17:30:03 crc kubenswrapper[4918]: I0319 17:30:03.991461 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d3ea0181-35f4-4f23-a3cd-0e06013541b8-secret-volume\") pod \"d3ea0181-35f4-4f23-a3cd-0e06013541b8\" (UID: \"d3ea0181-35f4-4f23-a3cd-0e06013541b8\") " Mar 19 17:30:03 crc kubenswrapper[4918]: I0319 17:30:03.991881 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzx8g\" (UniqueName: \"kubernetes.io/projected/d3ea0181-35f4-4f23-a3cd-0e06013541b8-kube-api-access-rzx8g\") pod \"d3ea0181-35f4-4f23-a3cd-0e06013541b8\" (UID: \"d3ea0181-35f4-4f23-a3cd-0e06013541b8\") " Mar 19 17:30:03 crc kubenswrapper[4918]: I0319 17:30:03.992554 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3ea0181-35f4-4f23-a3cd-0e06013541b8-config-volume" (OuterVolumeSpecName: "config-volume") pod "d3ea0181-35f4-4f23-a3cd-0e06013541b8" (UID: "d3ea0181-35f4-4f23-a3cd-0e06013541b8"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:30:03 crc kubenswrapper[4918]: I0319 17:30:03.993495 4918 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d3ea0181-35f4-4f23-a3cd-0e06013541b8-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 17:30:03 crc kubenswrapper[4918]: I0319 17:30:03.998335 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3ea0181-35f4-4f23-a3cd-0e06013541b8-kube-api-access-rzx8g" (OuterVolumeSpecName: "kube-api-access-rzx8g") pod "d3ea0181-35f4-4f23-a3cd-0e06013541b8" (UID: "d3ea0181-35f4-4f23-a3cd-0e06013541b8"). InnerVolumeSpecName "kube-api-access-rzx8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:30:03 crc kubenswrapper[4918]: I0319 17:30:03.998902 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3ea0181-35f4-4f23-a3cd-0e06013541b8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d3ea0181-35f4-4f23-a3cd-0e06013541b8" (UID: "d3ea0181-35f4-4f23-a3cd-0e06013541b8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:30:04 crc kubenswrapper[4918]: I0319 17:30:04.096138 4918 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d3ea0181-35f4-4f23-a3cd-0e06013541b8-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 17:30:04 crc kubenswrapper[4918]: I0319 17:30:04.096208 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzx8g\" (UniqueName: \"kubernetes.io/projected/d3ea0181-35f4-4f23-a3cd-0e06013541b8-kube-api-access-rzx8g\") on node \"crc\" DevicePath \"\"" Mar 19 17:30:04 crc kubenswrapper[4918]: I0319 17:30:04.429692 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565690-z6ght" Mar 19 17:30:04 crc kubenswrapper[4918]: I0319 17:30:04.429809 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565690-z6ght" event={"ID":"d3ea0181-35f4-4f23-a3cd-0e06013541b8","Type":"ContainerDied","Data":"46fead982bf24926151eb6daf37275ab05414bf9b2ccb2ae47e363d3cb74c06e"} Mar 19 17:30:04 crc kubenswrapper[4918]: I0319 17:30:04.430145 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46fead982bf24926151eb6daf37275ab05414bf9b2ccb2ae47e363d3cb74c06e" Mar 19 17:30:04 crc kubenswrapper[4918]: I0319 17:30:04.477131 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565645-glllh"] Mar 19 17:30:04 crc kubenswrapper[4918]: I0319 17:30:04.485770 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565645-glllh"] Mar 19 17:30:04 crc kubenswrapper[4918]: I0319 17:30:04.602350 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1df102a-9829-4f44-a21e-f9c72d0bd2e8" path="/var/lib/kubelet/pods/f1df102a-9829-4f44-a21e-f9c72d0bd2e8/volumes" Mar 19 17:30:04 crc kubenswrapper[4918]: I0319 17:30:04.806448 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565690-2w68q" Mar 19 17:30:04 crc kubenswrapper[4918]: I0319 17:30:04.936624 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2kwc\" (UniqueName: \"kubernetes.io/projected/d626d9fc-ecbb-46ff-93ec-39cc893260a7-kube-api-access-d2kwc\") pod \"d626d9fc-ecbb-46ff-93ec-39cc893260a7\" (UID: \"d626d9fc-ecbb-46ff-93ec-39cc893260a7\") " Mar 19 17:30:04 crc kubenswrapper[4918]: I0319 17:30:04.941277 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d626d9fc-ecbb-46ff-93ec-39cc893260a7-kube-api-access-d2kwc" (OuterVolumeSpecName: "kube-api-access-d2kwc") pod "d626d9fc-ecbb-46ff-93ec-39cc893260a7" (UID: "d626d9fc-ecbb-46ff-93ec-39cc893260a7"). InnerVolumeSpecName "kube-api-access-d2kwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:30:05 crc kubenswrapper[4918]: I0319 17:30:05.039338 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2kwc\" (UniqueName: \"kubernetes.io/projected/d626d9fc-ecbb-46ff-93ec-39cc893260a7-kube-api-access-d2kwc\") on node \"crc\" DevicePath \"\"" Mar 19 17:30:05 crc kubenswrapper[4918]: I0319 17:30:05.441279 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565690-2w68q" event={"ID":"d626d9fc-ecbb-46ff-93ec-39cc893260a7","Type":"ContainerDied","Data":"003348c88f8995a739d091664a812c67005502c1d4bf39144bfa9b29ebb388fd"} Mar 19 17:30:05 crc kubenswrapper[4918]: I0319 17:30:05.441319 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="003348c88f8995a739d091664a812c67005502c1d4bf39144bfa9b29ebb388fd" Mar 19 17:30:05 crc kubenswrapper[4918]: I0319 17:30:05.441349 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565690-2w68q" Mar 19 17:30:05 crc kubenswrapper[4918]: I0319 17:30:05.587321 4918 scope.go:117] "RemoveContainer" containerID="63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783" Mar 19 17:30:05 crc kubenswrapper[4918]: E0319 17:30:05.587600 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:30:05 crc kubenswrapper[4918]: I0319 17:30:05.901070 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565684-fm8j8"] Mar 19 17:30:05 crc kubenswrapper[4918]: I0319 17:30:05.912885 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565684-fm8j8"] Mar 19 17:30:06 crc kubenswrapper[4918]: I0319 17:30:06.206306 4918 scope.go:117] "RemoveContainer" containerID="e9fb9e353348d5b1141e3d2aa47fa1c4daf0cb6a33015542386b6f73fd5c7430" Mar 19 17:30:06 crc kubenswrapper[4918]: I0319 17:30:06.242586 4918 scope.go:117] "RemoveContainer" containerID="9e2b74237437c105f98ace1fac6a1e1b5d32669cf19de648d02bda86c1409e42" Mar 19 17:30:06 crc kubenswrapper[4918]: I0319 17:30:06.322473 4918 scope.go:117] "RemoveContainer" containerID="088d0c4bda1be07221b1a654e10a4be02d14a7439b3088a6d58ab1635f7bcf26" Mar 19 17:30:06 crc kubenswrapper[4918]: I0319 17:30:06.363273 4918 scope.go:117] "RemoveContainer" containerID="0e631bb0ab67025433aecc13c1b9becb3d0c49f67f2aaf8ad6562573b0114996" Mar 19 17:30:06 crc kubenswrapper[4918]: I0319 17:30:06.602927 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="083dfc61-7262-4798-8446-6539ad70e6c7" 
path="/var/lib/kubelet/pods/083dfc61-7262-4798-8446-6539ad70e6c7/volumes" Mar 19 17:30:18 crc kubenswrapper[4918]: I0319 17:30:18.593644 4918 scope.go:117] "RemoveContainer" containerID="63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783" Mar 19 17:30:18 crc kubenswrapper[4918]: E0319 17:30:18.594551 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:30:33 crc kubenswrapper[4918]: I0319 17:30:33.586804 4918 scope.go:117] "RemoveContainer" containerID="63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783" Mar 19 17:30:33 crc kubenswrapper[4918]: E0319 17:30:33.587637 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:30:38 crc kubenswrapper[4918]: I0319 17:30:38.698954 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kmv72"] Mar 19 17:30:38 crc kubenswrapper[4918]: E0319 17:30:38.700064 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d626d9fc-ecbb-46ff-93ec-39cc893260a7" containerName="oc" Mar 19 17:30:38 crc kubenswrapper[4918]: I0319 17:30:38.700086 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="d626d9fc-ecbb-46ff-93ec-39cc893260a7" containerName="oc" Mar 19 17:30:38 crc kubenswrapper[4918]: 
E0319 17:30:38.700111 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ea0181-35f4-4f23-a3cd-0e06013541b8" containerName="collect-profiles" Mar 19 17:30:38 crc kubenswrapper[4918]: I0319 17:30:38.700122 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ea0181-35f4-4f23-a3cd-0e06013541b8" containerName="collect-profiles" Mar 19 17:30:38 crc kubenswrapper[4918]: I0319 17:30:38.700429 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3ea0181-35f4-4f23-a3cd-0e06013541b8" containerName="collect-profiles" Mar 19 17:30:38 crc kubenswrapper[4918]: I0319 17:30:38.700461 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="d626d9fc-ecbb-46ff-93ec-39cc893260a7" containerName="oc" Mar 19 17:30:38 crc kubenswrapper[4918]: I0319 17:30:38.702732 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kmv72" Mar 19 17:30:38 crc kubenswrapper[4918]: I0319 17:30:38.712690 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kmv72"] Mar 19 17:30:38 crc kubenswrapper[4918]: I0319 17:30:38.874772 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhjv4\" (UniqueName: \"kubernetes.io/projected/6148e943-dd16-4749-b864-4cf2213812b7-kube-api-access-mhjv4\") pod \"redhat-operators-kmv72\" (UID: \"6148e943-dd16-4749-b864-4cf2213812b7\") " pod="openshift-marketplace/redhat-operators-kmv72" Mar 19 17:30:38 crc kubenswrapper[4918]: I0319 17:30:38.874830 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6148e943-dd16-4749-b864-4cf2213812b7-catalog-content\") pod \"redhat-operators-kmv72\" (UID: \"6148e943-dd16-4749-b864-4cf2213812b7\") " pod="openshift-marketplace/redhat-operators-kmv72" Mar 19 17:30:38 crc kubenswrapper[4918]: I0319 
17:30:38.875048 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6148e943-dd16-4749-b864-4cf2213812b7-utilities\") pod \"redhat-operators-kmv72\" (UID: \"6148e943-dd16-4749-b864-4cf2213812b7\") " pod="openshift-marketplace/redhat-operators-kmv72" Mar 19 17:30:38 crc kubenswrapper[4918]: I0319 17:30:38.976573 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6148e943-dd16-4749-b864-4cf2213812b7-utilities\") pod \"redhat-operators-kmv72\" (UID: \"6148e943-dd16-4749-b864-4cf2213812b7\") " pod="openshift-marketplace/redhat-operators-kmv72" Mar 19 17:30:38 crc kubenswrapper[4918]: I0319 17:30:38.976977 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhjv4\" (UniqueName: \"kubernetes.io/projected/6148e943-dd16-4749-b864-4cf2213812b7-kube-api-access-mhjv4\") pod \"redhat-operators-kmv72\" (UID: \"6148e943-dd16-4749-b864-4cf2213812b7\") " pod="openshift-marketplace/redhat-operators-kmv72" Mar 19 17:30:38 crc kubenswrapper[4918]: I0319 17:30:38.977112 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6148e943-dd16-4749-b864-4cf2213812b7-catalog-content\") pod \"redhat-operators-kmv72\" (UID: \"6148e943-dd16-4749-b864-4cf2213812b7\") " pod="openshift-marketplace/redhat-operators-kmv72" Mar 19 17:30:38 crc kubenswrapper[4918]: I0319 17:30:38.977138 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6148e943-dd16-4749-b864-4cf2213812b7-utilities\") pod \"redhat-operators-kmv72\" (UID: \"6148e943-dd16-4749-b864-4cf2213812b7\") " pod="openshift-marketplace/redhat-operators-kmv72" Mar 19 17:30:38 crc kubenswrapper[4918]: I0319 17:30:38.977582 4918 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6148e943-dd16-4749-b864-4cf2213812b7-catalog-content\") pod \"redhat-operators-kmv72\" (UID: \"6148e943-dd16-4749-b864-4cf2213812b7\") " pod="openshift-marketplace/redhat-operators-kmv72" Mar 19 17:30:39 crc kubenswrapper[4918]: I0319 17:30:39.000492 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhjv4\" (UniqueName: \"kubernetes.io/projected/6148e943-dd16-4749-b864-4cf2213812b7-kube-api-access-mhjv4\") pod \"redhat-operators-kmv72\" (UID: \"6148e943-dd16-4749-b864-4cf2213812b7\") " pod="openshift-marketplace/redhat-operators-kmv72" Mar 19 17:30:39 crc kubenswrapper[4918]: I0319 17:30:39.028384 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kmv72" Mar 19 17:30:39 crc kubenswrapper[4918]: I0319 17:30:39.628192 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kmv72"] Mar 19 17:30:39 crc kubenswrapper[4918]: I0319 17:30:39.833140 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kmv72" event={"ID":"6148e943-dd16-4749-b864-4cf2213812b7","Type":"ContainerStarted","Data":"7dc567d4fcf9a1a96757dd6d719662b6247202f7460e04e60a910ad0f69e09d6"} Mar 19 17:30:39 crc kubenswrapper[4918]: I0319 17:30:39.833418 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kmv72" event={"ID":"6148e943-dd16-4749-b864-4cf2213812b7","Type":"ContainerStarted","Data":"089827311bcd4d14eb04d62cebd3ac909f025c070f775c0dee7b9a2c863ede5a"} Mar 19 17:30:40 crc kubenswrapper[4918]: I0319 17:30:40.845027 4918 generic.go:334] "Generic (PLEG): container finished" podID="6148e943-dd16-4749-b864-4cf2213812b7" containerID="7dc567d4fcf9a1a96757dd6d719662b6247202f7460e04e60a910ad0f69e09d6" exitCode=0 Mar 19 17:30:40 crc kubenswrapper[4918]: I0319 
17:30:40.845352 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kmv72" event={"ID":"6148e943-dd16-4749-b864-4cf2213812b7","Type":"ContainerDied","Data":"7dc567d4fcf9a1a96757dd6d719662b6247202f7460e04e60a910ad0f69e09d6"} Mar 19 17:30:41 crc kubenswrapper[4918]: I0319 17:30:41.862057 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kmv72" event={"ID":"6148e943-dd16-4749-b864-4cf2213812b7","Type":"ContainerStarted","Data":"0d92147813eca615deddd9ddac70e807af1257207c869407aa05fb4bc2b3882b"} Mar 19 17:30:44 crc kubenswrapper[4918]: I0319 17:30:44.908653 4918 generic.go:334] "Generic (PLEG): container finished" podID="6148e943-dd16-4749-b864-4cf2213812b7" containerID="0d92147813eca615deddd9ddac70e807af1257207c869407aa05fb4bc2b3882b" exitCode=0 Mar 19 17:30:44 crc kubenswrapper[4918]: I0319 17:30:44.908814 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kmv72" event={"ID":"6148e943-dd16-4749-b864-4cf2213812b7","Type":"ContainerDied","Data":"0d92147813eca615deddd9ddac70e807af1257207c869407aa05fb4bc2b3882b"} Mar 19 17:30:45 crc kubenswrapper[4918]: I0319 17:30:45.920504 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kmv72" event={"ID":"6148e943-dd16-4749-b864-4cf2213812b7","Type":"ContainerStarted","Data":"9f8399bdb15a7b3869924b50c4454163327c475e362eb7d6ebe92090b16ed51d"} Mar 19 17:30:45 crc kubenswrapper[4918]: I0319 17:30:45.949430 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kmv72" podStartSLOduration=3.388594826 podStartE2EDuration="7.949403937s" podCreationTimestamp="2026-03-19 17:30:38 +0000 UTC" firstStartedPulling="2026-03-19 17:30:40.847677962 +0000 UTC m=+3052.969877210" lastFinishedPulling="2026-03-19 17:30:45.408487063 +0000 UTC m=+3057.530686321" observedRunningTime="2026-03-19 
17:30:45.940878823 +0000 UTC m=+3058.063078101" watchObservedRunningTime="2026-03-19 17:30:45.949403937 +0000 UTC m=+3058.071603205" Mar 19 17:30:48 crc kubenswrapper[4918]: I0319 17:30:48.594933 4918 scope.go:117] "RemoveContainer" containerID="63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783" Mar 19 17:30:48 crc kubenswrapper[4918]: E0319 17:30:48.595874 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:30:49 crc kubenswrapper[4918]: I0319 17:30:49.030081 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kmv72" Mar 19 17:30:49 crc kubenswrapper[4918]: I0319 17:30:49.030130 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kmv72" Mar 19 17:30:50 crc kubenswrapper[4918]: I0319 17:30:50.092655 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kmv72" podUID="6148e943-dd16-4749-b864-4cf2213812b7" containerName="registry-server" probeResult="failure" output=< Mar 19 17:30:50 crc kubenswrapper[4918]: timeout: failed to connect service ":50051" within 1s Mar 19 17:30:50 crc kubenswrapper[4918]: > Mar 19 17:30:59 crc kubenswrapper[4918]: I0319 17:30:59.093758 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kmv72" Mar 19 17:30:59 crc kubenswrapper[4918]: I0319 17:30:59.182571 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kmv72" Mar 19 17:30:59 crc 
kubenswrapper[4918]: I0319 17:30:59.345899 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kmv72"] Mar 19 17:30:59 crc kubenswrapper[4918]: I0319 17:30:59.586723 4918 scope.go:117] "RemoveContainer" containerID="63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783" Mar 19 17:30:59 crc kubenswrapper[4918]: E0319 17:30:59.587283 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:31:01 crc kubenswrapper[4918]: I0319 17:31:01.105304 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kmv72" podUID="6148e943-dd16-4749-b864-4cf2213812b7" containerName="registry-server" containerID="cri-o://9f8399bdb15a7b3869924b50c4454163327c475e362eb7d6ebe92090b16ed51d" gracePeriod=2 Mar 19 17:31:01 crc kubenswrapper[4918]: I0319 17:31:01.593166 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kmv72" Mar 19 17:31:01 crc kubenswrapper[4918]: I0319 17:31:01.694102 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhjv4\" (UniqueName: \"kubernetes.io/projected/6148e943-dd16-4749-b864-4cf2213812b7-kube-api-access-mhjv4\") pod \"6148e943-dd16-4749-b864-4cf2213812b7\" (UID: \"6148e943-dd16-4749-b864-4cf2213812b7\") " Mar 19 17:31:01 crc kubenswrapper[4918]: I0319 17:31:01.694358 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6148e943-dd16-4749-b864-4cf2213812b7-utilities\") pod \"6148e943-dd16-4749-b864-4cf2213812b7\" (UID: \"6148e943-dd16-4749-b864-4cf2213812b7\") " Mar 19 17:31:01 crc kubenswrapper[4918]: I0319 17:31:01.694385 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6148e943-dd16-4749-b864-4cf2213812b7-catalog-content\") pod \"6148e943-dd16-4749-b864-4cf2213812b7\" (UID: \"6148e943-dd16-4749-b864-4cf2213812b7\") " Mar 19 17:31:01 crc kubenswrapper[4918]: I0319 17:31:01.695286 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6148e943-dd16-4749-b864-4cf2213812b7-utilities" (OuterVolumeSpecName: "utilities") pod "6148e943-dd16-4749-b864-4cf2213812b7" (UID: "6148e943-dd16-4749-b864-4cf2213812b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:31:01 crc kubenswrapper[4918]: I0319 17:31:01.699839 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6148e943-dd16-4749-b864-4cf2213812b7-kube-api-access-mhjv4" (OuterVolumeSpecName: "kube-api-access-mhjv4") pod "6148e943-dd16-4749-b864-4cf2213812b7" (UID: "6148e943-dd16-4749-b864-4cf2213812b7"). InnerVolumeSpecName "kube-api-access-mhjv4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:31:01 crc kubenswrapper[4918]: I0319 17:31:01.796659 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhjv4\" (UniqueName: \"kubernetes.io/projected/6148e943-dd16-4749-b864-4cf2213812b7-kube-api-access-mhjv4\") on node \"crc\" DevicePath \"\"" Mar 19 17:31:01 crc kubenswrapper[4918]: I0319 17:31:01.796725 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6148e943-dd16-4749-b864-4cf2213812b7-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 17:31:01 crc kubenswrapper[4918]: I0319 17:31:01.846282 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6148e943-dd16-4749-b864-4cf2213812b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6148e943-dd16-4749-b864-4cf2213812b7" (UID: "6148e943-dd16-4749-b864-4cf2213812b7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:31:01 crc kubenswrapper[4918]: I0319 17:31:01.898705 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6148e943-dd16-4749-b864-4cf2213812b7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 17:31:02 crc kubenswrapper[4918]: I0319 17:31:02.114691 4918 generic.go:334] "Generic (PLEG): container finished" podID="6148e943-dd16-4749-b864-4cf2213812b7" containerID="9f8399bdb15a7b3869924b50c4454163327c475e362eb7d6ebe92090b16ed51d" exitCode=0 Mar 19 17:31:02 crc kubenswrapper[4918]: I0319 17:31:02.114733 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kmv72" event={"ID":"6148e943-dd16-4749-b864-4cf2213812b7","Type":"ContainerDied","Data":"9f8399bdb15a7b3869924b50c4454163327c475e362eb7d6ebe92090b16ed51d"} Mar 19 17:31:02 crc kubenswrapper[4918]: I0319 17:31:02.114761 4918 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-kmv72" event={"ID":"6148e943-dd16-4749-b864-4cf2213812b7","Type":"ContainerDied","Data":"089827311bcd4d14eb04d62cebd3ac909f025c070f775c0dee7b9a2c863ede5a"} Mar 19 17:31:02 crc kubenswrapper[4918]: I0319 17:31:02.114779 4918 scope.go:117] "RemoveContainer" containerID="9f8399bdb15a7b3869924b50c4454163327c475e362eb7d6ebe92090b16ed51d" Mar 19 17:31:02 crc kubenswrapper[4918]: I0319 17:31:02.114919 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kmv72" Mar 19 17:31:02 crc kubenswrapper[4918]: I0319 17:31:02.153641 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kmv72"] Mar 19 17:31:02 crc kubenswrapper[4918]: I0319 17:31:02.154307 4918 scope.go:117] "RemoveContainer" containerID="0d92147813eca615deddd9ddac70e807af1257207c869407aa05fb4bc2b3882b" Mar 19 17:31:02 crc kubenswrapper[4918]: I0319 17:31:02.163197 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kmv72"] Mar 19 17:31:02 crc kubenswrapper[4918]: I0319 17:31:02.176075 4918 scope.go:117] "RemoveContainer" containerID="7dc567d4fcf9a1a96757dd6d719662b6247202f7460e04e60a910ad0f69e09d6" Mar 19 17:31:02 crc kubenswrapper[4918]: I0319 17:31:02.220319 4918 scope.go:117] "RemoveContainer" containerID="9f8399bdb15a7b3869924b50c4454163327c475e362eb7d6ebe92090b16ed51d" Mar 19 17:31:02 crc kubenswrapper[4918]: E0319 17:31:02.220811 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f8399bdb15a7b3869924b50c4454163327c475e362eb7d6ebe92090b16ed51d\": container with ID starting with 9f8399bdb15a7b3869924b50c4454163327c475e362eb7d6ebe92090b16ed51d not found: ID does not exist" containerID="9f8399bdb15a7b3869924b50c4454163327c475e362eb7d6ebe92090b16ed51d" Mar 19 17:31:02 crc kubenswrapper[4918]: I0319 17:31:02.220865 4918 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f8399bdb15a7b3869924b50c4454163327c475e362eb7d6ebe92090b16ed51d"} err="failed to get container status \"9f8399bdb15a7b3869924b50c4454163327c475e362eb7d6ebe92090b16ed51d\": rpc error: code = NotFound desc = could not find container \"9f8399bdb15a7b3869924b50c4454163327c475e362eb7d6ebe92090b16ed51d\": container with ID starting with 9f8399bdb15a7b3869924b50c4454163327c475e362eb7d6ebe92090b16ed51d not found: ID does not exist" Mar 19 17:31:02 crc kubenswrapper[4918]: I0319 17:31:02.220896 4918 scope.go:117] "RemoveContainer" containerID="0d92147813eca615deddd9ddac70e807af1257207c869407aa05fb4bc2b3882b" Mar 19 17:31:02 crc kubenswrapper[4918]: E0319 17:31:02.221315 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d92147813eca615deddd9ddac70e807af1257207c869407aa05fb4bc2b3882b\": container with ID starting with 0d92147813eca615deddd9ddac70e807af1257207c869407aa05fb4bc2b3882b not found: ID does not exist" containerID="0d92147813eca615deddd9ddac70e807af1257207c869407aa05fb4bc2b3882b" Mar 19 17:31:02 crc kubenswrapper[4918]: I0319 17:31:02.221343 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d92147813eca615deddd9ddac70e807af1257207c869407aa05fb4bc2b3882b"} err="failed to get container status \"0d92147813eca615deddd9ddac70e807af1257207c869407aa05fb4bc2b3882b\": rpc error: code = NotFound desc = could not find container \"0d92147813eca615deddd9ddac70e807af1257207c869407aa05fb4bc2b3882b\": container with ID starting with 0d92147813eca615deddd9ddac70e807af1257207c869407aa05fb4bc2b3882b not found: ID does not exist" Mar 19 17:31:02 crc kubenswrapper[4918]: I0319 17:31:02.221362 4918 scope.go:117] "RemoveContainer" containerID="7dc567d4fcf9a1a96757dd6d719662b6247202f7460e04e60a910ad0f69e09d6" Mar 19 17:31:02 crc kubenswrapper[4918]: E0319 
17:31:02.221699 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dc567d4fcf9a1a96757dd6d719662b6247202f7460e04e60a910ad0f69e09d6\": container with ID starting with 7dc567d4fcf9a1a96757dd6d719662b6247202f7460e04e60a910ad0f69e09d6 not found: ID does not exist" containerID="7dc567d4fcf9a1a96757dd6d719662b6247202f7460e04e60a910ad0f69e09d6" Mar 19 17:31:02 crc kubenswrapper[4918]: I0319 17:31:02.221783 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dc567d4fcf9a1a96757dd6d719662b6247202f7460e04e60a910ad0f69e09d6"} err="failed to get container status \"7dc567d4fcf9a1a96757dd6d719662b6247202f7460e04e60a910ad0f69e09d6\": rpc error: code = NotFound desc = could not find container \"7dc567d4fcf9a1a96757dd6d719662b6247202f7460e04e60a910ad0f69e09d6\": container with ID starting with 7dc567d4fcf9a1a96757dd6d719662b6247202f7460e04e60a910ad0f69e09d6 not found: ID does not exist" Mar 19 17:31:02 crc kubenswrapper[4918]: I0319 17:31:02.598292 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6148e943-dd16-4749-b864-4cf2213812b7" path="/var/lib/kubelet/pods/6148e943-dd16-4749-b864-4cf2213812b7/volumes" Mar 19 17:31:06 crc kubenswrapper[4918]: I0319 17:31:06.448060 4918 scope.go:117] "RemoveContainer" containerID="3a36dde2dafa342ff58e60b15d7a78f74b20e7fc485b991ead124dda1d032a2c" Mar 19 17:31:11 crc kubenswrapper[4918]: I0319 17:31:11.587249 4918 scope.go:117] "RemoveContainer" containerID="63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783" Mar 19 17:31:11 crc kubenswrapper[4918]: E0319 17:31:11.588310 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:31:23 crc kubenswrapper[4918]: I0319 17:31:23.586414 4918 scope.go:117] "RemoveContainer" containerID="63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783" Mar 19 17:31:23 crc kubenswrapper[4918]: E0319 17:31:23.587323 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:31:37 crc kubenswrapper[4918]: I0319 17:31:37.587197 4918 scope.go:117] "RemoveContainer" containerID="63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783" Mar 19 17:31:37 crc kubenswrapper[4918]: E0319 17:31:37.588345 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:31:50 crc kubenswrapper[4918]: I0319 17:31:50.587729 4918 scope.go:117] "RemoveContainer" containerID="63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783" Mar 19 17:31:50 crc kubenswrapper[4918]: E0319 17:31:50.588596 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:32:00 crc kubenswrapper[4918]: I0319 17:32:00.145325 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565692-hg9vg"] Mar 19 17:32:00 crc kubenswrapper[4918]: E0319 17:32:00.146510 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6148e943-dd16-4749-b864-4cf2213812b7" containerName="registry-server" Mar 19 17:32:00 crc kubenswrapper[4918]: I0319 17:32:00.146555 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="6148e943-dd16-4749-b864-4cf2213812b7" containerName="registry-server" Mar 19 17:32:00 crc kubenswrapper[4918]: E0319 17:32:00.146594 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6148e943-dd16-4749-b864-4cf2213812b7" containerName="extract-content" Mar 19 17:32:00 crc kubenswrapper[4918]: I0319 17:32:00.146602 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="6148e943-dd16-4749-b864-4cf2213812b7" containerName="extract-content" Mar 19 17:32:00 crc kubenswrapper[4918]: E0319 17:32:00.146647 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6148e943-dd16-4749-b864-4cf2213812b7" containerName="extract-utilities" Mar 19 17:32:00 crc kubenswrapper[4918]: I0319 17:32:00.146657 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="6148e943-dd16-4749-b864-4cf2213812b7" containerName="extract-utilities" Mar 19 17:32:00 crc kubenswrapper[4918]: I0319 17:32:00.146910 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="6148e943-dd16-4749-b864-4cf2213812b7" containerName="registry-server" Mar 19 17:32:00 crc kubenswrapper[4918]: I0319 17:32:00.147925 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565692-hg9vg" Mar 19 17:32:00 crc kubenswrapper[4918]: I0319 17:32:00.150068 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:32:00 crc kubenswrapper[4918]: I0319 17:32:00.150413 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:32:00 crc kubenswrapper[4918]: I0319 17:32:00.150932 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:32:00 crc kubenswrapper[4918]: I0319 17:32:00.154612 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565692-hg9vg"] Mar 19 17:32:00 crc kubenswrapper[4918]: I0319 17:32:00.228283 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74gtd\" (UniqueName: \"kubernetes.io/projected/123d818a-f108-47cd-bb91-152d65a70514-kube-api-access-74gtd\") pod \"auto-csr-approver-29565692-hg9vg\" (UID: \"123d818a-f108-47cd-bb91-152d65a70514\") " pod="openshift-infra/auto-csr-approver-29565692-hg9vg" Mar 19 17:32:00 crc kubenswrapper[4918]: I0319 17:32:00.329653 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74gtd\" (UniqueName: \"kubernetes.io/projected/123d818a-f108-47cd-bb91-152d65a70514-kube-api-access-74gtd\") pod \"auto-csr-approver-29565692-hg9vg\" (UID: \"123d818a-f108-47cd-bb91-152d65a70514\") " pod="openshift-infra/auto-csr-approver-29565692-hg9vg" Mar 19 17:32:00 crc kubenswrapper[4918]: I0319 17:32:00.352307 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74gtd\" (UniqueName: \"kubernetes.io/projected/123d818a-f108-47cd-bb91-152d65a70514-kube-api-access-74gtd\") pod \"auto-csr-approver-29565692-hg9vg\" (UID: \"123d818a-f108-47cd-bb91-152d65a70514\") " 
pod="openshift-infra/auto-csr-approver-29565692-hg9vg" Mar 19 17:32:00 crc kubenswrapper[4918]: I0319 17:32:00.470090 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565692-hg9vg" Mar 19 17:32:01 crc kubenswrapper[4918]: I0319 17:32:01.003370 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565692-hg9vg"] Mar 19 17:32:01 crc kubenswrapper[4918]: I0319 17:32:01.850192 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565692-hg9vg" event={"ID":"123d818a-f108-47cd-bb91-152d65a70514","Type":"ContainerStarted","Data":"ed7109cc7e3659e518bde6654b15d0a9b35459f09fdfcddbc84ac62c4af664e5"} Mar 19 17:32:02 crc kubenswrapper[4918]: I0319 17:32:02.587151 4918 scope.go:117] "RemoveContainer" containerID="63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783" Mar 19 17:32:02 crc kubenswrapper[4918]: E0319 17:32:02.588062 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:32:02 crc kubenswrapper[4918]: I0319 17:32:02.865150 4918 generic.go:334] "Generic (PLEG): container finished" podID="123d818a-f108-47cd-bb91-152d65a70514" containerID="0a92ba7fe9ee9786481bdeb87bebc028700dc93cd0913e462e7e6779e8703039" exitCode=0 Mar 19 17:32:02 crc kubenswrapper[4918]: I0319 17:32:02.865235 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565692-hg9vg" event={"ID":"123d818a-f108-47cd-bb91-152d65a70514","Type":"ContainerDied","Data":"0a92ba7fe9ee9786481bdeb87bebc028700dc93cd0913e462e7e6779e8703039"} 
Mar 19 17:32:04 crc kubenswrapper[4918]: I0319 17:32:04.331340 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565692-hg9vg" Mar 19 17:32:04 crc kubenswrapper[4918]: I0319 17:32:04.435730 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74gtd\" (UniqueName: \"kubernetes.io/projected/123d818a-f108-47cd-bb91-152d65a70514-kube-api-access-74gtd\") pod \"123d818a-f108-47cd-bb91-152d65a70514\" (UID: \"123d818a-f108-47cd-bb91-152d65a70514\") " Mar 19 17:32:04 crc kubenswrapper[4918]: I0319 17:32:04.442572 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/123d818a-f108-47cd-bb91-152d65a70514-kube-api-access-74gtd" (OuterVolumeSpecName: "kube-api-access-74gtd") pod "123d818a-f108-47cd-bb91-152d65a70514" (UID: "123d818a-f108-47cd-bb91-152d65a70514"). InnerVolumeSpecName "kube-api-access-74gtd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:32:04 crc kubenswrapper[4918]: I0319 17:32:04.538881 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74gtd\" (UniqueName: \"kubernetes.io/projected/123d818a-f108-47cd-bb91-152d65a70514-kube-api-access-74gtd\") on node \"crc\" DevicePath \"\"" Mar 19 17:32:04 crc kubenswrapper[4918]: I0319 17:32:04.888709 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565692-hg9vg" event={"ID":"123d818a-f108-47cd-bb91-152d65a70514","Type":"ContainerDied","Data":"ed7109cc7e3659e518bde6654b15d0a9b35459f09fdfcddbc84ac62c4af664e5"} Mar 19 17:32:04 crc kubenswrapper[4918]: I0319 17:32:04.888763 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed7109cc7e3659e518bde6654b15d0a9b35459f09fdfcddbc84ac62c4af664e5" Mar 19 17:32:04 crc kubenswrapper[4918]: I0319 17:32:04.888811 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565692-hg9vg" Mar 19 17:32:05 crc kubenswrapper[4918]: I0319 17:32:05.432571 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565686-d66sh"] Mar 19 17:32:05 crc kubenswrapper[4918]: I0319 17:32:05.450101 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565686-d66sh"] Mar 19 17:32:06 crc kubenswrapper[4918]: I0319 17:32:06.598669 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28f6939f-8031-432c-8407-277c5d8ff9a8" path="/var/lib/kubelet/pods/28f6939f-8031-432c-8407-277c5d8ff9a8/volumes" Mar 19 17:32:15 crc kubenswrapper[4918]: I0319 17:32:15.587787 4918 scope.go:117] "RemoveContainer" containerID="63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783" Mar 19 17:32:15 crc kubenswrapper[4918]: E0319 17:32:15.588776 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:32:29 crc kubenswrapper[4918]: I0319 17:32:29.587225 4918 scope.go:117] "RemoveContainer" containerID="63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783" Mar 19 17:32:29 crc kubenswrapper[4918]: E0319 17:32:29.588780 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" 
podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:32:40 crc kubenswrapper[4918]: I0319 17:32:40.587092 4918 scope.go:117] "RemoveContainer" containerID="63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783" Mar 19 17:32:40 crc kubenswrapper[4918]: E0319 17:32:40.587879 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:32:54 crc kubenswrapper[4918]: I0319 17:32:54.587233 4918 scope.go:117] "RemoveContainer" containerID="63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783" Mar 19 17:32:54 crc kubenswrapper[4918]: E0319 17:32:54.588203 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:33:06 crc kubenswrapper[4918]: I0319 17:33:06.562232 4918 scope.go:117] "RemoveContainer" containerID="cc97be739660cb40f7a2f1ea0e6e88d23f8949604149b0e30c9f6f7878a6addf" Mar 19 17:33:07 crc kubenswrapper[4918]: I0319 17:33:07.586688 4918 scope.go:117] "RemoveContainer" containerID="63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783" Mar 19 17:33:07 crc kubenswrapper[4918]: E0319 17:33:07.587370 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:33:18 crc kubenswrapper[4918]: I0319 17:33:18.605934 4918 scope.go:117] "RemoveContainer" containerID="63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783" Mar 19 17:33:18 crc kubenswrapper[4918]: E0319 17:33:18.607260 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:33:30 crc kubenswrapper[4918]: I0319 17:33:30.586486 4918 scope.go:117] "RemoveContainer" containerID="63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783" Mar 19 17:33:30 crc kubenswrapper[4918]: E0319 17:33:30.587404 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:33:43 crc kubenswrapper[4918]: I0319 17:33:43.586603 4918 scope.go:117] "RemoveContainer" containerID="63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783" Mar 19 17:33:43 crc kubenswrapper[4918]: E0319 17:33:43.587324 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:33:54 crc kubenswrapper[4918]: I0319 17:33:54.586357 4918 scope.go:117] "RemoveContainer" containerID="63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783" Mar 19 17:33:54 crc kubenswrapper[4918]: E0319 17:33:54.587358 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:34:00 crc kubenswrapper[4918]: I0319 17:34:00.195502 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565694-zn7kn"] Mar 19 17:34:00 crc kubenswrapper[4918]: E0319 17:34:00.196369 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="123d818a-f108-47cd-bb91-152d65a70514" containerName="oc" Mar 19 17:34:00 crc kubenswrapper[4918]: I0319 17:34:00.196381 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="123d818a-f108-47cd-bb91-152d65a70514" containerName="oc" Mar 19 17:34:00 crc kubenswrapper[4918]: I0319 17:34:00.196689 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="123d818a-f108-47cd-bb91-152d65a70514" containerName="oc" Mar 19 17:34:00 crc kubenswrapper[4918]: I0319 17:34:00.197372 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565694-zn7kn" Mar 19 17:34:00 crc kubenswrapper[4918]: I0319 17:34:00.199167 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:34:00 crc kubenswrapper[4918]: I0319 17:34:00.199423 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:34:00 crc kubenswrapper[4918]: I0319 17:34:00.199645 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:34:00 crc kubenswrapper[4918]: I0319 17:34:00.219098 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565694-zn7kn"] Mar 19 17:34:00 crc kubenswrapper[4918]: I0319 17:34:00.328564 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlw62\" (UniqueName: \"kubernetes.io/projected/cc1c9cf5-7b95-4520-966b-def88bbe834e-kube-api-access-vlw62\") pod \"auto-csr-approver-29565694-zn7kn\" (UID: \"cc1c9cf5-7b95-4520-966b-def88bbe834e\") " pod="openshift-infra/auto-csr-approver-29565694-zn7kn" Mar 19 17:34:00 crc kubenswrapper[4918]: I0319 17:34:00.431415 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlw62\" (UniqueName: \"kubernetes.io/projected/cc1c9cf5-7b95-4520-966b-def88bbe834e-kube-api-access-vlw62\") pod \"auto-csr-approver-29565694-zn7kn\" (UID: \"cc1c9cf5-7b95-4520-966b-def88bbe834e\") " pod="openshift-infra/auto-csr-approver-29565694-zn7kn" Mar 19 17:34:00 crc kubenswrapper[4918]: I0319 17:34:00.450914 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlw62\" (UniqueName: \"kubernetes.io/projected/cc1c9cf5-7b95-4520-966b-def88bbe834e-kube-api-access-vlw62\") pod \"auto-csr-approver-29565694-zn7kn\" (UID: \"cc1c9cf5-7b95-4520-966b-def88bbe834e\") " 
pod="openshift-infra/auto-csr-approver-29565694-zn7kn" Mar 19 17:34:00 crc kubenswrapper[4918]: I0319 17:34:00.514298 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565694-zn7kn" Mar 19 17:34:00 crc kubenswrapper[4918]: I0319 17:34:00.969157 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565694-zn7kn"] Mar 19 17:34:01 crc kubenswrapper[4918]: I0319 17:34:01.233329 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565694-zn7kn" event={"ID":"cc1c9cf5-7b95-4520-966b-def88bbe834e","Type":"ContainerStarted","Data":"2a0760a94b2db79c9693175eeef01c6e6c6ae125ac2753e28a919ce2a6d208eb"} Mar 19 17:34:02 crc kubenswrapper[4918]: I0319 17:34:02.245430 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565694-zn7kn" event={"ID":"cc1c9cf5-7b95-4520-966b-def88bbe834e","Type":"ContainerStarted","Data":"f7660c1f110672d36792a552a2311c65d06c017664982e983e892828a1a2e22b"} Mar 19 17:34:02 crc kubenswrapper[4918]: I0319 17:34:02.277300 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565694-zn7kn" podStartSLOduration=1.38165124 podStartE2EDuration="2.277273578s" podCreationTimestamp="2026-03-19 17:34:00 +0000 UTC" firstStartedPulling="2026-03-19 17:34:00.973634375 +0000 UTC m=+3253.095833623" lastFinishedPulling="2026-03-19 17:34:01.869256703 +0000 UTC m=+3253.991455961" observedRunningTime="2026-03-19 17:34:02.262336129 +0000 UTC m=+3254.384535367" watchObservedRunningTime="2026-03-19 17:34:02.277273578 +0000 UTC m=+3254.399472836" Mar 19 17:34:03 crc kubenswrapper[4918]: I0319 17:34:03.258284 4918 generic.go:334] "Generic (PLEG): container finished" podID="cc1c9cf5-7b95-4520-966b-def88bbe834e" containerID="f7660c1f110672d36792a552a2311c65d06c017664982e983e892828a1a2e22b" exitCode=0 Mar 19 17:34:03 crc 
kubenswrapper[4918]: I0319 17:34:03.258430 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565694-zn7kn" event={"ID":"cc1c9cf5-7b95-4520-966b-def88bbe834e","Type":"ContainerDied","Data":"f7660c1f110672d36792a552a2311c65d06c017664982e983e892828a1a2e22b"} Mar 19 17:34:04 crc kubenswrapper[4918]: I0319 17:34:04.715213 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565694-zn7kn" Mar 19 17:34:04 crc kubenswrapper[4918]: I0319 17:34:04.838864 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlw62\" (UniqueName: \"kubernetes.io/projected/cc1c9cf5-7b95-4520-966b-def88bbe834e-kube-api-access-vlw62\") pod \"cc1c9cf5-7b95-4520-966b-def88bbe834e\" (UID: \"cc1c9cf5-7b95-4520-966b-def88bbe834e\") " Mar 19 17:34:04 crc kubenswrapper[4918]: I0319 17:34:04.846335 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc1c9cf5-7b95-4520-966b-def88bbe834e-kube-api-access-vlw62" (OuterVolumeSpecName: "kube-api-access-vlw62") pod "cc1c9cf5-7b95-4520-966b-def88bbe834e" (UID: "cc1c9cf5-7b95-4520-966b-def88bbe834e"). InnerVolumeSpecName "kube-api-access-vlw62". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:34:04 crc kubenswrapper[4918]: I0319 17:34:04.941999 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlw62\" (UniqueName: \"kubernetes.io/projected/cc1c9cf5-7b95-4520-966b-def88bbe834e-kube-api-access-vlw62\") on node \"crc\" DevicePath \"\"" Mar 19 17:34:05 crc kubenswrapper[4918]: I0319 17:34:05.287753 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565694-zn7kn" event={"ID":"cc1c9cf5-7b95-4520-966b-def88bbe834e","Type":"ContainerDied","Data":"2a0760a94b2db79c9693175eeef01c6e6c6ae125ac2753e28a919ce2a6d208eb"} Mar 19 17:34:05 crc kubenswrapper[4918]: I0319 17:34:05.287801 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a0760a94b2db79c9693175eeef01c6e6c6ae125ac2753e28a919ce2a6d208eb" Mar 19 17:34:05 crc kubenswrapper[4918]: I0319 17:34:05.287816 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565694-zn7kn" Mar 19 17:34:05 crc kubenswrapper[4918]: I0319 17:34:05.366509 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565688-nwq7n"] Mar 19 17:34:05 crc kubenswrapper[4918]: I0319 17:34:05.377472 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565688-nwq7n"] Mar 19 17:34:06 crc kubenswrapper[4918]: I0319 17:34:06.605860 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a44fd43-7d87-4a43-af64-a1395c883e33" path="/var/lib/kubelet/pods/6a44fd43-7d87-4a43-af64-a1395c883e33/volumes" Mar 19 17:34:06 crc kubenswrapper[4918]: I0319 17:34:06.661432 4918 scope.go:117] "RemoveContainer" containerID="1c545dd37513c59a679492fcafdb48210519e043e6f091078003aa42d906f76e" Mar 19 17:34:08 crc kubenswrapper[4918]: I0319 17:34:08.587689 4918 scope.go:117] "RemoveContainer" 
containerID="63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783" Mar 19 17:34:08 crc kubenswrapper[4918]: E0319 17:34:08.590059 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:34:17 crc kubenswrapper[4918]: I0319 17:34:17.177635 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rv5gc"] Mar 19 17:34:17 crc kubenswrapper[4918]: E0319 17:34:17.178471 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc1c9cf5-7b95-4520-966b-def88bbe834e" containerName="oc" Mar 19 17:34:17 crc kubenswrapper[4918]: I0319 17:34:17.178487 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc1c9cf5-7b95-4520-966b-def88bbe834e" containerName="oc" Mar 19 17:34:17 crc kubenswrapper[4918]: I0319 17:34:17.178852 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc1c9cf5-7b95-4520-966b-def88bbe834e" containerName="oc" Mar 19 17:34:17 crc kubenswrapper[4918]: I0319 17:34:17.182295 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rv5gc" Mar 19 17:34:17 crc kubenswrapper[4918]: I0319 17:34:17.189291 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rv5gc"] Mar 19 17:34:17 crc kubenswrapper[4918]: I0319 17:34:17.276417 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trhzg\" (UniqueName: \"kubernetes.io/projected/6cd64f2a-ce0f-48ae-a214-21654e1c715a-kube-api-access-trhzg\") pod \"community-operators-rv5gc\" (UID: \"6cd64f2a-ce0f-48ae-a214-21654e1c715a\") " pod="openshift-marketplace/community-operators-rv5gc" Mar 19 17:34:17 crc kubenswrapper[4918]: I0319 17:34:17.276616 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cd64f2a-ce0f-48ae-a214-21654e1c715a-utilities\") pod \"community-operators-rv5gc\" (UID: \"6cd64f2a-ce0f-48ae-a214-21654e1c715a\") " pod="openshift-marketplace/community-operators-rv5gc" Mar 19 17:34:17 crc kubenswrapper[4918]: I0319 17:34:17.276694 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cd64f2a-ce0f-48ae-a214-21654e1c715a-catalog-content\") pod \"community-operators-rv5gc\" (UID: \"6cd64f2a-ce0f-48ae-a214-21654e1c715a\") " pod="openshift-marketplace/community-operators-rv5gc" Mar 19 17:34:17 crc kubenswrapper[4918]: I0319 17:34:17.378610 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cd64f2a-ce0f-48ae-a214-21654e1c715a-utilities\") pod \"community-operators-rv5gc\" (UID: \"6cd64f2a-ce0f-48ae-a214-21654e1c715a\") " pod="openshift-marketplace/community-operators-rv5gc" Mar 19 17:34:17 crc kubenswrapper[4918]: I0319 17:34:17.378759 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cd64f2a-ce0f-48ae-a214-21654e1c715a-catalog-content\") pod \"community-operators-rv5gc\" (UID: \"6cd64f2a-ce0f-48ae-a214-21654e1c715a\") " pod="openshift-marketplace/community-operators-rv5gc" Mar 19 17:34:17 crc kubenswrapper[4918]: I0319 17:34:17.378924 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trhzg\" (UniqueName: \"kubernetes.io/projected/6cd64f2a-ce0f-48ae-a214-21654e1c715a-kube-api-access-trhzg\") pod \"community-operators-rv5gc\" (UID: \"6cd64f2a-ce0f-48ae-a214-21654e1c715a\") " pod="openshift-marketplace/community-operators-rv5gc" Mar 19 17:34:17 crc kubenswrapper[4918]: I0319 17:34:17.379169 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cd64f2a-ce0f-48ae-a214-21654e1c715a-utilities\") pod \"community-operators-rv5gc\" (UID: \"6cd64f2a-ce0f-48ae-a214-21654e1c715a\") " pod="openshift-marketplace/community-operators-rv5gc" Mar 19 17:34:17 crc kubenswrapper[4918]: I0319 17:34:17.379575 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cd64f2a-ce0f-48ae-a214-21654e1c715a-catalog-content\") pod \"community-operators-rv5gc\" (UID: \"6cd64f2a-ce0f-48ae-a214-21654e1c715a\") " pod="openshift-marketplace/community-operators-rv5gc" Mar 19 17:34:17 crc kubenswrapper[4918]: I0319 17:34:17.398095 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trhzg\" (UniqueName: \"kubernetes.io/projected/6cd64f2a-ce0f-48ae-a214-21654e1c715a-kube-api-access-trhzg\") pod \"community-operators-rv5gc\" (UID: \"6cd64f2a-ce0f-48ae-a214-21654e1c715a\") " pod="openshift-marketplace/community-operators-rv5gc" Mar 19 17:34:17 crc kubenswrapper[4918]: I0319 17:34:17.508248 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rv5gc" Mar 19 17:34:18 crc kubenswrapper[4918]: I0319 17:34:18.067950 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rv5gc"] Mar 19 17:34:18 crc kubenswrapper[4918]: W0319 17:34:18.074941 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cd64f2a_ce0f_48ae_a214_21654e1c715a.slice/crio-7444289cc2922bbc3c4ab9f888351319656196a869cfe5b3f5be3e93296341dc WatchSource:0}: Error finding container 7444289cc2922bbc3c4ab9f888351319656196a869cfe5b3f5be3e93296341dc: Status 404 returned error can't find the container with id 7444289cc2922bbc3c4ab9f888351319656196a869cfe5b3f5be3e93296341dc Mar 19 17:34:18 crc kubenswrapper[4918]: I0319 17:34:18.445882 4918 generic.go:334] "Generic (PLEG): container finished" podID="6cd64f2a-ce0f-48ae-a214-21654e1c715a" containerID="d99b5bdc1f5b5ed685cfe7ea6fded782cdd9d1e72959c03c1690177f1abe59f3" exitCode=0 Mar 19 17:34:18 crc kubenswrapper[4918]: I0319 17:34:18.446140 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rv5gc" event={"ID":"6cd64f2a-ce0f-48ae-a214-21654e1c715a","Type":"ContainerDied","Data":"d99b5bdc1f5b5ed685cfe7ea6fded782cdd9d1e72959c03c1690177f1abe59f3"} Mar 19 17:34:18 crc kubenswrapper[4918]: I0319 17:34:18.446912 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rv5gc" event={"ID":"6cd64f2a-ce0f-48ae-a214-21654e1c715a","Type":"ContainerStarted","Data":"7444289cc2922bbc3c4ab9f888351319656196a869cfe5b3f5be3e93296341dc"} Mar 19 17:34:19 crc kubenswrapper[4918]: I0319 17:34:19.462017 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rv5gc" 
event={"ID":"6cd64f2a-ce0f-48ae-a214-21654e1c715a","Type":"ContainerStarted","Data":"d8a72103274bee5e4cff2ca6749a1409ff04640c9afd6e2fc0d5db8a22a80b49"} Mar 19 17:34:20 crc kubenswrapper[4918]: I0319 17:34:20.478684 4918 generic.go:334] "Generic (PLEG): container finished" podID="6cd64f2a-ce0f-48ae-a214-21654e1c715a" containerID="d8a72103274bee5e4cff2ca6749a1409ff04640c9afd6e2fc0d5db8a22a80b49" exitCode=0 Mar 19 17:34:20 crc kubenswrapper[4918]: I0319 17:34:20.478760 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rv5gc" event={"ID":"6cd64f2a-ce0f-48ae-a214-21654e1c715a","Type":"ContainerDied","Data":"d8a72103274bee5e4cff2ca6749a1409ff04640c9afd6e2fc0d5db8a22a80b49"} Mar 19 17:34:21 crc kubenswrapper[4918]: I0319 17:34:21.488399 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rv5gc" event={"ID":"6cd64f2a-ce0f-48ae-a214-21654e1c715a","Type":"ContainerStarted","Data":"2017e5e6fba0d58bf2f7eef0a43c0f10c99dd19e41b1eb6ea48dbe3bda5d91c1"} Mar 19 17:34:21 crc kubenswrapper[4918]: I0319 17:34:21.518286 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rv5gc" podStartSLOduration=1.7890959 podStartE2EDuration="4.518268185s" podCreationTimestamp="2026-03-19 17:34:17 +0000 UTC" firstStartedPulling="2026-03-19 17:34:18.448155672 +0000 UTC m=+3270.570354970" lastFinishedPulling="2026-03-19 17:34:21.177328007 +0000 UTC m=+3273.299527255" observedRunningTime="2026-03-19 17:34:21.515036866 +0000 UTC m=+3273.637236114" watchObservedRunningTime="2026-03-19 17:34:21.518268185 +0000 UTC m=+3273.640467433" Mar 19 17:34:23 crc kubenswrapper[4918]: I0319 17:34:23.587044 4918 scope.go:117] "RemoveContainer" containerID="63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783" Mar 19 17:34:23 crc kubenswrapper[4918]: E0319 17:34:23.587701 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:34:27 crc kubenswrapper[4918]: I0319 17:34:27.509381 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rv5gc" Mar 19 17:34:27 crc kubenswrapper[4918]: I0319 17:34:27.511969 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rv5gc" Mar 19 17:34:27 crc kubenswrapper[4918]: I0319 17:34:27.607871 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rv5gc" Mar 19 17:34:28 crc kubenswrapper[4918]: I0319 17:34:28.646341 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rv5gc" Mar 19 17:34:28 crc kubenswrapper[4918]: I0319 17:34:28.710163 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rv5gc"] Mar 19 17:34:30 crc kubenswrapper[4918]: I0319 17:34:30.587187 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rv5gc" podUID="6cd64f2a-ce0f-48ae-a214-21654e1c715a" containerName="registry-server" containerID="cri-o://2017e5e6fba0d58bf2f7eef0a43c0f10c99dd19e41b1eb6ea48dbe3bda5d91c1" gracePeriod=2 Mar 19 17:34:31 crc kubenswrapper[4918]: I0319 17:34:31.116451 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rv5gc" Mar 19 17:34:31 crc kubenswrapper[4918]: I0319 17:34:31.232305 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cd64f2a-ce0f-48ae-a214-21654e1c715a-utilities\") pod \"6cd64f2a-ce0f-48ae-a214-21654e1c715a\" (UID: \"6cd64f2a-ce0f-48ae-a214-21654e1c715a\") " Mar 19 17:34:31 crc kubenswrapper[4918]: I0319 17:34:31.232458 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trhzg\" (UniqueName: \"kubernetes.io/projected/6cd64f2a-ce0f-48ae-a214-21654e1c715a-kube-api-access-trhzg\") pod \"6cd64f2a-ce0f-48ae-a214-21654e1c715a\" (UID: \"6cd64f2a-ce0f-48ae-a214-21654e1c715a\") " Mar 19 17:34:31 crc kubenswrapper[4918]: I0319 17:34:31.232615 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cd64f2a-ce0f-48ae-a214-21654e1c715a-catalog-content\") pod \"6cd64f2a-ce0f-48ae-a214-21654e1c715a\" (UID: \"6cd64f2a-ce0f-48ae-a214-21654e1c715a\") " Mar 19 17:34:31 crc kubenswrapper[4918]: I0319 17:34:31.235025 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cd64f2a-ce0f-48ae-a214-21654e1c715a-utilities" (OuterVolumeSpecName: "utilities") pod "6cd64f2a-ce0f-48ae-a214-21654e1c715a" (UID: "6cd64f2a-ce0f-48ae-a214-21654e1c715a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:34:31 crc kubenswrapper[4918]: I0319 17:34:31.239008 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cd64f2a-ce0f-48ae-a214-21654e1c715a-kube-api-access-trhzg" (OuterVolumeSpecName: "kube-api-access-trhzg") pod "6cd64f2a-ce0f-48ae-a214-21654e1c715a" (UID: "6cd64f2a-ce0f-48ae-a214-21654e1c715a"). InnerVolumeSpecName "kube-api-access-trhzg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:34:31 crc kubenswrapper[4918]: I0319 17:34:31.283543 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cd64f2a-ce0f-48ae-a214-21654e1c715a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6cd64f2a-ce0f-48ae-a214-21654e1c715a" (UID: "6cd64f2a-ce0f-48ae-a214-21654e1c715a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:34:31 crc kubenswrapper[4918]: I0319 17:34:31.336391 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trhzg\" (UniqueName: \"kubernetes.io/projected/6cd64f2a-ce0f-48ae-a214-21654e1c715a-kube-api-access-trhzg\") on node \"crc\" DevicePath \"\"" Mar 19 17:34:31 crc kubenswrapper[4918]: I0319 17:34:31.336683 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cd64f2a-ce0f-48ae-a214-21654e1c715a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 17:34:31 crc kubenswrapper[4918]: I0319 17:34:31.336747 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cd64f2a-ce0f-48ae-a214-21654e1c715a-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 17:34:31 crc kubenswrapper[4918]: I0319 17:34:31.602921 4918 generic.go:334] "Generic (PLEG): container finished" podID="6cd64f2a-ce0f-48ae-a214-21654e1c715a" containerID="2017e5e6fba0d58bf2f7eef0a43c0f10c99dd19e41b1eb6ea48dbe3bda5d91c1" exitCode=0 Mar 19 17:34:31 crc kubenswrapper[4918]: I0319 17:34:31.602961 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rv5gc" event={"ID":"6cd64f2a-ce0f-48ae-a214-21654e1c715a","Type":"ContainerDied","Data":"2017e5e6fba0d58bf2f7eef0a43c0f10c99dd19e41b1eb6ea48dbe3bda5d91c1"} Mar 19 17:34:31 crc kubenswrapper[4918]: I0319 17:34:31.602988 4918 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-rv5gc" event={"ID":"6cd64f2a-ce0f-48ae-a214-21654e1c715a","Type":"ContainerDied","Data":"7444289cc2922bbc3c4ab9f888351319656196a869cfe5b3f5be3e93296341dc"} Mar 19 17:34:31 crc kubenswrapper[4918]: I0319 17:34:31.603006 4918 scope.go:117] "RemoveContainer" containerID="2017e5e6fba0d58bf2f7eef0a43c0f10c99dd19e41b1eb6ea48dbe3bda5d91c1" Mar 19 17:34:31 crc kubenswrapper[4918]: I0319 17:34:31.603125 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rv5gc" Mar 19 17:34:31 crc kubenswrapper[4918]: I0319 17:34:31.635838 4918 scope.go:117] "RemoveContainer" containerID="d8a72103274bee5e4cff2ca6749a1409ff04640c9afd6e2fc0d5db8a22a80b49" Mar 19 17:34:31 crc kubenswrapper[4918]: I0319 17:34:31.663272 4918 scope.go:117] "RemoveContainer" containerID="d99b5bdc1f5b5ed685cfe7ea6fded782cdd9d1e72959c03c1690177f1abe59f3" Mar 19 17:34:31 crc kubenswrapper[4918]: I0319 17:34:31.663763 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rv5gc"] Mar 19 17:34:31 crc kubenswrapper[4918]: I0319 17:34:31.683499 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rv5gc"] Mar 19 17:34:31 crc kubenswrapper[4918]: I0319 17:34:31.733913 4918 scope.go:117] "RemoveContainer" containerID="2017e5e6fba0d58bf2f7eef0a43c0f10c99dd19e41b1eb6ea48dbe3bda5d91c1" Mar 19 17:34:31 crc kubenswrapper[4918]: E0319 17:34:31.745129 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2017e5e6fba0d58bf2f7eef0a43c0f10c99dd19e41b1eb6ea48dbe3bda5d91c1\": container with ID starting with 2017e5e6fba0d58bf2f7eef0a43c0f10c99dd19e41b1eb6ea48dbe3bda5d91c1 not found: ID does not exist" containerID="2017e5e6fba0d58bf2f7eef0a43c0f10c99dd19e41b1eb6ea48dbe3bda5d91c1" Mar 19 17:34:31 crc kubenswrapper[4918]: I0319 
17:34:31.745182 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2017e5e6fba0d58bf2f7eef0a43c0f10c99dd19e41b1eb6ea48dbe3bda5d91c1"} err="failed to get container status \"2017e5e6fba0d58bf2f7eef0a43c0f10c99dd19e41b1eb6ea48dbe3bda5d91c1\": rpc error: code = NotFound desc = could not find container \"2017e5e6fba0d58bf2f7eef0a43c0f10c99dd19e41b1eb6ea48dbe3bda5d91c1\": container with ID starting with 2017e5e6fba0d58bf2f7eef0a43c0f10c99dd19e41b1eb6ea48dbe3bda5d91c1 not found: ID does not exist" Mar 19 17:34:31 crc kubenswrapper[4918]: I0319 17:34:31.745211 4918 scope.go:117] "RemoveContainer" containerID="d8a72103274bee5e4cff2ca6749a1409ff04640c9afd6e2fc0d5db8a22a80b49" Mar 19 17:34:31 crc kubenswrapper[4918]: E0319 17:34:31.745470 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8a72103274bee5e4cff2ca6749a1409ff04640c9afd6e2fc0d5db8a22a80b49\": container with ID starting with d8a72103274bee5e4cff2ca6749a1409ff04640c9afd6e2fc0d5db8a22a80b49 not found: ID does not exist" containerID="d8a72103274bee5e4cff2ca6749a1409ff04640c9afd6e2fc0d5db8a22a80b49" Mar 19 17:34:31 crc kubenswrapper[4918]: I0319 17:34:31.745500 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8a72103274bee5e4cff2ca6749a1409ff04640c9afd6e2fc0d5db8a22a80b49"} err="failed to get container status \"d8a72103274bee5e4cff2ca6749a1409ff04640c9afd6e2fc0d5db8a22a80b49\": rpc error: code = NotFound desc = could not find container \"d8a72103274bee5e4cff2ca6749a1409ff04640c9afd6e2fc0d5db8a22a80b49\": container with ID starting with d8a72103274bee5e4cff2ca6749a1409ff04640c9afd6e2fc0d5db8a22a80b49 not found: ID does not exist" Mar 19 17:34:31 crc kubenswrapper[4918]: I0319 17:34:31.745535 4918 scope.go:117] "RemoveContainer" containerID="d99b5bdc1f5b5ed685cfe7ea6fded782cdd9d1e72959c03c1690177f1abe59f3" Mar 19 17:34:31 crc 
kubenswrapper[4918]: E0319 17:34:31.746074 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d99b5bdc1f5b5ed685cfe7ea6fded782cdd9d1e72959c03c1690177f1abe59f3\": container with ID starting with d99b5bdc1f5b5ed685cfe7ea6fded782cdd9d1e72959c03c1690177f1abe59f3 not found: ID does not exist" containerID="d99b5bdc1f5b5ed685cfe7ea6fded782cdd9d1e72959c03c1690177f1abe59f3" Mar 19 17:34:31 crc kubenswrapper[4918]: I0319 17:34:31.746102 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d99b5bdc1f5b5ed685cfe7ea6fded782cdd9d1e72959c03c1690177f1abe59f3"} err="failed to get container status \"d99b5bdc1f5b5ed685cfe7ea6fded782cdd9d1e72959c03c1690177f1abe59f3\": rpc error: code = NotFound desc = could not find container \"d99b5bdc1f5b5ed685cfe7ea6fded782cdd9d1e72959c03c1690177f1abe59f3\": container with ID starting with d99b5bdc1f5b5ed685cfe7ea6fded782cdd9d1e72959c03c1690177f1abe59f3 not found: ID does not exist" Mar 19 17:34:32 crc kubenswrapper[4918]: I0319 17:34:32.598890 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cd64f2a-ce0f-48ae-a214-21654e1c715a" path="/var/lib/kubelet/pods/6cd64f2a-ce0f-48ae-a214-21654e1c715a/volumes" Mar 19 17:34:34 crc kubenswrapper[4918]: I0319 17:34:34.586349 4918 scope.go:117] "RemoveContainer" containerID="63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783" Mar 19 17:34:35 crc kubenswrapper[4918]: I0319 17:34:35.653785 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerStarted","Data":"cdaeef138643fe84efa20e700cf0b17551d48acbf4e559a1c2caea3bc9ce1b8c"} Mar 19 17:34:51 crc kubenswrapper[4918]: I0319 17:34:51.603621 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wbz4r"] Mar 19 17:34:51 crc 
kubenswrapper[4918]: E0319 17:34:51.605891 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cd64f2a-ce0f-48ae-a214-21654e1c715a" containerName="extract-content" Mar 19 17:34:51 crc kubenswrapper[4918]: I0319 17:34:51.605994 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cd64f2a-ce0f-48ae-a214-21654e1c715a" containerName="extract-content" Mar 19 17:34:51 crc kubenswrapper[4918]: E0319 17:34:51.606073 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cd64f2a-ce0f-48ae-a214-21654e1c715a" containerName="registry-server" Mar 19 17:34:51 crc kubenswrapper[4918]: I0319 17:34:51.606135 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cd64f2a-ce0f-48ae-a214-21654e1c715a" containerName="registry-server" Mar 19 17:34:51 crc kubenswrapper[4918]: E0319 17:34:51.606207 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cd64f2a-ce0f-48ae-a214-21654e1c715a" containerName="extract-utilities" Mar 19 17:34:51 crc kubenswrapper[4918]: I0319 17:34:51.606261 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cd64f2a-ce0f-48ae-a214-21654e1c715a" containerName="extract-utilities" Mar 19 17:34:51 crc kubenswrapper[4918]: I0319 17:34:51.606563 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cd64f2a-ce0f-48ae-a214-21654e1c715a" containerName="registry-server" Mar 19 17:34:51 crc kubenswrapper[4918]: I0319 17:34:51.608323 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wbz4r" Mar 19 17:34:51 crc kubenswrapper[4918]: I0319 17:34:51.614508 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wbz4r"] Mar 19 17:34:51 crc kubenswrapper[4918]: I0319 17:34:51.766602 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16aaf839-8949-4aa4-a2f9-fd551fae1e85-catalog-content\") pod \"redhat-marketplace-wbz4r\" (UID: \"16aaf839-8949-4aa4-a2f9-fd551fae1e85\") " pod="openshift-marketplace/redhat-marketplace-wbz4r" Mar 19 17:34:51 crc kubenswrapper[4918]: I0319 17:34:51.767196 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16aaf839-8949-4aa4-a2f9-fd551fae1e85-utilities\") pod \"redhat-marketplace-wbz4r\" (UID: \"16aaf839-8949-4aa4-a2f9-fd551fae1e85\") " pod="openshift-marketplace/redhat-marketplace-wbz4r" Mar 19 17:34:51 crc kubenswrapper[4918]: I0319 17:34:51.767242 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt2md\" (UniqueName: \"kubernetes.io/projected/16aaf839-8949-4aa4-a2f9-fd551fae1e85-kube-api-access-dt2md\") pod \"redhat-marketplace-wbz4r\" (UID: \"16aaf839-8949-4aa4-a2f9-fd551fae1e85\") " pod="openshift-marketplace/redhat-marketplace-wbz4r" Mar 19 17:34:51 crc kubenswrapper[4918]: I0319 17:34:51.869826 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16aaf839-8949-4aa4-a2f9-fd551fae1e85-catalog-content\") pod \"redhat-marketplace-wbz4r\" (UID: \"16aaf839-8949-4aa4-a2f9-fd551fae1e85\") " pod="openshift-marketplace/redhat-marketplace-wbz4r" Mar 19 17:34:51 crc kubenswrapper[4918]: I0319 17:34:51.869933 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16aaf839-8949-4aa4-a2f9-fd551fae1e85-utilities\") pod \"redhat-marketplace-wbz4r\" (UID: \"16aaf839-8949-4aa4-a2f9-fd551fae1e85\") " pod="openshift-marketplace/redhat-marketplace-wbz4r" Mar 19 17:34:51 crc kubenswrapper[4918]: I0319 17:34:51.869961 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt2md\" (UniqueName: \"kubernetes.io/projected/16aaf839-8949-4aa4-a2f9-fd551fae1e85-kube-api-access-dt2md\") pod \"redhat-marketplace-wbz4r\" (UID: \"16aaf839-8949-4aa4-a2f9-fd551fae1e85\") " pod="openshift-marketplace/redhat-marketplace-wbz4r" Mar 19 17:34:51 crc kubenswrapper[4918]: I0319 17:34:51.870475 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16aaf839-8949-4aa4-a2f9-fd551fae1e85-catalog-content\") pod \"redhat-marketplace-wbz4r\" (UID: \"16aaf839-8949-4aa4-a2f9-fd551fae1e85\") " pod="openshift-marketplace/redhat-marketplace-wbz4r" Mar 19 17:34:51 crc kubenswrapper[4918]: I0319 17:34:51.870769 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16aaf839-8949-4aa4-a2f9-fd551fae1e85-utilities\") pod \"redhat-marketplace-wbz4r\" (UID: \"16aaf839-8949-4aa4-a2f9-fd551fae1e85\") " pod="openshift-marketplace/redhat-marketplace-wbz4r" Mar 19 17:34:51 crc kubenswrapper[4918]: I0319 17:34:51.892147 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt2md\" (UniqueName: \"kubernetes.io/projected/16aaf839-8949-4aa4-a2f9-fd551fae1e85-kube-api-access-dt2md\") pod \"redhat-marketplace-wbz4r\" (UID: \"16aaf839-8949-4aa4-a2f9-fd551fae1e85\") " pod="openshift-marketplace/redhat-marketplace-wbz4r" Mar 19 17:34:51 crc kubenswrapper[4918]: I0319 17:34:51.987104 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wbz4r" Mar 19 17:34:52 crc kubenswrapper[4918]: I0319 17:34:52.469398 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wbz4r"] Mar 19 17:34:52 crc kubenswrapper[4918]: I0319 17:34:52.820288 4918 generic.go:334] "Generic (PLEG): container finished" podID="16aaf839-8949-4aa4-a2f9-fd551fae1e85" containerID="972878a651e6d868edb96745b4f656b238359cd92c2b08356be899a05ea87cd3" exitCode=0 Mar 19 17:34:52 crc kubenswrapper[4918]: I0319 17:34:52.820340 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbz4r" event={"ID":"16aaf839-8949-4aa4-a2f9-fd551fae1e85","Type":"ContainerDied","Data":"972878a651e6d868edb96745b4f656b238359cd92c2b08356be899a05ea87cd3"} Mar 19 17:34:52 crc kubenswrapper[4918]: I0319 17:34:52.820366 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbz4r" event={"ID":"16aaf839-8949-4aa4-a2f9-fd551fae1e85","Type":"ContainerStarted","Data":"b143eba6431f03384bbf0d12c1be6b8caec743806e9cdd94a52b5957d43d4973"} Mar 19 17:34:55 crc kubenswrapper[4918]: E0319 17:34:55.773034 4918 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16aaf839_8949_4aa4_a2f9_fd551fae1e85.slice/crio-conmon-15d466767b3259e273af83b2a1ab15ccd06b8349e780995c3f1ea4c939e9b1cf.scope\": RecentStats: unable to find data in memory cache]" Mar 19 17:34:55 crc kubenswrapper[4918]: I0319 17:34:55.848815 4918 generic.go:334] "Generic (PLEG): container finished" podID="16aaf839-8949-4aa4-a2f9-fd551fae1e85" containerID="15d466767b3259e273af83b2a1ab15ccd06b8349e780995c3f1ea4c939e9b1cf" exitCode=0 Mar 19 17:34:55 crc kubenswrapper[4918]: I0319 17:34:55.848869 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbz4r" 
event={"ID":"16aaf839-8949-4aa4-a2f9-fd551fae1e85","Type":"ContainerDied","Data":"15d466767b3259e273af83b2a1ab15ccd06b8349e780995c3f1ea4c939e9b1cf"} Mar 19 17:34:56 crc kubenswrapper[4918]: I0319 17:34:56.865508 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbz4r" event={"ID":"16aaf839-8949-4aa4-a2f9-fd551fae1e85","Type":"ContainerStarted","Data":"0db648f2dfac6c380f17b09f01458602d41269136f3ab9718b71343d873e678a"} Mar 19 17:34:56 crc kubenswrapper[4918]: I0319 17:34:56.885131 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wbz4r" podStartSLOduration=2.23653392 podStartE2EDuration="5.885113416s" podCreationTimestamp="2026-03-19 17:34:51 +0000 UTC" firstStartedPulling="2026-03-19 17:34:52.822638105 +0000 UTC m=+3304.944837353" lastFinishedPulling="2026-03-19 17:34:56.471217601 +0000 UTC m=+3308.593416849" observedRunningTime="2026-03-19 17:34:56.884227493 +0000 UTC m=+3309.006426751" watchObservedRunningTime="2026-03-19 17:34:56.885113416 +0000 UTC m=+3309.007312664" Mar 19 17:35:01 crc kubenswrapper[4918]: I0319 17:35:01.989129 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wbz4r" Mar 19 17:35:01 crc kubenswrapper[4918]: I0319 17:35:01.990896 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wbz4r" Mar 19 17:35:02 crc kubenswrapper[4918]: I0319 17:35:02.053082 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wbz4r" Mar 19 17:35:02 crc kubenswrapper[4918]: I0319 17:35:02.979012 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wbz4r" Mar 19 17:35:03 crc kubenswrapper[4918]: I0319 17:35:03.044189 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-wbz4r"] Mar 19 17:35:04 crc kubenswrapper[4918]: I0319 17:35:04.952337 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wbz4r" podUID="16aaf839-8949-4aa4-a2f9-fd551fae1e85" containerName="registry-server" containerID="cri-o://0db648f2dfac6c380f17b09f01458602d41269136f3ab9718b71343d873e678a" gracePeriod=2 Mar 19 17:35:05 crc kubenswrapper[4918]: I0319 17:35:05.530245 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wbz4r" Mar 19 17:35:05 crc kubenswrapper[4918]: I0319 17:35:05.671275 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dt2md\" (UniqueName: \"kubernetes.io/projected/16aaf839-8949-4aa4-a2f9-fd551fae1e85-kube-api-access-dt2md\") pod \"16aaf839-8949-4aa4-a2f9-fd551fae1e85\" (UID: \"16aaf839-8949-4aa4-a2f9-fd551fae1e85\") " Mar 19 17:35:05 crc kubenswrapper[4918]: I0319 17:35:05.671495 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16aaf839-8949-4aa4-a2f9-fd551fae1e85-utilities\") pod \"16aaf839-8949-4aa4-a2f9-fd551fae1e85\" (UID: \"16aaf839-8949-4aa4-a2f9-fd551fae1e85\") " Mar 19 17:35:05 crc kubenswrapper[4918]: I0319 17:35:05.671560 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16aaf839-8949-4aa4-a2f9-fd551fae1e85-catalog-content\") pod \"16aaf839-8949-4aa4-a2f9-fd551fae1e85\" (UID: \"16aaf839-8949-4aa4-a2f9-fd551fae1e85\") " Mar 19 17:35:05 crc kubenswrapper[4918]: I0319 17:35:05.672329 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16aaf839-8949-4aa4-a2f9-fd551fae1e85-utilities" (OuterVolumeSpecName: "utilities") pod "16aaf839-8949-4aa4-a2f9-fd551fae1e85" (UID: 
"16aaf839-8949-4aa4-a2f9-fd551fae1e85"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:35:05 crc kubenswrapper[4918]: I0319 17:35:05.681389 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16aaf839-8949-4aa4-a2f9-fd551fae1e85-kube-api-access-dt2md" (OuterVolumeSpecName: "kube-api-access-dt2md") pod "16aaf839-8949-4aa4-a2f9-fd551fae1e85" (UID: "16aaf839-8949-4aa4-a2f9-fd551fae1e85"). InnerVolumeSpecName "kube-api-access-dt2md". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:35:05 crc kubenswrapper[4918]: I0319 17:35:05.773847 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16aaf839-8949-4aa4-a2f9-fd551fae1e85-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 17:35:05 crc kubenswrapper[4918]: I0319 17:35:05.773876 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dt2md\" (UniqueName: \"kubernetes.io/projected/16aaf839-8949-4aa4-a2f9-fd551fae1e85-kube-api-access-dt2md\") on node \"crc\" DevicePath \"\"" Mar 19 17:35:05 crc kubenswrapper[4918]: I0319 17:35:05.824857 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16aaf839-8949-4aa4-a2f9-fd551fae1e85-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16aaf839-8949-4aa4-a2f9-fd551fae1e85" (UID: "16aaf839-8949-4aa4-a2f9-fd551fae1e85"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:35:05 crc kubenswrapper[4918]: I0319 17:35:05.875443 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16aaf839-8949-4aa4-a2f9-fd551fae1e85-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 17:35:05 crc kubenswrapper[4918]: I0319 17:35:05.961211 4918 generic.go:334] "Generic (PLEG): container finished" podID="16aaf839-8949-4aa4-a2f9-fd551fae1e85" containerID="0db648f2dfac6c380f17b09f01458602d41269136f3ab9718b71343d873e678a" exitCode=0 Mar 19 17:35:05 crc kubenswrapper[4918]: I0319 17:35:05.961251 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbz4r" event={"ID":"16aaf839-8949-4aa4-a2f9-fd551fae1e85","Type":"ContainerDied","Data":"0db648f2dfac6c380f17b09f01458602d41269136f3ab9718b71343d873e678a"} Mar 19 17:35:05 crc kubenswrapper[4918]: I0319 17:35:05.961275 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbz4r" event={"ID":"16aaf839-8949-4aa4-a2f9-fd551fae1e85","Type":"ContainerDied","Data":"b143eba6431f03384bbf0d12c1be6b8caec743806e9cdd94a52b5957d43d4973"} Mar 19 17:35:05 crc kubenswrapper[4918]: I0319 17:35:05.961290 4918 scope.go:117] "RemoveContainer" containerID="0db648f2dfac6c380f17b09f01458602d41269136f3ab9718b71343d873e678a" Mar 19 17:35:05 crc kubenswrapper[4918]: I0319 17:35:05.961407 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wbz4r" Mar 19 17:35:06 crc kubenswrapper[4918]: I0319 17:35:06.012104 4918 scope.go:117] "RemoveContainer" containerID="15d466767b3259e273af83b2a1ab15ccd06b8349e780995c3f1ea4c939e9b1cf" Mar 19 17:35:06 crc kubenswrapper[4918]: I0319 17:35:06.020660 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wbz4r"] Mar 19 17:35:06 crc kubenswrapper[4918]: E0319 17:35:06.025313 4918 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16aaf839_8949_4aa4_a2f9_fd551fae1e85.slice\": RecentStats: unable to find data in memory cache]" Mar 19 17:35:06 crc kubenswrapper[4918]: I0319 17:35:06.028321 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wbz4r"] Mar 19 17:35:06 crc kubenswrapper[4918]: I0319 17:35:06.035720 4918 scope.go:117] "RemoveContainer" containerID="972878a651e6d868edb96745b4f656b238359cd92c2b08356be899a05ea87cd3" Mar 19 17:35:06 crc kubenswrapper[4918]: I0319 17:35:06.112307 4918 scope.go:117] "RemoveContainer" containerID="0db648f2dfac6c380f17b09f01458602d41269136f3ab9718b71343d873e678a" Mar 19 17:35:06 crc kubenswrapper[4918]: E0319 17:35:06.113124 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0db648f2dfac6c380f17b09f01458602d41269136f3ab9718b71343d873e678a\": container with ID starting with 0db648f2dfac6c380f17b09f01458602d41269136f3ab9718b71343d873e678a not found: ID does not exist" containerID="0db648f2dfac6c380f17b09f01458602d41269136f3ab9718b71343d873e678a" Mar 19 17:35:06 crc kubenswrapper[4918]: I0319 17:35:06.113256 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0db648f2dfac6c380f17b09f01458602d41269136f3ab9718b71343d873e678a"} 
err="failed to get container status \"0db648f2dfac6c380f17b09f01458602d41269136f3ab9718b71343d873e678a\": rpc error: code = NotFound desc = could not find container \"0db648f2dfac6c380f17b09f01458602d41269136f3ab9718b71343d873e678a\": container with ID starting with 0db648f2dfac6c380f17b09f01458602d41269136f3ab9718b71343d873e678a not found: ID does not exist" Mar 19 17:35:06 crc kubenswrapper[4918]: I0319 17:35:06.113360 4918 scope.go:117] "RemoveContainer" containerID="15d466767b3259e273af83b2a1ab15ccd06b8349e780995c3f1ea4c939e9b1cf" Mar 19 17:35:06 crc kubenswrapper[4918]: E0319 17:35:06.113978 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15d466767b3259e273af83b2a1ab15ccd06b8349e780995c3f1ea4c939e9b1cf\": container with ID starting with 15d466767b3259e273af83b2a1ab15ccd06b8349e780995c3f1ea4c939e9b1cf not found: ID does not exist" containerID="15d466767b3259e273af83b2a1ab15ccd06b8349e780995c3f1ea4c939e9b1cf" Mar 19 17:35:06 crc kubenswrapper[4918]: I0319 17:35:06.114003 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15d466767b3259e273af83b2a1ab15ccd06b8349e780995c3f1ea4c939e9b1cf"} err="failed to get container status \"15d466767b3259e273af83b2a1ab15ccd06b8349e780995c3f1ea4c939e9b1cf\": rpc error: code = NotFound desc = could not find container \"15d466767b3259e273af83b2a1ab15ccd06b8349e780995c3f1ea4c939e9b1cf\": container with ID starting with 15d466767b3259e273af83b2a1ab15ccd06b8349e780995c3f1ea4c939e9b1cf not found: ID does not exist" Mar 19 17:35:06 crc kubenswrapper[4918]: I0319 17:35:06.114019 4918 scope.go:117] "RemoveContainer" containerID="972878a651e6d868edb96745b4f656b238359cd92c2b08356be899a05ea87cd3" Mar 19 17:35:06 crc kubenswrapper[4918]: E0319 17:35:06.114285 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"972878a651e6d868edb96745b4f656b238359cd92c2b08356be899a05ea87cd3\": container with ID starting with 972878a651e6d868edb96745b4f656b238359cd92c2b08356be899a05ea87cd3 not found: ID does not exist" containerID="972878a651e6d868edb96745b4f656b238359cd92c2b08356be899a05ea87cd3" Mar 19 17:35:06 crc kubenswrapper[4918]: I0319 17:35:06.114378 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"972878a651e6d868edb96745b4f656b238359cd92c2b08356be899a05ea87cd3"} err="failed to get container status \"972878a651e6d868edb96745b4f656b238359cd92c2b08356be899a05ea87cd3\": rpc error: code = NotFound desc = could not find container \"972878a651e6d868edb96745b4f656b238359cd92c2b08356be899a05ea87cd3\": container with ID starting with 972878a651e6d868edb96745b4f656b238359cd92c2b08356be899a05ea87cd3 not found: ID does not exist" Mar 19 17:35:06 crc kubenswrapper[4918]: I0319 17:35:06.607795 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16aaf839-8949-4aa4-a2f9-fd551fae1e85" path="/var/lib/kubelet/pods/16aaf839-8949-4aa4-a2f9-fd551fae1e85/volumes" Mar 19 17:36:00 crc kubenswrapper[4918]: I0319 17:36:00.153991 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565696-ppctl"] Mar 19 17:36:00 crc kubenswrapper[4918]: E0319 17:36:00.155190 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16aaf839-8949-4aa4-a2f9-fd551fae1e85" containerName="registry-server" Mar 19 17:36:00 crc kubenswrapper[4918]: I0319 17:36:00.155212 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="16aaf839-8949-4aa4-a2f9-fd551fae1e85" containerName="registry-server" Mar 19 17:36:00 crc kubenswrapper[4918]: E0319 17:36:00.155272 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16aaf839-8949-4aa4-a2f9-fd551fae1e85" containerName="extract-content" Mar 19 17:36:00 crc kubenswrapper[4918]: I0319 17:36:00.155285 4918 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="16aaf839-8949-4aa4-a2f9-fd551fae1e85" containerName="extract-content" Mar 19 17:36:00 crc kubenswrapper[4918]: E0319 17:36:00.155312 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16aaf839-8949-4aa4-a2f9-fd551fae1e85" containerName="extract-utilities" Mar 19 17:36:00 crc kubenswrapper[4918]: I0319 17:36:00.155324 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="16aaf839-8949-4aa4-a2f9-fd551fae1e85" containerName="extract-utilities" Mar 19 17:36:00 crc kubenswrapper[4918]: I0319 17:36:00.155726 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="16aaf839-8949-4aa4-a2f9-fd551fae1e85" containerName="registry-server" Mar 19 17:36:00 crc kubenswrapper[4918]: I0319 17:36:00.156790 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565696-ppctl" Mar 19 17:36:00 crc kubenswrapper[4918]: I0319 17:36:00.158978 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:36:00 crc kubenswrapper[4918]: I0319 17:36:00.159691 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:36:00 crc kubenswrapper[4918]: I0319 17:36:00.159991 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:36:00 crc kubenswrapper[4918]: I0319 17:36:00.163586 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565696-ppctl"] Mar 19 17:36:00 crc kubenswrapper[4918]: I0319 17:36:00.276573 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnw8x\" (UniqueName: \"kubernetes.io/projected/ddf116de-e4ab-4043-818c-b904883644f2-kube-api-access-fnw8x\") pod \"auto-csr-approver-29565696-ppctl\" (UID: \"ddf116de-e4ab-4043-818c-b904883644f2\") " 
pod="openshift-infra/auto-csr-approver-29565696-ppctl" Mar 19 17:36:00 crc kubenswrapper[4918]: I0319 17:36:00.378588 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnw8x\" (UniqueName: \"kubernetes.io/projected/ddf116de-e4ab-4043-818c-b904883644f2-kube-api-access-fnw8x\") pod \"auto-csr-approver-29565696-ppctl\" (UID: \"ddf116de-e4ab-4043-818c-b904883644f2\") " pod="openshift-infra/auto-csr-approver-29565696-ppctl" Mar 19 17:36:00 crc kubenswrapper[4918]: I0319 17:36:00.397342 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnw8x\" (UniqueName: \"kubernetes.io/projected/ddf116de-e4ab-4043-818c-b904883644f2-kube-api-access-fnw8x\") pod \"auto-csr-approver-29565696-ppctl\" (UID: \"ddf116de-e4ab-4043-818c-b904883644f2\") " pod="openshift-infra/auto-csr-approver-29565696-ppctl" Mar 19 17:36:00 crc kubenswrapper[4918]: I0319 17:36:00.479620 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565696-ppctl" Mar 19 17:36:00 crc kubenswrapper[4918]: I0319 17:36:00.924723 4918 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 17:36:00 crc kubenswrapper[4918]: I0319 17:36:00.928408 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565696-ppctl"] Mar 19 17:36:01 crc kubenswrapper[4918]: I0319 17:36:01.642388 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565696-ppctl" event={"ID":"ddf116de-e4ab-4043-818c-b904883644f2","Type":"ContainerStarted","Data":"608f8a60458ee3bda34186091994df354dd232fb976ccf59ba1b2cddc5c51ada"} Mar 19 17:36:02 crc kubenswrapper[4918]: I0319 17:36:02.659631 4918 generic.go:334] "Generic (PLEG): container finished" podID="ddf116de-e4ab-4043-818c-b904883644f2" containerID="bb134bec25153a1bb8ab3784512ebc2401a9e748fbc5da9e1fff43f403fd4e36" exitCode=0 Mar 
19 17:36:02 crc kubenswrapper[4918]: I0319 17:36:02.659729 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565696-ppctl" event={"ID":"ddf116de-e4ab-4043-818c-b904883644f2","Type":"ContainerDied","Data":"bb134bec25153a1bb8ab3784512ebc2401a9e748fbc5da9e1fff43f403fd4e36"} Mar 19 17:36:04 crc kubenswrapper[4918]: I0319 17:36:04.107587 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565696-ppctl" Mar 19 17:36:04 crc kubenswrapper[4918]: I0319 17:36:04.267756 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnw8x\" (UniqueName: \"kubernetes.io/projected/ddf116de-e4ab-4043-818c-b904883644f2-kube-api-access-fnw8x\") pod \"ddf116de-e4ab-4043-818c-b904883644f2\" (UID: \"ddf116de-e4ab-4043-818c-b904883644f2\") " Mar 19 17:36:04 crc kubenswrapper[4918]: I0319 17:36:04.273971 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddf116de-e4ab-4043-818c-b904883644f2-kube-api-access-fnw8x" (OuterVolumeSpecName: "kube-api-access-fnw8x") pod "ddf116de-e4ab-4043-818c-b904883644f2" (UID: "ddf116de-e4ab-4043-818c-b904883644f2"). InnerVolumeSpecName "kube-api-access-fnw8x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:36:04 crc kubenswrapper[4918]: I0319 17:36:04.370975 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnw8x\" (UniqueName: \"kubernetes.io/projected/ddf116de-e4ab-4043-818c-b904883644f2-kube-api-access-fnw8x\") on node \"crc\" DevicePath \"\"" Mar 19 17:36:04 crc kubenswrapper[4918]: I0319 17:36:04.684716 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565696-ppctl" event={"ID":"ddf116de-e4ab-4043-818c-b904883644f2","Type":"ContainerDied","Data":"608f8a60458ee3bda34186091994df354dd232fb976ccf59ba1b2cddc5c51ada"} Mar 19 17:36:04 crc kubenswrapper[4918]: I0319 17:36:04.684787 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="608f8a60458ee3bda34186091994df354dd232fb976ccf59ba1b2cddc5c51ada" Mar 19 17:36:04 crc kubenswrapper[4918]: I0319 17:36:04.684841 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565696-ppctl" Mar 19 17:36:05 crc kubenswrapper[4918]: I0319 17:36:05.203028 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565690-2w68q"] Mar 19 17:36:05 crc kubenswrapper[4918]: I0319 17:36:05.212646 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565690-2w68q"] Mar 19 17:36:06 crc kubenswrapper[4918]: I0319 17:36:06.599929 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d626d9fc-ecbb-46ff-93ec-39cc893260a7" path="/var/lib/kubelet/pods/d626d9fc-ecbb-46ff-93ec-39cc893260a7/volumes" Mar 19 17:36:06 crc kubenswrapper[4918]: I0319 17:36:06.865993 4918 scope.go:117] "RemoveContainer" containerID="a8def4f6d570b60f5b868869a17a9ad8e1b9ddf330c3f171ee7c92b1c9e7445e" Mar 19 17:36:58 crc kubenswrapper[4918]: I0319 17:36:58.212528 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:36:58 crc kubenswrapper[4918]: I0319 17:36:58.213276 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:37:09 crc kubenswrapper[4918]: I0319 17:37:09.262147 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rnpvv"] Mar 19 17:37:09 crc kubenswrapper[4918]: E0319 17:37:09.263457 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddf116de-e4ab-4043-818c-b904883644f2" containerName="oc" Mar 19 17:37:09 crc kubenswrapper[4918]: I0319 17:37:09.263476 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddf116de-e4ab-4043-818c-b904883644f2" containerName="oc" Mar 19 17:37:09 crc kubenswrapper[4918]: I0319 17:37:09.263813 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddf116de-e4ab-4043-818c-b904883644f2" containerName="oc" Mar 19 17:37:09 crc kubenswrapper[4918]: I0319 17:37:09.278672 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rnpvv"] Mar 19 17:37:09 crc kubenswrapper[4918]: I0319 17:37:09.278773 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rnpvv" Mar 19 17:37:09 crc kubenswrapper[4918]: I0319 17:37:09.330317 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mwvr\" (UniqueName: \"kubernetes.io/projected/09d75ec5-28d1-45c4-bc97-713d00455796-kube-api-access-6mwvr\") pod \"certified-operators-rnpvv\" (UID: \"09d75ec5-28d1-45c4-bc97-713d00455796\") " pod="openshift-marketplace/certified-operators-rnpvv" Mar 19 17:37:09 crc kubenswrapper[4918]: I0319 17:37:09.331001 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09d75ec5-28d1-45c4-bc97-713d00455796-utilities\") pod \"certified-operators-rnpvv\" (UID: \"09d75ec5-28d1-45c4-bc97-713d00455796\") " pod="openshift-marketplace/certified-operators-rnpvv" Mar 19 17:37:09 crc kubenswrapper[4918]: I0319 17:37:09.331066 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09d75ec5-28d1-45c4-bc97-713d00455796-catalog-content\") pod \"certified-operators-rnpvv\" (UID: \"09d75ec5-28d1-45c4-bc97-713d00455796\") " pod="openshift-marketplace/certified-operators-rnpvv" Mar 19 17:37:09 crc kubenswrapper[4918]: I0319 17:37:09.433663 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09d75ec5-28d1-45c4-bc97-713d00455796-utilities\") pod \"certified-operators-rnpvv\" (UID: \"09d75ec5-28d1-45c4-bc97-713d00455796\") " pod="openshift-marketplace/certified-operators-rnpvv" Mar 19 17:37:09 crc kubenswrapper[4918]: I0319 17:37:09.433742 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09d75ec5-28d1-45c4-bc97-713d00455796-catalog-content\") pod 
\"certified-operators-rnpvv\" (UID: \"09d75ec5-28d1-45c4-bc97-713d00455796\") " pod="openshift-marketplace/certified-operators-rnpvv" Mar 19 17:37:09 crc kubenswrapper[4918]: I0319 17:37:09.433836 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mwvr\" (UniqueName: \"kubernetes.io/projected/09d75ec5-28d1-45c4-bc97-713d00455796-kube-api-access-6mwvr\") pod \"certified-operators-rnpvv\" (UID: \"09d75ec5-28d1-45c4-bc97-713d00455796\") " pod="openshift-marketplace/certified-operators-rnpvv" Mar 19 17:37:09 crc kubenswrapper[4918]: I0319 17:37:09.434234 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09d75ec5-28d1-45c4-bc97-713d00455796-utilities\") pod \"certified-operators-rnpvv\" (UID: \"09d75ec5-28d1-45c4-bc97-713d00455796\") " pod="openshift-marketplace/certified-operators-rnpvv" Mar 19 17:37:09 crc kubenswrapper[4918]: I0319 17:37:09.434275 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09d75ec5-28d1-45c4-bc97-713d00455796-catalog-content\") pod \"certified-operators-rnpvv\" (UID: \"09d75ec5-28d1-45c4-bc97-713d00455796\") " pod="openshift-marketplace/certified-operators-rnpvv" Mar 19 17:37:09 crc kubenswrapper[4918]: I0319 17:37:09.455480 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mwvr\" (UniqueName: \"kubernetes.io/projected/09d75ec5-28d1-45c4-bc97-713d00455796-kube-api-access-6mwvr\") pod \"certified-operators-rnpvv\" (UID: \"09d75ec5-28d1-45c4-bc97-713d00455796\") " pod="openshift-marketplace/certified-operators-rnpvv" Mar 19 17:37:09 crc kubenswrapper[4918]: I0319 17:37:09.647283 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rnpvv" Mar 19 17:37:10 crc kubenswrapper[4918]: I0319 17:37:10.135156 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rnpvv"] Mar 19 17:37:10 crc kubenswrapper[4918]: I0319 17:37:10.501906 4918 generic.go:334] "Generic (PLEG): container finished" podID="09d75ec5-28d1-45c4-bc97-713d00455796" containerID="da062de9b2ad5bf9b29d2e2625387fc50694c51054f2bc32f1520c3409b7af2f" exitCode=0 Mar 19 17:37:10 crc kubenswrapper[4918]: I0319 17:37:10.501972 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rnpvv" event={"ID":"09d75ec5-28d1-45c4-bc97-713d00455796","Type":"ContainerDied","Data":"da062de9b2ad5bf9b29d2e2625387fc50694c51054f2bc32f1520c3409b7af2f"} Mar 19 17:37:10 crc kubenswrapper[4918]: I0319 17:37:10.502011 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rnpvv" event={"ID":"09d75ec5-28d1-45c4-bc97-713d00455796","Type":"ContainerStarted","Data":"ba6fdc00801ad3f0a73579b348396b6a011daed73222d343d2a83a221f3e0399"} Mar 19 17:37:12 crc kubenswrapper[4918]: I0319 17:37:12.535614 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rnpvv" event={"ID":"09d75ec5-28d1-45c4-bc97-713d00455796","Type":"ContainerStarted","Data":"17f8222da235f810f7a60607c8bc022c92931a27233feb2985f11ad9a9cf45f3"} Mar 19 17:37:13 crc kubenswrapper[4918]: I0319 17:37:13.556511 4918 generic.go:334] "Generic (PLEG): container finished" podID="09d75ec5-28d1-45c4-bc97-713d00455796" containerID="17f8222da235f810f7a60607c8bc022c92931a27233feb2985f11ad9a9cf45f3" exitCode=0 Mar 19 17:37:13 crc kubenswrapper[4918]: I0319 17:37:13.556770 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rnpvv" 
event={"ID":"09d75ec5-28d1-45c4-bc97-713d00455796","Type":"ContainerDied","Data":"17f8222da235f810f7a60607c8bc022c92931a27233feb2985f11ad9a9cf45f3"} Mar 19 17:37:14 crc kubenswrapper[4918]: I0319 17:37:14.578658 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rnpvv" event={"ID":"09d75ec5-28d1-45c4-bc97-713d00455796","Type":"ContainerStarted","Data":"ba5704a7177ee2b59ec8ba737dbdcab19004ea24fc0749fc4d93f4f81707070c"} Mar 19 17:37:14 crc kubenswrapper[4918]: I0319 17:37:14.607504 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rnpvv" podStartSLOduration=2.123396765 podStartE2EDuration="5.607473292s" podCreationTimestamp="2026-03-19 17:37:09 +0000 UTC" firstStartedPulling="2026-03-19 17:37:10.503360537 +0000 UTC m=+3442.625559785" lastFinishedPulling="2026-03-19 17:37:13.987437024 +0000 UTC m=+3446.109636312" observedRunningTime="2026-03-19 17:37:14.602040743 +0000 UTC m=+3446.724240021" watchObservedRunningTime="2026-03-19 17:37:14.607473292 +0000 UTC m=+3446.729672560" Mar 19 17:37:19 crc kubenswrapper[4918]: I0319 17:37:19.647594 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rnpvv" Mar 19 17:37:19 crc kubenswrapper[4918]: I0319 17:37:19.648355 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rnpvv" Mar 19 17:37:19 crc kubenswrapper[4918]: I0319 17:37:19.704984 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rnpvv" Mar 19 17:37:20 crc kubenswrapper[4918]: I0319 17:37:20.707949 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rnpvv" Mar 19 17:37:20 crc kubenswrapper[4918]: I0319 17:37:20.774065 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-rnpvv"] Mar 19 17:37:22 crc kubenswrapper[4918]: I0319 17:37:22.674378 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rnpvv" podUID="09d75ec5-28d1-45c4-bc97-713d00455796" containerName="registry-server" containerID="cri-o://ba5704a7177ee2b59ec8ba737dbdcab19004ea24fc0749fc4d93f4f81707070c" gracePeriod=2 Mar 19 17:37:23 crc kubenswrapper[4918]: I0319 17:37:23.207269 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rnpvv" Mar 19 17:37:23 crc kubenswrapper[4918]: I0319 17:37:23.274417 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09d75ec5-28d1-45c4-bc97-713d00455796-utilities\") pod \"09d75ec5-28d1-45c4-bc97-713d00455796\" (UID: \"09d75ec5-28d1-45c4-bc97-713d00455796\") " Mar 19 17:37:23 crc kubenswrapper[4918]: I0319 17:37:23.274492 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09d75ec5-28d1-45c4-bc97-713d00455796-catalog-content\") pod \"09d75ec5-28d1-45c4-bc97-713d00455796\" (UID: \"09d75ec5-28d1-45c4-bc97-713d00455796\") " Mar 19 17:37:23 crc kubenswrapper[4918]: I0319 17:37:23.274603 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mwvr\" (UniqueName: \"kubernetes.io/projected/09d75ec5-28d1-45c4-bc97-713d00455796-kube-api-access-6mwvr\") pod \"09d75ec5-28d1-45c4-bc97-713d00455796\" (UID: \"09d75ec5-28d1-45c4-bc97-713d00455796\") " Mar 19 17:37:23 crc kubenswrapper[4918]: I0319 17:37:23.275372 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09d75ec5-28d1-45c4-bc97-713d00455796-utilities" (OuterVolumeSpecName: "utilities") pod "09d75ec5-28d1-45c4-bc97-713d00455796" (UID: 
"09d75ec5-28d1-45c4-bc97-713d00455796"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:37:23 crc kubenswrapper[4918]: I0319 17:37:23.284812 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09d75ec5-28d1-45c4-bc97-713d00455796-kube-api-access-6mwvr" (OuterVolumeSpecName: "kube-api-access-6mwvr") pod "09d75ec5-28d1-45c4-bc97-713d00455796" (UID: "09d75ec5-28d1-45c4-bc97-713d00455796"). InnerVolumeSpecName "kube-api-access-6mwvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:37:23 crc kubenswrapper[4918]: I0319 17:37:23.330791 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09d75ec5-28d1-45c4-bc97-713d00455796-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09d75ec5-28d1-45c4-bc97-713d00455796" (UID: "09d75ec5-28d1-45c4-bc97-713d00455796"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:37:23 crc kubenswrapper[4918]: I0319 17:37:23.377241 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09d75ec5-28d1-45c4-bc97-713d00455796-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 17:37:23 crc kubenswrapper[4918]: I0319 17:37:23.377283 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09d75ec5-28d1-45c4-bc97-713d00455796-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 17:37:23 crc kubenswrapper[4918]: I0319 17:37:23.377297 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mwvr\" (UniqueName: \"kubernetes.io/projected/09d75ec5-28d1-45c4-bc97-713d00455796-kube-api-access-6mwvr\") on node \"crc\" DevicePath \"\"" Mar 19 17:37:23 crc kubenswrapper[4918]: I0319 17:37:23.685986 4918 generic.go:334] "Generic (PLEG): container finished" 
podID="09d75ec5-28d1-45c4-bc97-713d00455796" containerID="ba5704a7177ee2b59ec8ba737dbdcab19004ea24fc0749fc4d93f4f81707070c" exitCode=0 Mar 19 17:37:23 crc kubenswrapper[4918]: I0319 17:37:23.686035 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rnpvv" event={"ID":"09d75ec5-28d1-45c4-bc97-713d00455796","Type":"ContainerDied","Data":"ba5704a7177ee2b59ec8ba737dbdcab19004ea24fc0749fc4d93f4f81707070c"} Mar 19 17:37:23 crc kubenswrapper[4918]: I0319 17:37:23.686059 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rnpvv" event={"ID":"09d75ec5-28d1-45c4-bc97-713d00455796","Type":"ContainerDied","Data":"ba6fdc00801ad3f0a73579b348396b6a011daed73222d343d2a83a221f3e0399"} Mar 19 17:37:23 crc kubenswrapper[4918]: I0319 17:37:23.686075 4918 scope.go:117] "RemoveContainer" containerID="ba5704a7177ee2b59ec8ba737dbdcab19004ea24fc0749fc4d93f4f81707070c" Mar 19 17:37:23 crc kubenswrapper[4918]: I0319 17:37:23.686197 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rnpvv" Mar 19 17:37:23 crc kubenswrapper[4918]: I0319 17:37:23.723166 4918 scope.go:117] "RemoveContainer" containerID="17f8222da235f810f7a60607c8bc022c92931a27233feb2985f11ad9a9cf45f3" Mar 19 17:37:23 crc kubenswrapper[4918]: I0319 17:37:23.728945 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rnpvv"] Mar 19 17:37:23 crc kubenswrapper[4918]: I0319 17:37:23.739407 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rnpvv"] Mar 19 17:37:23 crc kubenswrapper[4918]: I0319 17:37:23.754749 4918 scope.go:117] "RemoveContainer" containerID="da062de9b2ad5bf9b29d2e2625387fc50694c51054f2bc32f1520c3409b7af2f" Mar 19 17:37:23 crc kubenswrapper[4918]: I0319 17:37:23.843271 4918 scope.go:117] "RemoveContainer" containerID="ba5704a7177ee2b59ec8ba737dbdcab19004ea24fc0749fc4d93f4f81707070c" Mar 19 17:37:23 crc kubenswrapper[4918]: E0319 17:37:23.843700 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba5704a7177ee2b59ec8ba737dbdcab19004ea24fc0749fc4d93f4f81707070c\": container with ID starting with ba5704a7177ee2b59ec8ba737dbdcab19004ea24fc0749fc4d93f4f81707070c not found: ID does not exist" containerID="ba5704a7177ee2b59ec8ba737dbdcab19004ea24fc0749fc4d93f4f81707070c" Mar 19 17:37:23 crc kubenswrapper[4918]: I0319 17:37:23.843729 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba5704a7177ee2b59ec8ba737dbdcab19004ea24fc0749fc4d93f4f81707070c"} err="failed to get container status \"ba5704a7177ee2b59ec8ba737dbdcab19004ea24fc0749fc4d93f4f81707070c\": rpc error: code = NotFound desc = could not find container \"ba5704a7177ee2b59ec8ba737dbdcab19004ea24fc0749fc4d93f4f81707070c\": container with ID starting with ba5704a7177ee2b59ec8ba737dbdcab19004ea24fc0749fc4d93f4f81707070c not 
found: ID does not exist" Mar 19 17:37:23 crc kubenswrapper[4918]: I0319 17:37:23.843749 4918 scope.go:117] "RemoveContainer" containerID="17f8222da235f810f7a60607c8bc022c92931a27233feb2985f11ad9a9cf45f3" Mar 19 17:37:23 crc kubenswrapper[4918]: E0319 17:37:23.843989 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17f8222da235f810f7a60607c8bc022c92931a27233feb2985f11ad9a9cf45f3\": container with ID starting with 17f8222da235f810f7a60607c8bc022c92931a27233feb2985f11ad9a9cf45f3 not found: ID does not exist" containerID="17f8222da235f810f7a60607c8bc022c92931a27233feb2985f11ad9a9cf45f3" Mar 19 17:37:23 crc kubenswrapper[4918]: I0319 17:37:23.844012 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17f8222da235f810f7a60607c8bc022c92931a27233feb2985f11ad9a9cf45f3"} err="failed to get container status \"17f8222da235f810f7a60607c8bc022c92931a27233feb2985f11ad9a9cf45f3\": rpc error: code = NotFound desc = could not find container \"17f8222da235f810f7a60607c8bc022c92931a27233feb2985f11ad9a9cf45f3\": container with ID starting with 17f8222da235f810f7a60607c8bc022c92931a27233feb2985f11ad9a9cf45f3 not found: ID does not exist" Mar 19 17:37:23 crc kubenswrapper[4918]: I0319 17:37:23.844025 4918 scope.go:117] "RemoveContainer" containerID="da062de9b2ad5bf9b29d2e2625387fc50694c51054f2bc32f1520c3409b7af2f" Mar 19 17:37:23 crc kubenswrapper[4918]: E0319 17:37:23.844194 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da062de9b2ad5bf9b29d2e2625387fc50694c51054f2bc32f1520c3409b7af2f\": container with ID starting with da062de9b2ad5bf9b29d2e2625387fc50694c51054f2bc32f1520c3409b7af2f not found: ID does not exist" containerID="da062de9b2ad5bf9b29d2e2625387fc50694c51054f2bc32f1520c3409b7af2f" Mar 19 17:37:23 crc kubenswrapper[4918]: I0319 17:37:23.844208 4918 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da062de9b2ad5bf9b29d2e2625387fc50694c51054f2bc32f1520c3409b7af2f"} err="failed to get container status \"da062de9b2ad5bf9b29d2e2625387fc50694c51054f2bc32f1520c3409b7af2f\": rpc error: code = NotFound desc = could not find container \"da062de9b2ad5bf9b29d2e2625387fc50694c51054f2bc32f1520c3409b7af2f\": container with ID starting with da062de9b2ad5bf9b29d2e2625387fc50694c51054f2bc32f1520c3409b7af2f not found: ID does not exist" Mar 19 17:37:24 crc kubenswrapper[4918]: I0319 17:37:24.601394 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09d75ec5-28d1-45c4-bc97-713d00455796" path="/var/lib/kubelet/pods/09d75ec5-28d1-45c4-bc97-713d00455796/volumes" Mar 19 17:37:28 crc kubenswrapper[4918]: I0319 17:37:28.212282 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:37:28 crc kubenswrapper[4918]: I0319 17:37:28.213720 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:37:58 crc kubenswrapper[4918]: I0319 17:37:58.212194 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:37:58 crc kubenswrapper[4918]: I0319 17:37:58.212802 4918 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:37:58 crc kubenswrapper[4918]: I0319 17:37:58.212849 4918 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 17:37:58 crc kubenswrapper[4918]: I0319 17:37:58.213627 4918 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cdaeef138643fe84efa20e700cf0b17551d48acbf4e559a1c2caea3bc9ce1b8c"} pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 17:37:58 crc kubenswrapper[4918]: I0319 17:37:58.213682 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" containerID="cri-o://cdaeef138643fe84efa20e700cf0b17551d48acbf4e559a1c2caea3bc9ce1b8c" gracePeriod=600 Mar 19 17:37:59 crc kubenswrapper[4918]: I0319 17:37:59.149099 4918 generic.go:334] "Generic (PLEG): container finished" podID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerID="cdaeef138643fe84efa20e700cf0b17551d48acbf4e559a1c2caea3bc9ce1b8c" exitCode=0 Mar 19 17:37:59 crc kubenswrapper[4918]: I0319 17:37:59.149197 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerDied","Data":"cdaeef138643fe84efa20e700cf0b17551d48acbf4e559a1c2caea3bc9ce1b8c"} Mar 19 17:37:59 crc kubenswrapper[4918]: I0319 17:37:59.149732 4918 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerStarted","Data":"9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e"} Mar 19 17:37:59 crc kubenswrapper[4918]: I0319 17:37:59.149758 4918 scope.go:117] "RemoveContainer" containerID="63847871d471c0d85eb375f242ac0287a7d4b4a46309409adf6081f45e918783" Mar 19 17:38:00 crc kubenswrapper[4918]: I0319 17:38:00.170039 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565698-ggcvv"] Mar 19 17:38:00 crc kubenswrapper[4918]: E0319 17:38:00.171006 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09d75ec5-28d1-45c4-bc97-713d00455796" containerName="extract-utilities" Mar 19 17:38:00 crc kubenswrapper[4918]: I0319 17:38:00.171023 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="09d75ec5-28d1-45c4-bc97-713d00455796" containerName="extract-utilities" Mar 19 17:38:00 crc kubenswrapper[4918]: E0319 17:38:00.171051 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09d75ec5-28d1-45c4-bc97-713d00455796" containerName="registry-server" Mar 19 17:38:00 crc kubenswrapper[4918]: I0319 17:38:00.171059 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="09d75ec5-28d1-45c4-bc97-713d00455796" containerName="registry-server" Mar 19 17:38:00 crc kubenswrapper[4918]: E0319 17:38:00.171075 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09d75ec5-28d1-45c4-bc97-713d00455796" containerName="extract-content" Mar 19 17:38:00 crc kubenswrapper[4918]: I0319 17:38:00.171084 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="09d75ec5-28d1-45c4-bc97-713d00455796" containerName="extract-content" Mar 19 17:38:00 crc kubenswrapper[4918]: I0319 17:38:00.171338 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="09d75ec5-28d1-45c4-bc97-713d00455796" containerName="registry-server" Mar 19 
17:38:00 crc kubenswrapper[4918]: I0319 17:38:00.172464 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565698-ggcvv" Mar 19 17:38:00 crc kubenswrapper[4918]: I0319 17:38:00.175177 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:38:00 crc kubenswrapper[4918]: I0319 17:38:00.175336 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:38:00 crc kubenswrapper[4918]: I0319 17:38:00.175499 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:38:00 crc kubenswrapper[4918]: I0319 17:38:00.192086 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565698-ggcvv"] Mar 19 17:38:00 crc kubenswrapper[4918]: I0319 17:38:00.238742 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2959\" (UniqueName: \"kubernetes.io/projected/6002dd4d-9aa7-4557-bd98-4e1505b04153-kube-api-access-p2959\") pod \"auto-csr-approver-29565698-ggcvv\" (UID: \"6002dd4d-9aa7-4557-bd98-4e1505b04153\") " pod="openshift-infra/auto-csr-approver-29565698-ggcvv" Mar 19 17:38:00 crc kubenswrapper[4918]: I0319 17:38:00.341606 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2959\" (UniqueName: \"kubernetes.io/projected/6002dd4d-9aa7-4557-bd98-4e1505b04153-kube-api-access-p2959\") pod \"auto-csr-approver-29565698-ggcvv\" (UID: \"6002dd4d-9aa7-4557-bd98-4e1505b04153\") " pod="openshift-infra/auto-csr-approver-29565698-ggcvv" Mar 19 17:38:00 crc kubenswrapper[4918]: I0319 17:38:00.365448 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2959\" (UniqueName: 
\"kubernetes.io/projected/6002dd4d-9aa7-4557-bd98-4e1505b04153-kube-api-access-p2959\") pod \"auto-csr-approver-29565698-ggcvv\" (UID: \"6002dd4d-9aa7-4557-bd98-4e1505b04153\") " pod="openshift-infra/auto-csr-approver-29565698-ggcvv" Mar 19 17:38:00 crc kubenswrapper[4918]: I0319 17:38:00.500919 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565698-ggcvv" Mar 19 17:38:01 crc kubenswrapper[4918]: I0319 17:38:01.003114 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565698-ggcvv"] Mar 19 17:38:01 crc kubenswrapper[4918]: I0319 17:38:01.171999 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565698-ggcvv" event={"ID":"6002dd4d-9aa7-4557-bd98-4e1505b04153","Type":"ContainerStarted","Data":"614706f4b18069930cd5eb0120429a91269ca9491ed7caafcef696183d64b39c"} Mar 19 17:38:03 crc kubenswrapper[4918]: I0319 17:38:03.196712 4918 generic.go:334] "Generic (PLEG): container finished" podID="6002dd4d-9aa7-4557-bd98-4e1505b04153" containerID="78107cb29b5030f1c8fd04cc666cd3bd96114cc22a4f3711549525d1fc5570e6" exitCode=0 Mar 19 17:38:03 crc kubenswrapper[4918]: I0319 17:38:03.196805 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565698-ggcvv" event={"ID":"6002dd4d-9aa7-4557-bd98-4e1505b04153","Type":"ContainerDied","Data":"78107cb29b5030f1c8fd04cc666cd3bd96114cc22a4f3711549525d1fc5570e6"} Mar 19 17:38:04 crc kubenswrapper[4918]: I0319 17:38:04.701239 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565698-ggcvv" Mar 19 17:38:04 crc kubenswrapper[4918]: I0319 17:38:04.741051 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2959\" (UniqueName: \"kubernetes.io/projected/6002dd4d-9aa7-4557-bd98-4e1505b04153-kube-api-access-p2959\") pod \"6002dd4d-9aa7-4557-bd98-4e1505b04153\" (UID: \"6002dd4d-9aa7-4557-bd98-4e1505b04153\") " Mar 19 17:38:04 crc kubenswrapper[4918]: I0319 17:38:04.759753 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6002dd4d-9aa7-4557-bd98-4e1505b04153-kube-api-access-p2959" (OuterVolumeSpecName: "kube-api-access-p2959") pod "6002dd4d-9aa7-4557-bd98-4e1505b04153" (UID: "6002dd4d-9aa7-4557-bd98-4e1505b04153"). InnerVolumeSpecName "kube-api-access-p2959". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:38:04 crc kubenswrapper[4918]: I0319 17:38:04.843031 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2959\" (UniqueName: \"kubernetes.io/projected/6002dd4d-9aa7-4557-bd98-4e1505b04153-kube-api-access-p2959\") on node \"crc\" DevicePath \"\"" Mar 19 17:38:05 crc kubenswrapper[4918]: I0319 17:38:05.219464 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565698-ggcvv" event={"ID":"6002dd4d-9aa7-4557-bd98-4e1505b04153","Type":"ContainerDied","Data":"614706f4b18069930cd5eb0120429a91269ca9491ed7caafcef696183d64b39c"} Mar 19 17:38:05 crc kubenswrapper[4918]: I0319 17:38:05.219501 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="614706f4b18069930cd5eb0120429a91269ca9491ed7caafcef696183d64b39c" Mar 19 17:38:05 crc kubenswrapper[4918]: I0319 17:38:05.219556 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565698-ggcvv" Mar 19 17:38:05 crc kubenswrapper[4918]: I0319 17:38:05.823474 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565692-hg9vg"] Mar 19 17:38:05 crc kubenswrapper[4918]: I0319 17:38:05.832195 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565692-hg9vg"] Mar 19 17:38:06 crc kubenswrapper[4918]: I0319 17:38:06.605011 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="123d818a-f108-47cd-bb91-152d65a70514" path="/var/lib/kubelet/pods/123d818a-f108-47cd-bb91-152d65a70514/volumes" Mar 19 17:38:07 crc kubenswrapper[4918]: I0319 17:38:07.015424 4918 scope.go:117] "RemoveContainer" containerID="0a92ba7fe9ee9786481bdeb87bebc028700dc93cd0913e462e7e6779e8703039" Mar 19 17:39:58 crc kubenswrapper[4918]: I0319 17:39:58.211926 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:39:58 crc kubenswrapper[4918]: I0319 17:39:58.212602 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:40:00 crc kubenswrapper[4918]: I0319 17:40:00.161962 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565700-kvkq9"] Mar 19 17:40:00 crc kubenswrapper[4918]: E0319 17:40:00.162466 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6002dd4d-9aa7-4557-bd98-4e1505b04153" containerName="oc" Mar 19 17:40:00 crc 
kubenswrapper[4918]: I0319 17:40:00.162484 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="6002dd4d-9aa7-4557-bd98-4e1505b04153" containerName="oc" Mar 19 17:40:00 crc kubenswrapper[4918]: I0319 17:40:00.162765 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="6002dd4d-9aa7-4557-bd98-4e1505b04153" containerName="oc" Mar 19 17:40:00 crc kubenswrapper[4918]: I0319 17:40:00.163732 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565700-kvkq9" Mar 19 17:40:00 crc kubenswrapper[4918]: I0319 17:40:00.168590 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:40:00 crc kubenswrapper[4918]: I0319 17:40:00.168861 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:40:00 crc kubenswrapper[4918]: I0319 17:40:00.169200 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:40:00 crc kubenswrapper[4918]: I0319 17:40:00.180937 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565700-kvkq9"] Mar 19 17:40:00 crc kubenswrapper[4918]: I0319 17:40:00.278301 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdlp7\" (UniqueName: \"kubernetes.io/projected/8356a087-24b9-42c3-8ce1-7513399baf96-kube-api-access-wdlp7\") pod \"auto-csr-approver-29565700-kvkq9\" (UID: \"8356a087-24b9-42c3-8ce1-7513399baf96\") " pod="openshift-infra/auto-csr-approver-29565700-kvkq9" Mar 19 17:40:00 crc kubenswrapper[4918]: I0319 17:40:00.381734 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdlp7\" (UniqueName: \"kubernetes.io/projected/8356a087-24b9-42c3-8ce1-7513399baf96-kube-api-access-wdlp7\") pod \"auto-csr-approver-29565700-kvkq9\" 
(UID: \"8356a087-24b9-42c3-8ce1-7513399baf96\") " pod="openshift-infra/auto-csr-approver-29565700-kvkq9" Mar 19 17:40:00 crc kubenswrapper[4918]: I0319 17:40:00.403124 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdlp7\" (UniqueName: \"kubernetes.io/projected/8356a087-24b9-42c3-8ce1-7513399baf96-kube-api-access-wdlp7\") pod \"auto-csr-approver-29565700-kvkq9\" (UID: \"8356a087-24b9-42c3-8ce1-7513399baf96\") " pod="openshift-infra/auto-csr-approver-29565700-kvkq9" Mar 19 17:40:00 crc kubenswrapper[4918]: I0319 17:40:00.486393 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565700-kvkq9" Mar 19 17:40:01 crc kubenswrapper[4918]: I0319 17:40:01.038149 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565700-kvkq9"] Mar 19 17:40:01 crc kubenswrapper[4918]: I0319 17:40:01.599435 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565700-kvkq9" event={"ID":"8356a087-24b9-42c3-8ce1-7513399baf96","Type":"ContainerStarted","Data":"009ace39947aeff438fb006390c4edb8ad15ac586c501aa146b0d726a8765b6d"} Mar 19 17:40:03 crc kubenswrapper[4918]: I0319 17:40:03.623562 4918 generic.go:334] "Generic (PLEG): container finished" podID="8356a087-24b9-42c3-8ce1-7513399baf96" containerID="e748dd98e92e05cd699650d27df5a4aa5598f3fee477657aec434ae8878a80e6" exitCode=0 Mar 19 17:40:03 crc kubenswrapper[4918]: I0319 17:40:03.623621 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565700-kvkq9" event={"ID":"8356a087-24b9-42c3-8ce1-7513399baf96","Type":"ContainerDied","Data":"e748dd98e92e05cd699650d27df5a4aa5598f3fee477657aec434ae8878a80e6"} Mar 19 17:40:05 crc kubenswrapper[4918]: I0319 17:40:05.031508 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565700-kvkq9" Mar 19 17:40:05 crc kubenswrapper[4918]: I0319 17:40:05.086262 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdlp7\" (UniqueName: \"kubernetes.io/projected/8356a087-24b9-42c3-8ce1-7513399baf96-kube-api-access-wdlp7\") pod \"8356a087-24b9-42c3-8ce1-7513399baf96\" (UID: \"8356a087-24b9-42c3-8ce1-7513399baf96\") " Mar 19 17:40:05 crc kubenswrapper[4918]: I0319 17:40:05.096945 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8356a087-24b9-42c3-8ce1-7513399baf96-kube-api-access-wdlp7" (OuterVolumeSpecName: "kube-api-access-wdlp7") pod "8356a087-24b9-42c3-8ce1-7513399baf96" (UID: "8356a087-24b9-42c3-8ce1-7513399baf96"). InnerVolumeSpecName "kube-api-access-wdlp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:40:05 crc kubenswrapper[4918]: I0319 17:40:05.190091 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdlp7\" (UniqueName: \"kubernetes.io/projected/8356a087-24b9-42c3-8ce1-7513399baf96-kube-api-access-wdlp7\") on node \"crc\" DevicePath \"\"" Mar 19 17:40:05 crc kubenswrapper[4918]: I0319 17:40:05.648430 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565700-kvkq9" event={"ID":"8356a087-24b9-42c3-8ce1-7513399baf96","Type":"ContainerDied","Data":"009ace39947aeff438fb006390c4edb8ad15ac586c501aa146b0d726a8765b6d"} Mar 19 17:40:05 crc kubenswrapper[4918]: I0319 17:40:05.648474 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="009ace39947aeff438fb006390c4edb8ad15ac586c501aa146b0d726a8765b6d" Mar 19 17:40:05 crc kubenswrapper[4918]: I0319 17:40:05.648660 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565700-kvkq9" Mar 19 17:40:06 crc kubenswrapper[4918]: I0319 17:40:06.108950 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565694-zn7kn"] Mar 19 17:40:06 crc kubenswrapper[4918]: I0319 17:40:06.118465 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565694-zn7kn"] Mar 19 17:40:06 crc kubenswrapper[4918]: I0319 17:40:06.600613 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc1c9cf5-7b95-4520-966b-def88bbe834e" path="/var/lib/kubelet/pods/cc1c9cf5-7b95-4520-966b-def88bbe834e/volumes" Mar 19 17:40:07 crc kubenswrapper[4918]: I0319 17:40:07.115218 4918 scope.go:117] "RemoveContainer" containerID="f7660c1f110672d36792a552a2311c65d06c017664982e983e892828a1a2e22b" Mar 19 17:40:28 crc kubenswrapper[4918]: I0319 17:40:28.211847 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:40:28 crc kubenswrapper[4918]: I0319 17:40:28.212622 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:40:58 crc kubenswrapper[4918]: I0319 17:40:58.211806 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:40:58 crc kubenswrapper[4918]: 
I0319 17:40:58.212407 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:40:58 crc kubenswrapper[4918]: I0319 17:40:58.212483 4918 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 17:40:58 crc kubenswrapper[4918]: I0319 17:40:58.213790 4918 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e"} pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 17:40:58 crc kubenswrapper[4918]: I0319 17:40:58.213899 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" containerID="cri-o://9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e" gracePeriod=600 Mar 19 17:40:58 crc kubenswrapper[4918]: E0319 17:40:58.341349 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:40:59 crc kubenswrapper[4918]: I0319 17:40:59.239436 4918 generic.go:334] "Generic (PLEG): container finished" 
podID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerID="9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e" exitCode=0 Mar 19 17:40:59 crc kubenswrapper[4918]: I0319 17:40:59.239556 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerDied","Data":"9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e"} Mar 19 17:40:59 crc kubenswrapper[4918]: I0319 17:40:59.239799 4918 scope.go:117] "RemoveContainer" containerID="cdaeef138643fe84efa20e700cf0b17551d48acbf4e559a1c2caea3bc9ce1b8c" Mar 19 17:40:59 crc kubenswrapper[4918]: I0319 17:40:59.241005 4918 scope.go:117] "RemoveContainer" containerID="9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e" Mar 19 17:40:59 crc kubenswrapper[4918]: E0319 17:40:59.241817 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:41:11 crc kubenswrapper[4918]: I0319 17:41:11.586518 4918 scope.go:117] "RemoveContainer" containerID="9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e" Mar 19 17:41:11 crc kubenswrapper[4918]: E0319 17:41:11.587607 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 
17:41:24 crc kubenswrapper[4918]: I0319 17:41:24.435392 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l9czq"] Mar 19 17:41:24 crc kubenswrapper[4918]: E0319 17:41:24.436230 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8356a087-24b9-42c3-8ce1-7513399baf96" containerName="oc" Mar 19 17:41:24 crc kubenswrapper[4918]: I0319 17:41:24.436244 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="8356a087-24b9-42c3-8ce1-7513399baf96" containerName="oc" Mar 19 17:41:24 crc kubenswrapper[4918]: I0319 17:41:24.436497 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="8356a087-24b9-42c3-8ce1-7513399baf96" containerName="oc" Mar 19 17:41:24 crc kubenswrapper[4918]: I0319 17:41:24.439001 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l9czq" Mar 19 17:41:24 crc kubenswrapper[4918]: I0319 17:41:24.502717 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l9czq"] Mar 19 17:41:24 crc kubenswrapper[4918]: I0319 17:41:24.565929 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad1743e2-73ec-489f-a110-4e0ce727e82a-utilities\") pod \"redhat-operators-l9czq\" (UID: \"ad1743e2-73ec-489f-a110-4e0ce727e82a\") " pod="openshift-marketplace/redhat-operators-l9czq" Mar 19 17:41:24 crc kubenswrapper[4918]: I0319 17:41:24.565982 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad1743e2-73ec-489f-a110-4e0ce727e82a-catalog-content\") pod \"redhat-operators-l9czq\" (UID: \"ad1743e2-73ec-489f-a110-4e0ce727e82a\") " pod="openshift-marketplace/redhat-operators-l9czq" Mar 19 17:41:24 crc kubenswrapper[4918]: I0319 17:41:24.566451 4918 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4g5z\" (UniqueName: \"kubernetes.io/projected/ad1743e2-73ec-489f-a110-4e0ce727e82a-kube-api-access-v4g5z\") pod \"redhat-operators-l9czq\" (UID: \"ad1743e2-73ec-489f-a110-4e0ce727e82a\") " pod="openshift-marketplace/redhat-operators-l9czq" Mar 19 17:41:24 crc kubenswrapper[4918]: I0319 17:41:24.667899 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4g5z\" (UniqueName: \"kubernetes.io/projected/ad1743e2-73ec-489f-a110-4e0ce727e82a-kube-api-access-v4g5z\") pod \"redhat-operators-l9czq\" (UID: \"ad1743e2-73ec-489f-a110-4e0ce727e82a\") " pod="openshift-marketplace/redhat-operators-l9czq" Mar 19 17:41:24 crc kubenswrapper[4918]: I0319 17:41:24.667972 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad1743e2-73ec-489f-a110-4e0ce727e82a-utilities\") pod \"redhat-operators-l9czq\" (UID: \"ad1743e2-73ec-489f-a110-4e0ce727e82a\") " pod="openshift-marketplace/redhat-operators-l9czq" Mar 19 17:41:24 crc kubenswrapper[4918]: I0319 17:41:24.668000 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad1743e2-73ec-489f-a110-4e0ce727e82a-catalog-content\") pod \"redhat-operators-l9czq\" (UID: \"ad1743e2-73ec-489f-a110-4e0ce727e82a\") " pod="openshift-marketplace/redhat-operators-l9czq" Mar 19 17:41:24 crc kubenswrapper[4918]: I0319 17:41:24.668849 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad1743e2-73ec-489f-a110-4e0ce727e82a-catalog-content\") pod \"redhat-operators-l9czq\" (UID: \"ad1743e2-73ec-489f-a110-4e0ce727e82a\") " pod="openshift-marketplace/redhat-operators-l9czq" Mar 19 17:41:24 crc kubenswrapper[4918]: I0319 17:41:24.668949 4918 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad1743e2-73ec-489f-a110-4e0ce727e82a-utilities\") pod \"redhat-operators-l9czq\" (UID: \"ad1743e2-73ec-489f-a110-4e0ce727e82a\") " pod="openshift-marketplace/redhat-operators-l9czq" Mar 19 17:41:24 crc kubenswrapper[4918]: I0319 17:41:24.693938 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4g5z\" (UniqueName: \"kubernetes.io/projected/ad1743e2-73ec-489f-a110-4e0ce727e82a-kube-api-access-v4g5z\") pod \"redhat-operators-l9czq\" (UID: \"ad1743e2-73ec-489f-a110-4e0ce727e82a\") " pod="openshift-marketplace/redhat-operators-l9czq" Mar 19 17:41:24 crc kubenswrapper[4918]: I0319 17:41:24.788322 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l9czq" Mar 19 17:41:25 crc kubenswrapper[4918]: I0319 17:41:25.297993 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l9czq"] Mar 19 17:41:25 crc kubenswrapper[4918]: I0319 17:41:25.575759 4918 generic.go:334] "Generic (PLEG): container finished" podID="ad1743e2-73ec-489f-a110-4e0ce727e82a" containerID="ffc93d39ddaaa29285305c5fe51eff1bd124624e164314c5ed4033290cfb50fa" exitCode=0 Mar 19 17:41:25 crc kubenswrapper[4918]: I0319 17:41:25.575802 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9czq" event={"ID":"ad1743e2-73ec-489f-a110-4e0ce727e82a","Type":"ContainerDied","Data":"ffc93d39ddaaa29285305c5fe51eff1bd124624e164314c5ed4033290cfb50fa"} Mar 19 17:41:25 crc kubenswrapper[4918]: I0319 17:41:25.575829 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9czq" event={"ID":"ad1743e2-73ec-489f-a110-4e0ce727e82a","Type":"ContainerStarted","Data":"eb9deed002b41f746a2e10d305e095e38db46218d57157aba497b916ff71008b"} Mar 19 17:41:25 crc kubenswrapper[4918]: I0319 17:41:25.578596 4918 provider.go:102] 
Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 17:41:26 crc kubenswrapper[4918]: I0319 17:41:26.586855 4918 scope.go:117] "RemoveContainer" containerID="9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e" Mar 19 17:41:26 crc kubenswrapper[4918]: E0319 17:41:26.587494 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:41:26 crc kubenswrapper[4918]: I0319 17:41:26.600010 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9czq" event={"ID":"ad1743e2-73ec-489f-a110-4e0ce727e82a","Type":"ContainerStarted","Data":"09606de2de80fe62dc49092b84074d9f1bee557d67949d96144fe1ae6a7da9c4"} Mar 19 17:41:30 crc kubenswrapper[4918]: I0319 17:41:30.647850 4918 generic.go:334] "Generic (PLEG): container finished" podID="ad1743e2-73ec-489f-a110-4e0ce727e82a" containerID="09606de2de80fe62dc49092b84074d9f1bee557d67949d96144fe1ae6a7da9c4" exitCode=0 Mar 19 17:41:30 crc kubenswrapper[4918]: I0319 17:41:30.647881 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9czq" event={"ID":"ad1743e2-73ec-489f-a110-4e0ce727e82a","Type":"ContainerDied","Data":"09606de2de80fe62dc49092b84074d9f1bee557d67949d96144fe1ae6a7da9c4"} Mar 19 17:41:31 crc kubenswrapper[4918]: I0319 17:41:31.661832 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9czq" event={"ID":"ad1743e2-73ec-489f-a110-4e0ce727e82a","Type":"ContainerStarted","Data":"39d57c39343507f80d8bde369239c289335fcafb76c76f0692716a4aa2dd0758"} Mar 19 17:41:31 crc 
kubenswrapper[4918]: I0319 17:41:31.687337 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l9czq" podStartSLOduration=2.180608711 podStartE2EDuration="7.68730171s" podCreationTimestamp="2026-03-19 17:41:24 +0000 UTC" firstStartedPulling="2026-03-19 17:41:25.578286128 +0000 UTC m=+3697.700485376" lastFinishedPulling="2026-03-19 17:41:31.084979117 +0000 UTC m=+3703.207178375" observedRunningTime="2026-03-19 17:41:31.686662902 +0000 UTC m=+3703.808862190" watchObservedRunningTime="2026-03-19 17:41:31.68730171 +0000 UTC m=+3703.809500988" Mar 19 17:41:34 crc kubenswrapper[4918]: I0319 17:41:34.788594 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l9czq" Mar 19 17:41:34 crc kubenswrapper[4918]: I0319 17:41:34.789084 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l9czq" Mar 19 17:41:35 crc kubenswrapper[4918]: I0319 17:41:35.838369 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l9czq" podUID="ad1743e2-73ec-489f-a110-4e0ce727e82a" containerName="registry-server" probeResult="failure" output=< Mar 19 17:41:35 crc kubenswrapper[4918]: timeout: failed to connect service ":50051" within 1s Mar 19 17:41:35 crc kubenswrapper[4918]: > Mar 19 17:41:40 crc kubenswrapper[4918]: I0319 17:41:40.586926 4918 scope.go:117] "RemoveContainer" containerID="9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e" Mar 19 17:41:40 crc kubenswrapper[4918]: E0319 17:41:40.589035 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:41:45 crc kubenswrapper[4918]: I0319 17:41:45.864690 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l9czq" podUID="ad1743e2-73ec-489f-a110-4e0ce727e82a" containerName="registry-server" probeResult="failure" output=< Mar 19 17:41:45 crc kubenswrapper[4918]: timeout: failed to connect service ":50051" within 1s Mar 19 17:41:45 crc kubenswrapper[4918]: > Mar 19 17:41:54 crc kubenswrapper[4918]: I0319 17:41:54.587664 4918 scope.go:117] "RemoveContainer" containerID="9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e" Mar 19 17:41:54 crc kubenswrapper[4918]: E0319 17:41:54.590465 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:41:54 crc kubenswrapper[4918]: I0319 17:41:54.871672 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l9czq" Mar 19 17:41:54 crc kubenswrapper[4918]: I0319 17:41:54.979023 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l9czq" Mar 19 17:41:55 crc kubenswrapper[4918]: I0319 17:41:55.637513 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l9czq"] Mar 19 17:41:55 crc kubenswrapper[4918]: I0319 17:41:55.994593 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l9czq" podUID="ad1743e2-73ec-489f-a110-4e0ce727e82a" 
containerName="registry-server" containerID="cri-o://39d57c39343507f80d8bde369239c289335fcafb76c76f0692716a4aa2dd0758" gracePeriod=2 Mar 19 17:41:56 crc kubenswrapper[4918]: I0319 17:41:56.550703 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l9czq" Mar 19 17:41:56 crc kubenswrapper[4918]: I0319 17:41:56.628027 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad1743e2-73ec-489f-a110-4e0ce727e82a-catalog-content\") pod \"ad1743e2-73ec-489f-a110-4e0ce727e82a\" (UID: \"ad1743e2-73ec-489f-a110-4e0ce727e82a\") " Mar 19 17:41:56 crc kubenswrapper[4918]: I0319 17:41:56.628301 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad1743e2-73ec-489f-a110-4e0ce727e82a-utilities\") pod \"ad1743e2-73ec-489f-a110-4e0ce727e82a\" (UID: \"ad1743e2-73ec-489f-a110-4e0ce727e82a\") " Mar 19 17:41:56 crc kubenswrapper[4918]: I0319 17:41:56.628402 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4g5z\" (UniqueName: \"kubernetes.io/projected/ad1743e2-73ec-489f-a110-4e0ce727e82a-kube-api-access-v4g5z\") pod \"ad1743e2-73ec-489f-a110-4e0ce727e82a\" (UID: \"ad1743e2-73ec-489f-a110-4e0ce727e82a\") " Mar 19 17:41:56 crc kubenswrapper[4918]: I0319 17:41:56.629372 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad1743e2-73ec-489f-a110-4e0ce727e82a-utilities" (OuterVolumeSpecName: "utilities") pod "ad1743e2-73ec-489f-a110-4e0ce727e82a" (UID: "ad1743e2-73ec-489f-a110-4e0ce727e82a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:41:56 crc kubenswrapper[4918]: I0319 17:41:56.637922 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad1743e2-73ec-489f-a110-4e0ce727e82a-kube-api-access-v4g5z" (OuterVolumeSpecName: "kube-api-access-v4g5z") pod "ad1743e2-73ec-489f-a110-4e0ce727e82a" (UID: "ad1743e2-73ec-489f-a110-4e0ce727e82a"). InnerVolumeSpecName "kube-api-access-v4g5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:41:56 crc kubenswrapper[4918]: I0319 17:41:56.731033 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad1743e2-73ec-489f-a110-4e0ce727e82a-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 17:41:56 crc kubenswrapper[4918]: I0319 17:41:56.731066 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4g5z\" (UniqueName: \"kubernetes.io/projected/ad1743e2-73ec-489f-a110-4e0ce727e82a-kube-api-access-v4g5z\") on node \"crc\" DevicePath \"\"" Mar 19 17:41:56 crc kubenswrapper[4918]: I0319 17:41:56.766031 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad1743e2-73ec-489f-a110-4e0ce727e82a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad1743e2-73ec-489f-a110-4e0ce727e82a" (UID: "ad1743e2-73ec-489f-a110-4e0ce727e82a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:41:56 crc kubenswrapper[4918]: I0319 17:41:56.832688 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad1743e2-73ec-489f-a110-4e0ce727e82a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 17:41:57 crc kubenswrapper[4918]: I0319 17:41:57.015615 4918 generic.go:334] "Generic (PLEG): container finished" podID="ad1743e2-73ec-489f-a110-4e0ce727e82a" containerID="39d57c39343507f80d8bde369239c289335fcafb76c76f0692716a4aa2dd0758" exitCode=0 Mar 19 17:41:57 crc kubenswrapper[4918]: I0319 17:41:57.015719 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l9czq" Mar 19 17:41:57 crc kubenswrapper[4918]: I0319 17:41:57.015699 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9czq" event={"ID":"ad1743e2-73ec-489f-a110-4e0ce727e82a","Type":"ContainerDied","Data":"39d57c39343507f80d8bde369239c289335fcafb76c76f0692716a4aa2dd0758"} Mar 19 17:41:57 crc kubenswrapper[4918]: I0319 17:41:57.015790 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9czq" event={"ID":"ad1743e2-73ec-489f-a110-4e0ce727e82a","Type":"ContainerDied","Data":"eb9deed002b41f746a2e10d305e095e38db46218d57157aba497b916ff71008b"} Mar 19 17:41:57 crc kubenswrapper[4918]: I0319 17:41:57.015831 4918 scope.go:117] "RemoveContainer" containerID="39d57c39343507f80d8bde369239c289335fcafb76c76f0692716a4aa2dd0758" Mar 19 17:41:57 crc kubenswrapper[4918]: I0319 17:41:57.053338 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l9czq"] Mar 19 17:41:57 crc kubenswrapper[4918]: I0319 17:41:57.056365 4918 scope.go:117] "RemoveContainer" containerID="09606de2de80fe62dc49092b84074d9f1bee557d67949d96144fe1ae6a7da9c4" Mar 19 17:41:57 crc kubenswrapper[4918]: I0319 
17:41:57.062097 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l9czq"] Mar 19 17:41:57 crc kubenswrapper[4918]: I0319 17:41:57.085166 4918 scope.go:117] "RemoveContainer" containerID="ffc93d39ddaaa29285305c5fe51eff1bd124624e164314c5ed4033290cfb50fa" Mar 19 17:41:57 crc kubenswrapper[4918]: I0319 17:41:57.132685 4918 scope.go:117] "RemoveContainer" containerID="39d57c39343507f80d8bde369239c289335fcafb76c76f0692716a4aa2dd0758" Mar 19 17:41:57 crc kubenswrapper[4918]: E0319 17:41:57.133262 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39d57c39343507f80d8bde369239c289335fcafb76c76f0692716a4aa2dd0758\": container with ID starting with 39d57c39343507f80d8bde369239c289335fcafb76c76f0692716a4aa2dd0758 not found: ID does not exist" containerID="39d57c39343507f80d8bde369239c289335fcafb76c76f0692716a4aa2dd0758" Mar 19 17:41:57 crc kubenswrapper[4918]: I0319 17:41:57.133313 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39d57c39343507f80d8bde369239c289335fcafb76c76f0692716a4aa2dd0758"} err="failed to get container status \"39d57c39343507f80d8bde369239c289335fcafb76c76f0692716a4aa2dd0758\": rpc error: code = NotFound desc = could not find container \"39d57c39343507f80d8bde369239c289335fcafb76c76f0692716a4aa2dd0758\": container with ID starting with 39d57c39343507f80d8bde369239c289335fcafb76c76f0692716a4aa2dd0758 not found: ID does not exist" Mar 19 17:41:57 crc kubenswrapper[4918]: I0319 17:41:57.133345 4918 scope.go:117] "RemoveContainer" containerID="09606de2de80fe62dc49092b84074d9f1bee557d67949d96144fe1ae6a7da9c4" Mar 19 17:41:57 crc kubenswrapper[4918]: E0319 17:41:57.134235 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09606de2de80fe62dc49092b84074d9f1bee557d67949d96144fe1ae6a7da9c4\": container with ID 
starting with 09606de2de80fe62dc49092b84074d9f1bee557d67949d96144fe1ae6a7da9c4 not found: ID does not exist" containerID="09606de2de80fe62dc49092b84074d9f1bee557d67949d96144fe1ae6a7da9c4" Mar 19 17:41:57 crc kubenswrapper[4918]: I0319 17:41:57.134308 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09606de2de80fe62dc49092b84074d9f1bee557d67949d96144fe1ae6a7da9c4"} err="failed to get container status \"09606de2de80fe62dc49092b84074d9f1bee557d67949d96144fe1ae6a7da9c4\": rpc error: code = NotFound desc = could not find container \"09606de2de80fe62dc49092b84074d9f1bee557d67949d96144fe1ae6a7da9c4\": container with ID starting with 09606de2de80fe62dc49092b84074d9f1bee557d67949d96144fe1ae6a7da9c4 not found: ID does not exist" Mar 19 17:41:57 crc kubenswrapper[4918]: I0319 17:41:57.134340 4918 scope.go:117] "RemoveContainer" containerID="ffc93d39ddaaa29285305c5fe51eff1bd124624e164314c5ed4033290cfb50fa" Mar 19 17:41:57 crc kubenswrapper[4918]: E0319 17:41:57.134872 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffc93d39ddaaa29285305c5fe51eff1bd124624e164314c5ed4033290cfb50fa\": container with ID starting with ffc93d39ddaaa29285305c5fe51eff1bd124624e164314c5ed4033290cfb50fa not found: ID does not exist" containerID="ffc93d39ddaaa29285305c5fe51eff1bd124624e164314c5ed4033290cfb50fa" Mar 19 17:41:57 crc kubenswrapper[4918]: I0319 17:41:57.134899 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc93d39ddaaa29285305c5fe51eff1bd124624e164314c5ed4033290cfb50fa"} err="failed to get container status \"ffc93d39ddaaa29285305c5fe51eff1bd124624e164314c5ed4033290cfb50fa\": rpc error: code = NotFound desc = could not find container \"ffc93d39ddaaa29285305c5fe51eff1bd124624e164314c5ed4033290cfb50fa\": container with ID starting with ffc93d39ddaaa29285305c5fe51eff1bd124624e164314c5ed4033290cfb50fa not found: 
ID does not exist" Mar 19 17:41:57 crc kubenswrapper[4918]: E0319 17:41:57.208454 4918 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad1743e2_73ec_489f_a110_4e0ce727e82a.slice/crio-eb9deed002b41f746a2e10d305e095e38db46218d57157aba497b916ff71008b\": RecentStats: unable to find data in memory cache]" Mar 19 17:41:58 crc kubenswrapper[4918]: I0319 17:41:58.610921 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad1743e2-73ec-489f-a110-4e0ce727e82a" path="/var/lib/kubelet/pods/ad1743e2-73ec-489f-a110-4e0ce727e82a/volumes" Mar 19 17:42:00 crc kubenswrapper[4918]: I0319 17:42:00.294170 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565702-bs44m"] Mar 19 17:42:00 crc kubenswrapper[4918]: E0319 17:42:00.295261 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad1743e2-73ec-489f-a110-4e0ce727e82a" containerName="registry-server" Mar 19 17:42:00 crc kubenswrapper[4918]: I0319 17:42:00.295287 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad1743e2-73ec-489f-a110-4e0ce727e82a" containerName="registry-server" Mar 19 17:42:00 crc kubenswrapper[4918]: E0319 17:42:00.295323 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad1743e2-73ec-489f-a110-4e0ce727e82a" containerName="extract-utilities" Mar 19 17:42:00 crc kubenswrapper[4918]: I0319 17:42:00.295338 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad1743e2-73ec-489f-a110-4e0ce727e82a" containerName="extract-utilities" Mar 19 17:42:00 crc kubenswrapper[4918]: E0319 17:42:00.295382 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad1743e2-73ec-489f-a110-4e0ce727e82a" containerName="extract-content" Mar 19 17:42:00 crc kubenswrapper[4918]: I0319 17:42:00.295426 4918 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ad1743e2-73ec-489f-a110-4e0ce727e82a" containerName="extract-content" Mar 19 17:42:00 crc kubenswrapper[4918]: I0319 17:42:00.296022 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad1743e2-73ec-489f-a110-4e0ce727e82a" containerName="registry-server" Mar 19 17:42:00 crc kubenswrapper[4918]: I0319 17:42:00.297431 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565702-bs44m" Mar 19 17:42:00 crc kubenswrapper[4918]: I0319 17:42:00.300889 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:42:00 crc kubenswrapper[4918]: I0319 17:42:00.302626 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:42:00 crc kubenswrapper[4918]: I0319 17:42:00.302935 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:42:00 crc kubenswrapper[4918]: I0319 17:42:00.312762 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565702-bs44m"] Mar 19 17:42:00 crc kubenswrapper[4918]: I0319 17:42:00.488651 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hf5s\" (UniqueName: \"kubernetes.io/projected/0a54cf28-1fdf-4640-a48d-16ecfe834e38-kube-api-access-9hf5s\") pod \"auto-csr-approver-29565702-bs44m\" (UID: \"0a54cf28-1fdf-4640-a48d-16ecfe834e38\") " pod="openshift-infra/auto-csr-approver-29565702-bs44m" Mar 19 17:42:00 crc kubenswrapper[4918]: I0319 17:42:00.591291 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hf5s\" (UniqueName: \"kubernetes.io/projected/0a54cf28-1fdf-4640-a48d-16ecfe834e38-kube-api-access-9hf5s\") pod \"auto-csr-approver-29565702-bs44m\" (UID: \"0a54cf28-1fdf-4640-a48d-16ecfe834e38\") " 
pod="openshift-infra/auto-csr-approver-29565702-bs44m" Mar 19 17:42:00 crc kubenswrapper[4918]: I0319 17:42:00.623846 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hf5s\" (UniqueName: \"kubernetes.io/projected/0a54cf28-1fdf-4640-a48d-16ecfe834e38-kube-api-access-9hf5s\") pod \"auto-csr-approver-29565702-bs44m\" (UID: \"0a54cf28-1fdf-4640-a48d-16ecfe834e38\") " pod="openshift-infra/auto-csr-approver-29565702-bs44m" Mar 19 17:42:00 crc kubenswrapper[4918]: I0319 17:42:00.638977 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565702-bs44m" Mar 19 17:42:01 crc kubenswrapper[4918]: I0319 17:42:01.122497 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565702-bs44m"] Mar 19 17:42:01 crc kubenswrapper[4918]: W0319 17:42:01.129858 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a54cf28_1fdf_4640_a48d_16ecfe834e38.slice/crio-39a922c15c37b1bf11a13c2702fd522ca09bab0dafaea5294776a29e7cee927d WatchSource:0}: Error finding container 39a922c15c37b1bf11a13c2702fd522ca09bab0dafaea5294776a29e7cee927d: Status 404 returned error can't find the container with id 39a922c15c37b1bf11a13c2702fd522ca09bab0dafaea5294776a29e7cee927d Mar 19 17:42:02 crc kubenswrapper[4918]: I0319 17:42:02.076600 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565702-bs44m" event={"ID":"0a54cf28-1fdf-4640-a48d-16ecfe834e38","Type":"ContainerStarted","Data":"39a922c15c37b1bf11a13c2702fd522ca09bab0dafaea5294776a29e7cee927d"} Mar 19 17:42:03 crc kubenswrapper[4918]: I0319 17:42:03.091782 4918 generic.go:334] "Generic (PLEG): container finished" podID="0a54cf28-1fdf-4640-a48d-16ecfe834e38" containerID="123733cdca667a2cd1384999e1a0daae27b9ceac075cfbfb9dd26382dad1f7a9" exitCode=0 Mar 19 17:42:03 crc kubenswrapper[4918]: 
I0319 17:42:03.092051 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565702-bs44m" event={"ID":"0a54cf28-1fdf-4640-a48d-16ecfe834e38","Type":"ContainerDied","Data":"123733cdca667a2cd1384999e1a0daae27b9ceac075cfbfb9dd26382dad1f7a9"} Mar 19 17:42:04 crc kubenswrapper[4918]: I0319 17:42:04.534764 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565702-bs44m" Mar 19 17:42:04 crc kubenswrapper[4918]: I0319 17:42:04.681191 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hf5s\" (UniqueName: \"kubernetes.io/projected/0a54cf28-1fdf-4640-a48d-16ecfe834e38-kube-api-access-9hf5s\") pod \"0a54cf28-1fdf-4640-a48d-16ecfe834e38\" (UID: \"0a54cf28-1fdf-4640-a48d-16ecfe834e38\") " Mar 19 17:42:04 crc kubenswrapper[4918]: I0319 17:42:04.686619 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a54cf28-1fdf-4640-a48d-16ecfe834e38-kube-api-access-9hf5s" (OuterVolumeSpecName: "kube-api-access-9hf5s") pod "0a54cf28-1fdf-4640-a48d-16ecfe834e38" (UID: "0a54cf28-1fdf-4640-a48d-16ecfe834e38"). InnerVolumeSpecName "kube-api-access-9hf5s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:42:04 crc kubenswrapper[4918]: I0319 17:42:04.783921 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hf5s\" (UniqueName: \"kubernetes.io/projected/0a54cf28-1fdf-4640-a48d-16ecfe834e38-kube-api-access-9hf5s\") on node \"crc\" DevicePath \"\"" Mar 19 17:42:05 crc kubenswrapper[4918]: I0319 17:42:05.115987 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565702-bs44m" event={"ID":"0a54cf28-1fdf-4640-a48d-16ecfe834e38","Type":"ContainerDied","Data":"39a922c15c37b1bf11a13c2702fd522ca09bab0dafaea5294776a29e7cee927d"} Mar 19 17:42:05 crc kubenswrapper[4918]: I0319 17:42:05.116030 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39a922c15c37b1bf11a13c2702fd522ca09bab0dafaea5294776a29e7cee927d" Mar 19 17:42:05 crc kubenswrapper[4918]: I0319 17:42:05.116064 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565702-bs44m" Mar 19 17:42:05 crc kubenswrapper[4918]: I0319 17:42:05.621509 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565696-ppctl"] Mar 19 17:42:05 crc kubenswrapper[4918]: I0319 17:42:05.633940 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565696-ppctl"] Mar 19 17:42:06 crc kubenswrapper[4918]: I0319 17:42:06.587088 4918 scope.go:117] "RemoveContainer" containerID="9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e" Mar 19 17:42:06 crc kubenswrapper[4918]: E0319 17:42:06.588186 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:42:06 crc kubenswrapper[4918]: I0319 17:42:06.603674 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddf116de-e4ab-4043-818c-b904883644f2" path="/var/lib/kubelet/pods/ddf116de-e4ab-4043-818c-b904883644f2/volumes" Mar 19 17:42:07 crc kubenswrapper[4918]: I0319 17:42:07.223476 4918 scope.go:117] "RemoveContainer" containerID="bb134bec25153a1bb8ab3784512ebc2401a9e748fbc5da9e1fff43f403fd4e36" Mar 19 17:42:18 crc kubenswrapper[4918]: I0319 17:42:18.600921 4918 scope.go:117] "RemoveContainer" containerID="9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e" Mar 19 17:42:18 crc kubenswrapper[4918]: E0319 17:42:18.601982 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:42:33 crc kubenswrapper[4918]: I0319 17:42:33.586727 4918 scope.go:117] "RemoveContainer" containerID="9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e" Mar 19 17:42:33 crc kubenswrapper[4918]: E0319 17:42:33.587576 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:42:44 crc kubenswrapper[4918]: I0319 17:42:44.594624 4918 scope.go:117] "RemoveContainer" 
containerID="9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e" Mar 19 17:42:44 crc kubenswrapper[4918]: E0319 17:42:44.596060 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:42:57 crc kubenswrapper[4918]: I0319 17:42:57.586385 4918 scope.go:117] "RemoveContainer" containerID="9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e" Mar 19 17:42:57 crc kubenswrapper[4918]: E0319 17:42:57.587351 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:43:11 crc kubenswrapper[4918]: I0319 17:43:11.587105 4918 scope.go:117] "RemoveContainer" containerID="9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e" Mar 19 17:43:11 crc kubenswrapper[4918]: E0319 17:43:11.587992 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:43:23 crc kubenswrapper[4918]: I0319 17:43:23.586966 4918 scope.go:117] 
"RemoveContainer" containerID="9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e" Mar 19 17:43:23 crc kubenswrapper[4918]: E0319 17:43:23.588586 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:43:34 crc kubenswrapper[4918]: I0319 17:43:34.587330 4918 scope.go:117] "RemoveContainer" containerID="9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e" Mar 19 17:43:34 crc kubenswrapper[4918]: E0319 17:43:34.588227 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:43:48 crc kubenswrapper[4918]: I0319 17:43:48.605287 4918 scope.go:117] "RemoveContainer" containerID="9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e" Mar 19 17:43:48 crc kubenswrapper[4918]: E0319 17:43:48.606659 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:44:00 crc kubenswrapper[4918]: I0319 17:44:00.181549 
4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565704-b999m"] Mar 19 17:44:00 crc kubenswrapper[4918]: E0319 17:44:00.182645 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a54cf28-1fdf-4640-a48d-16ecfe834e38" containerName="oc" Mar 19 17:44:00 crc kubenswrapper[4918]: I0319 17:44:00.182661 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a54cf28-1fdf-4640-a48d-16ecfe834e38" containerName="oc" Mar 19 17:44:00 crc kubenswrapper[4918]: I0319 17:44:00.182903 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a54cf28-1fdf-4640-a48d-16ecfe834e38" containerName="oc" Mar 19 17:44:00 crc kubenswrapper[4918]: I0319 17:44:00.183968 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565704-b999m" Mar 19 17:44:00 crc kubenswrapper[4918]: I0319 17:44:00.188223 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:44:00 crc kubenswrapper[4918]: I0319 17:44:00.188229 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:44:00 crc kubenswrapper[4918]: I0319 17:44:00.195933 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:44:00 crc kubenswrapper[4918]: I0319 17:44:00.199435 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565704-b999m"] Mar 19 17:44:00 crc kubenswrapper[4918]: I0319 17:44:00.239450 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9cp9\" (UniqueName: \"kubernetes.io/projected/7376d4b2-418d-4217-bd64-a46ff8030af0-kube-api-access-c9cp9\") pod \"auto-csr-approver-29565704-b999m\" (UID: \"7376d4b2-418d-4217-bd64-a46ff8030af0\") " 
pod="openshift-infra/auto-csr-approver-29565704-b999m" Mar 19 17:44:00 crc kubenswrapper[4918]: I0319 17:44:00.341012 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9cp9\" (UniqueName: \"kubernetes.io/projected/7376d4b2-418d-4217-bd64-a46ff8030af0-kube-api-access-c9cp9\") pod \"auto-csr-approver-29565704-b999m\" (UID: \"7376d4b2-418d-4217-bd64-a46ff8030af0\") " pod="openshift-infra/auto-csr-approver-29565704-b999m" Mar 19 17:44:00 crc kubenswrapper[4918]: I0319 17:44:00.367490 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9cp9\" (UniqueName: \"kubernetes.io/projected/7376d4b2-418d-4217-bd64-a46ff8030af0-kube-api-access-c9cp9\") pod \"auto-csr-approver-29565704-b999m\" (UID: \"7376d4b2-418d-4217-bd64-a46ff8030af0\") " pod="openshift-infra/auto-csr-approver-29565704-b999m" Mar 19 17:44:00 crc kubenswrapper[4918]: I0319 17:44:00.506434 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565704-b999m" Mar 19 17:44:00 crc kubenswrapper[4918]: I0319 17:44:00.977349 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565704-b999m"] Mar 19 17:44:01 crc kubenswrapper[4918]: I0319 17:44:01.529081 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565704-b999m" event={"ID":"7376d4b2-418d-4217-bd64-a46ff8030af0","Type":"ContainerStarted","Data":"e925c17689f07e116289599f4e64296e3327d14de6cdc490c5c39827929919b5"} Mar 19 17:44:02 crc kubenswrapper[4918]: I0319 17:44:02.588191 4918 scope.go:117] "RemoveContainer" containerID="9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e" Mar 19 17:44:02 crc kubenswrapper[4918]: E0319 17:44:02.589134 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:44:03 crc kubenswrapper[4918]: I0319 17:44:03.555297 4918 generic.go:334] "Generic (PLEG): container finished" podID="7376d4b2-418d-4217-bd64-a46ff8030af0" containerID="2e64c96800e312884ac45848710462a078171eb84b0a130a3aea5e4c9a5e77d9" exitCode=0 Mar 19 17:44:03 crc kubenswrapper[4918]: I0319 17:44:03.555375 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565704-b999m" event={"ID":"7376d4b2-418d-4217-bd64-a46ff8030af0","Type":"ContainerDied","Data":"2e64c96800e312884ac45848710462a078171eb84b0a130a3aea5e4c9a5e77d9"} Mar 19 17:44:05 crc kubenswrapper[4918]: I0319 17:44:05.030358 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565704-b999m" Mar 19 17:44:05 crc kubenswrapper[4918]: I0319 17:44:05.191238 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9cp9\" (UniqueName: \"kubernetes.io/projected/7376d4b2-418d-4217-bd64-a46ff8030af0-kube-api-access-c9cp9\") pod \"7376d4b2-418d-4217-bd64-a46ff8030af0\" (UID: \"7376d4b2-418d-4217-bd64-a46ff8030af0\") " Mar 19 17:44:05 crc kubenswrapper[4918]: I0319 17:44:05.197586 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7376d4b2-418d-4217-bd64-a46ff8030af0-kube-api-access-c9cp9" (OuterVolumeSpecName: "kube-api-access-c9cp9") pod "7376d4b2-418d-4217-bd64-a46ff8030af0" (UID: "7376d4b2-418d-4217-bd64-a46ff8030af0"). InnerVolumeSpecName "kube-api-access-c9cp9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:44:05 crc kubenswrapper[4918]: I0319 17:44:05.294544 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9cp9\" (UniqueName: \"kubernetes.io/projected/7376d4b2-418d-4217-bd64-a46ff8030af0-kube-api-access-c9cp9\") on node \"crc\" DevicePath \"\"" Mar 19 17:44:05 crc kubenswrapper[4918]: I0319 17:44:05.579782 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565704-b999m" event={"ID":"7376d4b2-418d-4217-bd64-a46ff8030af0","Type":"ContainerDied","Data":"e925c17689f07e116289599f4e64296e3327d14de6cdc490c5c39827929919b5"} Mar 19 17:44:05 crc kubenswrapper[4918]: I0319 17:44:05.579843 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565704-b999m" Mar 19 17:44:05 crc kubenswrapper[4918]: I0319 17:44:05.579847 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e925c17689f07e116289599f4e64296e3327d14de6cdc490c5c39827929919b5" Mar 19 17:44:06 crc kubenswrapper[4918]: I0319 17:44:06.137705 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565698-ggcvv"] Mar 19 17:44:06 crc kubenswrapper[4918]: I0319 17:44:06.145443 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565698-ggcvv"] Mar 19 17:44:06 crc kubenswrapper[4918]: I0319 17:44:06.606507 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6002dd4d-9aa7-4557-bd98-4e1505b04153" path="/var/lib/kubelet/pods/6002dd4d-9aa7-4557-bd98-4e1505b04153/volumes" Mar 19 17:44:07 crc kubenswrapper[4918]: I0319 17:44:07.361766 4918 scope.go:117] "RemoveContainer" containerID="78107cb29b5030f1c8fd04cc666cd3bd96114cc22a4f3711549525d1fc5570e6" Mar 19 17:44:13 crc kubenswrapper[4918]: I0319 17:44:13.587472 4918 scope.go:117] "RemoveContainer" 
containerID="9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e" Mar 19 17:44:13 crc kubenswrapper[4918]: E0319 17:44:13.588838 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:44:27 crc kubenswrapper[4918]: I0319 17:44:27.586736 4918 scope.go:117] "RemoveContainer" containerID="9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e" Mar 19 17:44:27 crc kubenswrapper[4918]: E0319 17:44:27.587697 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:44:39 crc kubenswrapper[4918]: I0319 17:44:39.588082 4918 scope.go:117] "RemoveContainer" containerID="9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e" Mar 19 17:44:39 crc kubenswrapper[4918]: E0319 17:44:39.588900 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:44:51 crc kubenswrapper[4918]: I0319 17:44:51.585989 4918 scope.go:117] 
"RemoveContainer" containerID="9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e" Mar 19 17:44:51 crc kubenswrapper[4918]: E0319 17:44:51.587134 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:45:00 crc kubenswrapper[4918]: I0319 17:45:00.166979 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565705-wn45l"] Mar 19 17:45:00 crc kubenswrapper[4918]: E0319 17:45:00.168418 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7376d4b2-418d-4217-bd64-a46ff8030af0" containerName="oc" Mar 19 17:45:00 crc kubenswrapper[4918]: I0319 17:45:00.168442 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="7376d4b2-418d-4217-bd64-a46ff8030af0" containerName="oc" Mar 19 17:45:00 crc kubenswrapper[4918]: I0319 17:45:00.168853 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="7376d4b2-418d-4217-bd64-a46ff8030af0" containerName="oc" Mar 19 17:45:00 crc kubenswrapper[4918]: I0319 17:45:00.170135 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565705-wn45l" Mar 19 17:45:00 crc kubenswrapper[4918]: I0319 17:45:00.173351 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 17:45:00 crc kubenswrapper[4918]: I0319 17:45:00.175090 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 17:45:00 crc kubenswrapper[4918]: I0319 17:45:00.187147 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565705-wn45l"] Mar 19 17:45:00 crc kubenswrapper[4918]: I0319 17:45:00.290871 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4svqj\" (UniqueName: \"kubernetes.io/projected/4bc2cf33-d61a-4781-9df6-8b973de2a7df-kube-api-access-4svqj\") pod \"collect-profiles-29565705-wn45l\" (UID: \"4bc2cf33-d61a-4781-9df6-8b973de2a7df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565705-wn45l" Mar 19 17:45:00 crc kubenswrapper[4918]: I0319 17:45:00.291135 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4bc2cf33-d61a-4781-9df6-8b973de2a7df-secret-volume\") pod \"collect-profiles-29565705-wn45l\" (UID: \"4bc2cf33-d61a-4781-9df6-8b973de2a7df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565705-wn45l" Mar 19 17:45:00 crc kubenswrapper[4918]: I0319 17:45:00.291375 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4bc2cf33-d61a-4781-9df6-8b973de2a7df-config-volume\") pod \"collect-profiles-29565705-wn45l\" (UID: \"4bc2cf33-d61a-4781-9df6-8b973de2a7df\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29565705-wn45l" Mar 19 17:45:00 crc kubenswrapper[4918]: I0319 17:45:00.394178 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4bc2cf33-d61a-4781-9df6-8b973de2a7df-secret-volume\") pod \"collect-profiles-29565705-wn45l\" (UID: \"4bc2cf33-d61a-4781-9df6-8b973de2a7df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565705-wn45l" Mar 19 17:45:00 crc kubenswrapper[4918]: I0319 17:45:00.394256 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4bc2cf33-d61a-4781-9df6-8b973de2a7df-config-volume\") pod \"collect-profiles-29565705-wn45l\" (UID: \"4bc2cf33-d61a-4781-9df6-8b973de2a7df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565705-wn45l" Mar 19 17:45:00 crc kubenswrapper[4918]: I0319 17:45:00.394358 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4svqj\" (UniqueName: \"kubernetes.io/projected/4bc2cf33-d61a-4781-9df6-8b973de2a7df-kube-api-access-4svqj\") pod \"collect-profiles-29565705-wn45l\" (UID: \"4bc2cf33-d61a-4781-9df6-8b973de2a7df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565705-wn45l" Mar 19 17:45:00 crc kubenswrapper[4918]: I0319 17:45:00.395405 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4bc2cf33-d61a-4781-9df6-8b973de2a7df-config-volume\") pod \"collect-profiles-29565705-wn45l\" (UID: \"4bc2cf33-d61a-4781-9df6-8b973de2a7df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565705-wn45l" Mar 19 17:45:00 crc kubenswrapper[4918]: I0319 17:45:00.404830 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/4bc2cf33-d61a-4781-9df6-8b973de2a7df-secret-volume\") pod \"collect-profiles-29565705-wn45l\" (UID: \"4bc2cf33-d61a-4781-9df6-8b973de2a7df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565705-wn45l" Mar 19 17:45:00 crc kubenswrapper[4918]: I0319 17:45:00.414684 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4svqj\" (UniqueName: \"kubernetes.io/projected/4bc2cf33-d61a-4781-9df6-8b973de2a7df-kube-api-access-4svqj\") pod \"collect-profiles-29565705-wn45l\" (UID: \"4bc2cf33-d61a-4781-9df6-8b973de2a7df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565705-wn45l" Mar 19 17:45:00 crc kubenswrapper[4918]: I0319 17:45:00.501267 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565705-wn45l" Mar 19 17:45:00 crc kubenswrapper[4918]: I0319 17:45:00.995019 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565705-wn45l"] Mar 19 17:45:01 crc kubenswrapper[4918]: I0319 17:45:01.237671 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565705-wn45l" event={"ID":"4bc2cf33-d61a-4781-9df6-8b973de2a7df","Type":"ContainerStarted","Data":"3630fca6f6d5875b8e3add961543ba3cbf773b77bc4e884d3a732accb276c2f0"} Mar 19 17:45:01 crc kubenswrapper[4918]: I0319 17:45:01.237940 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565705-wn45l" event={"ID":"4bc2cf33-d61a-4781-9df6-8b973de2a7df","Type":"ContainerStarted","Data":"b35a6184786507c0a5a7a0f4044766b37c927344c2f629f4cafb4cba3710136e"} Mar 19 17:45:01 crc kubenswrapper[4918]: I0319 17:45:01.251741 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29565705-wn45l" 
podStartSLOduration=1.251721784 podStartE2EDuration="1.251721784s" podCreationTimestamp="2026-03-19 17:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 17:45:01.251307363 +0000 UTC m=+3913.373506611" watchObservedRunningTime="2026-03-19 17:45:01.251721784 +0000 UTC m=+3913.373921042" Mar 19 17:45:02 crc kubenswrapper[4918]: I0319 17:45:02.246982 4918 generic.go:334] "Generic (PLEG): container finished" podID="4bc2cf33-d61a-4781-9df6-8b973de2a7df" containerID="3630fca6f6d5875b8e3add961543ba3cbf773b77bc4e884d3a732accb276c2f0" exitCode=0 Mar 19 17:45:02 crc kubenswrapper[4918]: I0319 17:45:02.247148 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565705-wn45l" event={"ID":"4bc2cf33-d61a-4781-9df6-8b973de2a7df","Type":"ContainerDied","Data":"3630fca6f6d5875b8e3add961543ba3cbf773b77bc4e884d3a732accb276c2f0"} Mar 19 17:45:03 crc kubenswrapper[4918]: I0319 17:45:03.587388 4918 scope.go:117] "RemoveContainer" containerID="9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e" Mar 19 17:45:03 crc kubenswrapper[4918]: E0319 17:45:03.588129 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:45:03 crc kubenswrapper[4918]: I0319 17:45:03.781081 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565705-wn45l" Mar 19 17:45:03 crc kubenswrapper[4918]: I0319 17:45:03.903158 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4svqj\" (UniqueName: \"kubernetes.io/projected/4bc2cf33-d61a-4781-9df6-8b973de2a7df-kube-api-access-4svqj\") pod \"4bc2cf33-d61a-4781-9df6-8b973de2a7df\" (UID: \"4bc2cf33-d61a-4781-9df6-8b973de2a7df\") " Mar 19 17:45:03 crc kubenswrapper[4918]: I0319 17:45:03.903237 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4bc2cf33-d61a-4781-9df6-8b973de2a7df-secret-volume\") pod \"4bc2cf33-d61a-4781-9df6-8b973de2a7df\" (UID: \"4bc2cf33-d61a-4781-9df6-8b973de2a7df\") " Mar 19 17:45:03 crc kubenswrapper[4918]: I0319 17:45:03.903379 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4bc2cf33-d61a-4781-9df6-8b973de2a7df-config-volume\") pod \"4bc2cf33-d61a-4781-9df6-8b973de2a7df\" (UID: \"4bc2cf33-d61a-4781-9df6-8b973de2a7df\") " Mar 19 17:45:03 crc kubenswrapper[4918]: I0319 17:45:03.904395 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bc2cf33-d61a-4781-9df6-8b973de2a7df-config-volume" (OuterVolumeSpecName: "config-volume") pod "4bc2cf33-d61a-4781-9df6-8b973de2a7df" (UID: "4bc2cf33-d61a-4781-9df6-8b973de2a7df"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 17:45:03 crc kubenswrapper[4918]: I0319 17:45:03.909083 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bc2cf33-d61a-4781-9df6-8b973de2a7df-kube-api-access-4svqj" (OuterVolumeSpecName: "kube-api-access-4svqj") pod "4bc2cf33-d61a-4781-9df6-8b973de2a7df" (UID: "4bc2cf33-d61a-4781-9df6-8b973de2a7df"). 
InnerVolumeSpecName "kube-api-access-4svqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:45:03 crc kubenswrapper[4918]: I0319 17:45:03.910782 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bc2cf33-d61a-4781-9df6-8b973de2a7df-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4bc2cf33-d61a-4781-9df6-8b973de2a7df" (UID: "4bc2cf33-d61a-4781-9df6-8b973de2a7df"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 17:45:04 crc kubenswrapper[4918]: I0319 17:45:04.006196 4918 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4bc2cf33-d61a-4781-9df6-8b973de2a7df-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 17:45:04 crc kubenswrapper[4918]: I0319 17:45:04.006243 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4svqj\" (UniqueName: \"kubernetes.io/projected/4bc2cf33-d61a-4781-9df6-8b973de2a7df-kube-api-access-4svqj\") on node \"crc\" DevicePath \"\"" Mar 19 17:45:04 crc kubenswrapper[4918]: I0319 17:45:04.006266 4918 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4bc2cf33-d61a-4781-9df6-8b973de2a7df-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 17:45:04 crc kubenswrapper[4918]: I0319 17:45:04.282714 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565705-wn45l" event={"ID":"4bc2cf33-d61a-4781-9df6-8b973de2a7df","Type":"ContainerDied","Data":"b35a6184786507c0a5a7a0f4044766b37c927344c2f629f4cafb4cba3710136e"} Mar 19 17:45:04 crc kubenswrapper[4918]: I0319 17:45:04.282759 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b35a6184786507c0a5a7a0f4044766b37c927344c2f629f4cafb4cba3710136e" Mar 19 17:45:04 crc kubenswrapper[4918]: I0319 17:45:04.282838 4918 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565705-wn45l" Mar 19 17:45:04 crc kubenswrapper[4918]: I0319 17:45:04.364054 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565660-xghlf"] Mar 19 17:45:04 crc kubenswrapper[4918]: I0319 17:45:04.376127 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565660-xghlf"] Mar 19 17:45:04 crc kubenswrapper[4918]: I0319 17:45:04.615697 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0" path="/var/lib/kubelet/pods/d12bbb10-b8f1-47fe-8cb1-9ce1c27e11c0/volumes" Mar 19 17:45:07 crc kubenswrapper[4918]: I0319 17:45:07.478941 4918 scope.go:117] "RemoveContainer" containerID="27bbbd91871ec1c6e6115be400bfdbe827f5eff11e7dc43f7a9066f923bfc02e" Mar 19 17:45:09 crc kubenswrapper[4918]: I0319 17:45:09.684733 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fscsp"] Mar 19 17:45:09 crc kubenswrapper[4918]: E0319 17:45:09.685306 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc2cf33-d61a-4781-9df6-8b973de2a7df" containerName="collect-profiles" Mar 19 17:45:09 crc kubenswrapper[4918]: I0319 17:45:09.685321 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc2cf33-d61a-4781-9df6-8b973de2a7df" containerName="collect-profiles" Mar 19 17:45:09 crc kubenswrapper[4918]: I0319 17:45:09.685590 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bc2cf33-d61a-4781-9df6-8b973de2a7df" containerName="collect-profiles" Mar 19 17:45:09 crc kubenswrapper[4918]: I0319 17:45:09.687842 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fscsp" Mar 19 17:45:09 crc kubenswrapper[4918]: I0319 17:45:09.718142 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fscsp"] Mar 19 17:45:09 crc kubenswrapper[4918]: I0319 17:45:09.850485 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6615bd2d-9a94-4301-bdc4-5103058b37e7-catalog-content\") pod \"community-operators-fscsp\" (UID: \"6615bd2d-9a94-4301-bdc4-5103058b37e7\") " pod="openshift-marketplace/community-operators-fscsp" Mar 19 17:45:09 crc kubenswrapper[4918]: I0319 17:45:09.850661 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx52t\" (UniqueName: \"kubernetes.io/projected/6615bd2d-9a94-4301-bdc4-5103058b37e7-kube-api-access-kx52t\") pod \"community-operators-fscsp\" (UID: \"6615bd2d-9a94-4301-bdc4-5103058b37e7\") " pod="openshift-marketplace/community-operators-fscsp" Mar 19 17:45:09 crc kubenswrapper[4918]: I0319 17:45:09.850753 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6615bd2d-9a94-4301-bdc4-5103058b37e7-utilities\") pod \"community-operators-fscsp\" (UID: \"6615bd2d-9a94-4301-bdc4-5103058b37e7\") " pod="openshift-marketplace/community-operators-fscsp" Mar 19 17:45:09 crc kubenswrapper[4918]: I0319 17:45:09.952997 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx52t\" (UniqueName: \"kubernetes.io/projected/6615bd2d-9a94-4301-bdc4-5103058b37e7-kube-api-access-kx52t\") pod \"community-operators-fscsp\" (UID: \"6615bd2d-9a94-4301-bdc4-5103058b37e7\") " pod="openshift-marketplace/community-operators-fscsp" Mar 19 17:45:09 crc kubenswrapper[4918]: I0319 17:45:09.953096 4918 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6615bd2d-9a94-4301-bdc4-5103058b37e7-utilities\") pod \"community-operators-fscsp\" (UID: \"6615bd2d-9a94-4301-bdc4-5103058b37e7\") " pod="openshift-marketplace/community-operators-fscsp" Mar 19 17:45:09 crc kubenswrapper[4918]: I0319 17:45:09.953441 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6615bd2d-9a94-4301-bdc4-5103058b37e7-catalog-content\") pod \"community-operators-fscsp\" (UID: \"6615bd2d-9a94-4301-bdc4-5103058b37e7\") " pod="openshift-marketplace/community-operators-fscsp" Mar 19 17:45:09 crc kubenswrapper[4918]: I0319 17:45:09.954256 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6615bd2d-9a94-4301-bdc4-5103058b37e7-catalog-content\") pod \"community-operators-fscsp\" (UID: \"6615bd2d-9a94-4301-bdc4-5103058b37e7\") " pod="openshift-marketplace/community-operators-fscsp" Mar 19 17:45:09 crc kubenswrapper[4918]: I0319 17:45:09.955315 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6615bd2d-9a94-4301-bdc4-5103058b37e7-utilities\") pod \"community-operators-fscsp\" (UID: \"6615bd2d-9a94-4301-bdc4-5103058b37e7\") " pod="openshift-marketplace/community-operators-fscsp" Mar 19 17:45:09 crc kubenswrapper[4918]: I0319 17:45:09.991507 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx52t\" (UniqueName: \"kubernetes.io/projected/6615bd2d-9a94-4301-bdc4-5103058b37e7-kube-api-access-kx52t\") pod \"community-operators-fscsp\" (UID: \"6615bd2d-9a94-4301-bdc4-5103058b37e7\") " pod="openshift-marketplace/community-operators-fscsp" Mar 19 17:45:10 crc kubenswrapper[4918]: I0319 17:45:10.010749 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fscsp" Mar 19 17:45:10 crc kubenswrapper[4918]: I0319 17:45:10.510903 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fscsp"] Mar 19 17:45:11 crc kubenswrapper[4918]: I0319 17:45:11.375225 4918 generic.go:334] "Generic (PLEG): container finished" podID="6615bd2d-9a94-4301-bdc4-5103058b37e7" containerID="1e1448e93a4244ebdd942851f1319e5e4eb78a0a0c67f2fe47d9c8af6a677353" exitCode=0 Mar 19 17:45:11 crc kubenswrapper[4918]: I0319 17:45:11.375327 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fscsp" event={"ID":"6615bd2d-9a94-4301-bdc4-5103058b37e7","Type":"ContainerDied","Data":"1e1448e93a4244ebdd942851f1319e5e4eb78a0a0c67f2fe47d9c8af6a677353"} Mar 19 17:45:11 crc kubenswrapper[4918]: I0319 17:45:11.375541 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fscsp" event={"ID":"6615bd2d-9a94-4301-bdc4-5103058b37e7","Type":"ContainerStarted","Data":"c8c14f8302a738abf0beb73cb625c79ce19a638016ef0bd64ec9f7cc8a12fcc7"} Mar 19 17:45:13 crc kubenswrapper[4918]: I0319 17:45:13.408738 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fscsp" event={"ID":"6615bd2d-9a94-4301-bdc4-5103058b37e7","Type":"ContainerStarted","Data":"ceba8f761850ad9a40dd83d5b0c56b8beb46fb5b9eb139d41c13202a83973a54"} Mar 19 17:45:14 crc kubenswrapper[4918]: I0319 17:45:14.422450 4918 generic.go:334] "Generic (PLEG): container finished" podID="6615bd2d-9a94-4301-bdc4-5103058b37e7" containerID="ceba8f761850ad9a40dd83d5b0c56b8beb46fb5b9eb139d41c13202a83973a54" exitCode=0 Mar 19 17:45:14 crc kubenswrapper[4918]: I0319 17:45:14.422561 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fscsp" 
event={"ID":"6615bd2d-9a94-4301-bdc4-5103058b37e7","Type":"ContainerDied","Data":"ceba8f761850ad9a40dd83d5b0c56b8beb46fb5b9eb139d41c13202a83973a54"} Mar 19 17:45:14 crc kubenswrapper[4918]: I0319 17:45:14.587216 4918 scope.go:117] "RemoveContainer" containerID="9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e" Mar 19 17:45:14 crc kubenswrapper[4918]: E0319 17:45:14.587718 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:45:15 crc kubenswrapper[4918]: I0319 17:45:15.437101 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fscsp" event={"ID":"6615bd2d-9a94-4301-bdc4-5103058b37e7","Type":"ContainerStarted","Data":"d939ebc6b5fe62e0f5b025a1f1f7e3477efd0fb5cca301a5bcb0c7e8c578c29e"} Mar 19 17:45:15 crc kubenswrapper[4918]: I0319 17:45:15.468388 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fscsp" podStartSLOduration=2.987340197 podStartE2EDuration="6.468363132s" podCreationTimestamp="2026-03-19 17:45:09 +0000 UTC" firstStartedPulling="2026-03-19 17:45:11.377768476 +0000 UTC m=+3923.499967764" lastFinishedPulling="2026-03-19 17:45:14.858791411 +0000 UTC m=+3926.980990699" observedRunningTime="2026-03-19 17:45:15.460190338 +0000 UTC m=+3927.582389606" watchObservedRunningTime="2026-03-19 17:45:15.468363132 +0000 UTC m=+3927.590562390" Mar 19 17:45:20 crc kubenswrapper[4918]: I0319 17:45:20.011167 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fscsp" Mar 19 17:45:20 crc 
kubenswrapper[4918]: I0319 17:45:20.012106 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fscsp" Mar 19 17:45:21 crc kubenswrapper[4918]: I0319 17:45:21.066445 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-fscsp" podUID="6615bd2d-9a94-4301-bdc4-5103058b37e7" containerName="registry-server" probeResult="failure" output=< Mar 19 17:45:21 crc kubenswrapper[4918]: timeout: failed to connect service ":50051" within 1s Mar 19 17:45:21 crc kubenswrapper[4918]: > Mar 19 17:45:28 crc kubenswrapper[4918]: I0319 17:45:28.602109 4918 scope.go:117] "RemoveContainer" containerID="9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e" Mar 19 17:45:28 crc kubenswrapper[4918]: E0319 17:45:28.603062 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:45:30 crc kubenswrapper[4918]: I0319 17:45:30.097419 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fscsp" Mar 19 17:45:30 crc kubenswrapper[4918]: I0319 17:45:30.165455 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fscsp" Mar 19 17:45:30 crc kubenswrapper[4918]: I0319 17:45:30.342718 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fscsp"] Mar 19 17:45:31 crc kubenswrapper[4918]: I0319 17:45:31.631062 4918 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-fscsp" podUID="6615bd2d-9a94-4301-bdc4-5103058b37e7" containerName="registry-server" containerID="cri-o://d939ebc6b5fe62e0f5b025a1f1f7e3477efd0fb5cca301a5bcb0c7e8c578c29e" gracePeriod=2 Mar 19 17:45:32 crc kubenswrapper[4918]: I0319 17:45:32.260930 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fscsp" Mar 19 17:45:32 crc kubenswrapper[4918]: I0319 17:45:32.370447 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx52t\" (UniqueName: \"kubernetes.io/projected/6615bd2d-9a94-4301-bdc4-5103058b37e7-kube-api-access-kx52t\") pod \"6615bd2d-9a94-4301-bdc4-5103058b37e7\" (UID: \"6615bd2d-9a94-4301-bdc4-5103058b37e7\") " Mar 19 17:45:32 crc kubenswrapper[4918]: I0319 17:45:32.371063 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6615bd2d-9a94-4301-bdc4-5103058b37e7-utilities\") pod \"6615bd2d-9a94-4301-bdc4-5103058b37e7\" (UID: \"6615bd2d-9a94-4301-bdc4-5103058b37e7\") " Mar 19 17:45:32 crc kubenswrapper[4918]: I0319 17:45:32.371227 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6615bd2d-9a94-4301-bdc4-5103058b37e7-catalog-content\") pod \"6615bd2d-9a94-4301-bdc4-5103058b37e7\" (UID: \"6615bd2d-9a94-4301-bdc4-5103058b37e7\") " Mar 19 17:45:32 crc kubenswrapper[4918]: I0319 17:45:32.371754 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6615bd2d-9a94-4301-bdc4-5103058b37e7-utilities" (OuterVolumeSpecName: "utilities") pod "6615bd2d-9a94-4301-bdc4-5103058b37e7" (UID: "6615bd2d-9a94-4301-bdc4-5103058b37e7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:45:32 crc kubenswrapper[4918]: I0319 17:45:32.376483 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6615bd2d-9a94-4301-bdc4-5103058b37e7-kube-api-access-kx52t" (OuterVolumeSpecName: "kube-api-access-kx52t") pod "6615bd2d-9a94-4301-bdc4-5103058b37e7" (UID: "6615bd2d-9a94-4301-bdc4-5103058b37e7"). InnerVolumeSpecName "kube-api-access-kx52t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:45:32 crc kubenswrapper[4918]: I0319 17:45:32.418586 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6615bd2d-9a94-4301-bdc4-5103058b37e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6615bd2d-9a94-4301-bdc4-5103058b37e7" (UID: "6615bd2d-9a94-4301-bdc4-5103058b37e7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:45:32 crc kubenswrapper[4918]: I0319 17:45:32.473345 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6615bd2d-9a94-4301-bdc4-5103058b37e7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 17:45:32 crc kubenswrapper[4918]: I0319 17:45:32.473376 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx52t\" (UniqueName: \"kubernetes.io/projected/6615bd2d-9a94-4301-bdc4-5103058b37e7-kube-api-access-kx52t\") on node \"crc\" DevicePath \"\"" Mar 19 17:45:32 crc kubenswrapper[4918]: I0319 17:45:32.473389 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6615bd2d-9a94-4301-bdc4-5103058b37e7-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 17:45:32 crc kubenswrapper[4918]: I0319 17:45:32.645512 4918 generic.go:334] "Generic (PLEG): container finished" podID="6615bd2d-9a94-4301-bdc4-5103058b37e7" 
containerID="d939ebc6b5fe62e0f5b025a1f1f7e3477efd0fb5cca301a5bcb0c7e8c578c29e" exitCode=0 Mar 19 17:45:32 crc kubenswrapper[4918]: I0319 17:45:32.645567 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fscsp" event={"ID":"6615bd2d-9a94-4301-bdc4-5103058b37e7","Type":"ContainerDied","Data":"d939ebc6b5fe62e0f5b025a1f1f7e3477efd0fb5cca301a5bcb0c7e8c578c29e"} Mar 19 17:45:32 crc kubenswrapper[4918]: I0319 17:45:32.645674 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fscsp" event={"ID":"6615bd2d-9a94-4301-bdc4-5103058b37e7","Type":"ContainerDied","Data":"c8c14f8302a738abf0beb73cb625c79ce19a638016ef0bd64ec9f7cc8a12fcc7"} Mar 19 17:45:32 crc kubenswrapper[4918]: I0319 17:45:32.645711 4918 scope.go:117] "RemoveContainer" containerID="d939ebc6b5fe62e0f5b025a1f1f7e3477efd0fb5cca301a5bcb0c7e8c578c29e" Mar 19 17:45:32 crc kubenswrapper[4918]: I0319 17:45:32.645597 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fscsp" Mar 19 17:45:32 crc kubenswrapper[4918]: I0319 17:45:32.676770 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fscsp"] Mar 19 17:45:32 crc kubenswrapper[4918]: I0319 17:45:32.686736 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fscsp"] Mar 19 17:45:32 crc kubenswrapper[4918]: I0319 17:45:32.688766 4918 scope.go:117] "RemoveContainer" containerID="ceba8f761850ad9a40dd83d5b0c56b8beb46fb5b9eb139d41c13202a83973a54" Mar 19 17:45:32 crc kubenswrapper[4918]: I0319 17:45:32.722252 4918 scope.go:117] "RemoveContainer" containerID="1e1448e93a4244ebdd942851f1319e5e4eb78a0a0c67f2fe47d9c8af6a677353" Mar 19 17:45:32 crc kubenswrapper[4918]: I0319 17:45:32.760942 4918 scope.go:117] "RemoveContainer" containerID="d939ebc6b5fe62e0f5b025a1f1f7e3477efd0fb5cca301a5bcb0c7e8c578c29e" Mar 19 17:45:32 crc kubenswrapper[4918]: E0319 17:45:32.761302 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d939ebc6b5fe62e0f5b025a1f1f7e3477efd0fb5cca301a5bcb0c7e8c578c29e\": container with ID starting with d939ebc6b5fe62e0f5b025a1f1f7e3477efd0fb5cca301a5bcb0c7e8c578c29e not found: ID does not exist" containerID="d939ebc6b5fe62e0f5b025a1f1f7e3477efd0fb5cca301a5bcb0c7e8c578c29e" Mar 19 17:45:32 crc kubenswrapper[4918]: I0319 17:45:32.761352 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d939ebc6b5fe62e0f5b025a1f1f7e3477efd0fb5cca301a5bcb0c7e8c578c29e"} err="failed to get container status \"d939ebc6b5fe62e0f5b025a1f1f7e3477efd0fb5cca301a5bcb0c7e8c578c29e\": rpc error: code = NotFound desc = could not find container \"d939ebc6b5fe62e0f5b025a1f1f7e3477efd0fb5cca301a5bcb0c7e8c578c29e\": container with ID starting with d939ebc6b5fe62e0f5b025a1f1f7e3477efd0fb5cca301a5bcb0c7e8c578c29e not 
found: ID does not exist" Mar 19 17:45:32 crc kubenswrapper[4918]: I0319 17:45:32.761379 4918 scope.go:117] "RemoveContainer" containerID="ceba8f761850ad9a40dd83d5b0c56b8beb46fb5b9eb139d41c13202a83973a54" Mar 19 17:45:32 crc kubenswrapper[4918]: E0319 17:45:32.761718 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ceba8f761850ad9a40dd83d5b0c56b8beb46fb5b9eb139d41c13202a83973a54\": container with ID starting with ceba8f761850ad9a40dd83d5b0c56b8beb46fb5b9eb139d41c13202a83973a54 not found: ID does not exist" containerID="ceba8f761850ad9a40dd83d5b0c56b8beb46fb5b9eb139d41c13202a83973a54" Mar 19 17:45:32 crc kubenswrapper[4918]: I0319 17:45:32.761754 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceba8f761850ad9a40dd83d5b0c56b8beb46fb5b9eb139d41c13202a83973a54"} err="failed to get container status \"ceba8f761850ad9a40dd83d5b0c56b8beb46fb5b9eb139d41c13202a83973a54\": rpc error: code = NotFound desc = could not find container \"ceba8f761850ad9a40dd83d5b0c56b8beb46fb5b9eb139d41c13202a83973a54\": container with ID starting with ceba8f761850ad9a40dd83d5b0c56b8beb46fb5b9eb139d41c13202a83973a54 not found: ID does not exist" Mar 19 17:45:32 crc kubenswrapper[4918]: I0319 17:45:32.761774 4918 scope.go:117] "RemoveContainer" containerID="1e1448e93a4244ebdd942851f1319e5e4eb78a0a0c67f2fe47d9c8af6a677353" Mar 19 17:45:32 crc kubenswrapper[4918]: E0319 17:45:32.763210 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e1448e93a4244ebdd942851f1319e5e4eb78a0a0c67f2fe47d9c8af6a677353\": container with ID starting with 1e1448e93a4244ebdd942851f1319e5e4eb78a0a0c67f2fe47d9c8af6a677353 not found: ID does not exist" containerID="1e1448e93a4244ebdd942851f1319e5e4eb78a0a0c67f2fe47d9c8af6a677353" Mar 19 17:45:32 crc kubenswrapper[4918]: I0319 17:45:32.763264 4918 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e1448e93a4244ebdd942851f1319e5e4eb78a0a0c67f2fe47d9c8af6a677353"} err="failed to get container status \"1e1448e93a4244ebdd942851f1319e5e4eb78a0a0c67f2fe47d9c8af6a677353\": rpc error: code = NotFound desc = could not find container \"1e1448e93a4244ebdd942851f1319e5e4eb78a0a0c67f2fe47d9c8af6a677353\": container with ID starting with 1e1448e93a4244ebdd942851f1319e5e4eb78a0a0c67f2fe47d9c8af6a677353 not found: ID does not exist" Mar 19 17:45:34 crc kubenswrapper[4918]: I0319 17:45:34.598432 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6615bd2d-9a94-4301-bdc4-5103058b37e7" path="/var/lib/kubelet/pods/6615bd2d-9a94-4301-bdc4-5103058b37e7/volumes" Mar 19 17:45:41 crc kubenswrapper[4918]: I0319 17:45:41.587122 4918 scope.go:117] "RemoveContainer" containerID="9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e" Mar 19 17:45:41 crc kubenswrapper[4918]: E0319 17:45:41.588089 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:45:47 crc kubenswrapper[4918]: I0319 17:45:47.749765 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6qdvx"] Mar 19 17:45:47 crc kubenswrapper[4918]: E0319 17:45:47.752125 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6615bd2d-9a94-4301-bdc4-5103058b37e7" containerName="extract-utilities" Mar 19 17:45:47 crc kubenswrapper[4918]: I0319 17:45:47.752252 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="6615bd2d-9a94-4301-bdc4-5103058b37e7" 
containerName="extract-utilities" Mar 19 17:45:47 crc kubenswrapper[4918]: E0319 17:45:47.752354 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6615bd2d-9a94-4301-bdc4-5103058b37e7" containerName="extract-content" Mar 19 17:45:47 crc kubenswrapper[4918]: I0319 17:45:47.752436 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="6615bd2d-9a94-4301-bdc4-5103058b37e7" containerName="extract-content" Mar 19 17:45:47 crc kubenswrapper[4918]: E0319 17:45:47.752560 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6615bd2d-9a94-4301-bdc4-5103058b37e7" containerName="registry-server" Mar 19 17:45:47 crc kubenswrapper[4918]: I0319 17:45:47.752650 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="6615bd2d-9a94-4301-bdc4-5103058b37e7" containerName="registry-server" Mar 19 17:45:47 crc kubenswrapper[4918]: I0319 17:45:47.753008 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="6615bd2d-9a94-4301-bdc4-5103058b37e7" containerName="registry-server" Mar 19 17:45:47 crc kubenswrapper[4918]: I0319 17:45:47.755348 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6qdvx" Mar 19 17:45:47 crc kubenswrapper[4918]: I0319 17:45:47.784492 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6qdvx"] Mar 19 17:45:47 crc kubenswrapper[4918]: I0319 17:45:47.886343 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84bb0858-1033-4e17-8c4d-a76838ad968e-catalog-content\") pod \"redhat-marketplace-6qdvx\" (UID: \"84bb0858-1033-4e17-8c4d-a76838ad968e\") " pod="openshift-marketplace/redhat-marketplace-6qdvx" Mar 19 17:45:47 crc kubenswrapper[4918]: I0319 17:45:47.886492 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxt92\" (UniqueName: \"kubernetes.io/projected/84bb0858-1033-4e17-8c4d-a76838ad968e-kube-api-access-hxt92\") pod \"redhat-marketplace-6qdvx\" (UID: \"84bb0858-1033-4e17-8c4d-a76838ad968e\") " pod="openshift-marketplace/redhat-marketplace-6qdvx" Mar 19 17:45:47 crc kubenswrapper[4918]: I0319 17:45:47.886791 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84bb0858-1033-4e17-8c4d-a76838ad968e-utilities\") pod \"redhat-marketplace-6qdvx\" (UID: \"84bb0858-1033-4e17-8c4d-a76838ad968e\") " pod="openshift-marketplace/redhat-marketplace-6qdvx" Mar 19 17:45:47 crc kubenswrapper[4918]: I0319 17:45:47.989380 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84bb0858-1033-4e17-8c4d-a76838ad968e-utilities\") pod \"redhat-marketplace-6qdvx\" (UID: \"84bb0858-1033-4e17-8c4d-a76838ad968e\") " pod="openshift-marketplace/redhat-marketplace-6qdvx" Mar 19 17:45:47 crc kubenswrapper[4918]: I0319 17:45:47.989576 4918 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84bb0858-1033-4e17-8c4d-a76838ad968e-catalog-content\") pod \"redhat-marketplace-6qdvx\" (UID: \"84bb0858-1033-4e17-8c4d-a76838ad968e\") " pod="openshift-marketplace/redhat-marketplace-6qdvx" Mar 19 17:45:47 crc kubenswrapper[4918]: I0319 17:45:47.989648 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxt92\" (UniqueName: \"kubernetes.io/projected/84bb0858-1033-4e17-8c4d-a76838ad968e-kube-api-access-hxt92\") pod \"redhat-marketplace-6qdvx\" (UID: \"84bb0858-1033-4e17-8c4d-a76838ad968e\") " pod="openshift-marketplace/redhat-marketplace-6qdvx" Mar 19 17:45:47 crc kubenswrapper[4918]: I0319 17:45:47.989900 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84bb0858-1033-4e17-8c4d-a76838ad968e-utilities\") pod \"redhat-marketplace-6qdvx\" (UID: \"84bb0858-1033-4e17-8c4d-a76838ad968e\") " pod="openshift-marketplace/redhat-marketplace-6qdvx" Mar 19 17:45:47 crc kubenswrapper[4918]: I0319 17:45:47.990085 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84bb0858-1033-4e17-8c4d-a76838ad968e-catalog-content\") pod \"redhat-marketplace-6qdvx\" (UID: \"84bb0858-1033-4e17-8c4d-a76838ad968e\") " pod="openshift-marketplace/redhat-marketplace-6qdvx" Mar 19 17:45:48 crc kubenswrapper[4918]: I0319 17:45:48.021563 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxt92\" (UniqueName: \"kubernetes.io/projected/84bb0858-1033-4e17-8c4d-a76838ad968e-kube-api-access-hxt92\") pod \"redhat-marketplace-6qdvx\" (UID: \"84bb0858-1033-4e17-8c4d-a76838ad968e\") " pod="openshift-marketplace/redhat-marketplace-6qdvx" Mar 19 17:45:48 crc kubenswrapper[4918]: I0319 17:45:48.076544 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6qdvx" Mar 19 17:45:48 crc kubenswrapper[4918]: I0319 17:45:48.583139 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6qdvx"] Mar 19 17:45:48 crc kubenswrapper[4918]: I0319 17:45:48.814807 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qdvx" event={"ID":"84bb0858-1033-4e17-8c4d-a76838ad968e","Type":"ContainerStarted","Data":"7f9644b9f3c2afe5e4d98738ed02067e60a8b668be2bc102613fae8d6ca5a582"} Mar 19 17:45:49 crc kubenswrapper[4918]: I0319 17:45:49.829095 4918 generic.go:334] "Generic (PLEG): container finished" podID="84bb0858-1033-4e17-8c4d-a76838ad968e" containerID="7287b580e05ce057e66b2201312454846d03fb4e67d4bf47b63d0202c74b4d5e" exitCode=0 Mar 19 17:45:49 crc kubenswrapper[4918]: I0319 17:45:49.829182 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qdvx" event={"ID":"84bb0858-1033-4e17-8c4d-a76838ad968e","Type":"ContainerDied","Data":"7287b580e05ce057e66b2201312454846d03fb4e67d4bf47b63d0202c74b4d5e"} Mar 19 17:45:50 crc kubenswrapper[4918]: I0319 17:45:50.839318 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qdvx" event={"ID":"84bb0858-1033-4e17-8c4d-a76838ad968e","Type":"ContainerStarted","Data":"8c5829a9b448e25f880c73b1ea810c8ba6721a639ecefc3a69be314021049c9a"} Mar 19 17:45:51 crc kubenswrapper[4918]: I0319 17:45:51.852754 4918 generic.go:334] "Generic (PLEG): container finished" podID="84bb0858-1033-4e17-8c4d-a76838ad968e" containerID="8c5829a9b448e25f880c73b1ea810c8ba6721a639ecefc3a69be314021049c9a" exitCode=0 Mar 19 17:45:51 crc kubenswrapper[4918]: I0319 17:45:51.852850 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qdvx" 
event={"ID":"84bb0858-1033-4e17-8c4d-a76838ad968e","Type":"ContainerDied","Data":"8c5829a9b448e25f880c73b1ea810c8ba6721a639ecefc3a69be314021049c9a"} Mar 19 17:45:52 crc kubenswrapper[4918]: I0319 17:45:52.875579 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qdvx" event={"ID":"84bb0858-1033-4e17-8c4d-a76838ad968e","Type":"ContainerStarted","Data":"7772d7116da68c2766365828157d727147a6912914c03b32eadf9db88079f22b"} Mar 19 17:45:52 crc kubenswrapper[4918]: I0319 17:45:52.907490 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6qdvx" podStartSLOduration=3.443112767 podStartE2EDuration="5.907472596s" podCreationTimestamp="2026-03-19 17:45:47 +0000 UTC" firstStartedPulling="2026-03-19 17:45:49.831829526 +0000 UTC m=+3961.954028794" lastFinishedPulling="2026-03-19 17:45:52.296189375 +0000 UTC m=+3964.418388623" observedRunningTime="2026-03-19 17:45:52.903264311 +0000 UTC m=+3965.025463559" watchObservedRunningTime="2026-03-19 17:45:52.907472596 +0000 UTC m=+3965.029671874" Mar 19 17:45:56 crc kubenswrapper[4918]: I0319 17:45:56.587012 4918 scope.go:117] "RemoveContainer" containerID="9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e" Mar 19 17:45:56 crc kubenswrapper[4918]: E0319 17:45:56.587935 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:45:58 crc kubenswrapper[4918]: I0319 17:45:58.077808 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6qdvx" Mar 19 17:45:58 crc 
kubenswrapper[4918]: I0319 17:45:58.078222 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6qdvx" Mar 19 17:45:59 crc kubenswrapper[4918]: I0319 17:45:59.149884 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-6qdvx" podUID="84bb0858-1033-4e17-8c4d-a76838ad968e" containerName="registry-server" probeResult="failure" output=< Mar 19 17:45:59 crc kubenswrapper[4918]: timeout: failed to connect service ":50051" within 1s Mar 19 17:45:59 crc kubenswrapper[4918]: > Mar 19 17:46:00 crc kubenswrapper[4918]: I0319 17:46:00.146985 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565706-glw7c"] Mar 19 17:46:00 crc kubenswrapper[4918]: I0319 17:46:00.150999 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565706-glw7c" Mar 19 17:46:00 crc kubenswrapper[4918]: I0319 17:46:00.153547 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:46:00 crc kubenswrapper[4918]: I0319 17:46:00.154056 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:46:00 crc kubenswrapper[4918]: I0319 17:46:00.154166 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:46:00 crc kubenswrapper[4918]: I0319 17:46:00.157063 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565706-glw7c"] Mar 19 17:46:00 crc kubenswrapper[4918]: I0319 17:46:00.256843 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkkfz\" (UniqueName: \"kubernetes.io/projected/ba98ddb7-0e6c-44f9-b203-b691d84af9fd-kube-api-access-jkkfz\") pod \"auto-csr-approver-29565706-glw7c\" (UID: 
\"ba98ddb7-0e6c-44f9-b203-b691d84af9fd\") " pod="openshift-infra/auto-csr-approver-29565706-glw7c" Mar 19 17:46:00 crc kubenswrapper[4918]: I0319 17:46:00.359836 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkkfz\" (UniqueName: \"kubernetes.io/projected/ba98ddb7-0e6c-44f9-b203-b691d84af9fd-kube-api-access-jkkfz\") pod \"auto-csr-approver-29565706-glw7c\" (UID: \"ba98ddb7-0e6c-44f9-b203-b691d84af9fd\") " pod="openshift-infra/auto-csr-approver-29565706-glw7c" Mar 19 17:46:00 crc kubenswrapper[4918]: I0319 17:46:00.389128 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkkfz\" (UniqueName: \"kubernetes.io/projected/ba98ddb7-0e6c-44f9-b203-b691d84af9fd-kube-api-access-jkkfz\") pod \"auto-csr-approver-29565706-glw7c\" (UID: \"ba98ddb7-0e6c-44f9-b203-b691d84af9fd\") " pod="openshift-infra/auto-csr-approver-29565706-glw7c" Mar 19 17:46:00 crc kubenswrapper[4918]: I0319 17:46:00.473888 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565706-glw7c" Mar 19 17:46:00 crc kubenswrapper[4918]: I0319 17:46:00.960486 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565706-glw7c"] Mar 19 17:46:00 crc kubenswrapper[4918]: W0319 17:46:00.963624 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba98ddb7_0e6c_44f9_b203_b691d84af9fd.slice/crio-e8d81f9b0f1f307ce1a725ee3b91b22bf70fd1a879495a602207c4d3510e3d18 WatchSource:0}: Error finding container e8d81f9b0f1f307ce1a725ee3b91b22bf70fd1a879495a602207c4d3510e3d18: Status 404 returned error can't find the container with id e8d81f9b0f1f307ce1a725ee3b91b22bf70fd1a879495a602207c4d3510e3d18 Mar 19 17:46:00 crc kubenswrapper[4918]: I0319 17:46:00.988380 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565706-glw7c" event={"ID":"ba98ddb7-0e6c-44f9-b203-b691d84af9fd","Type":"ContainerStarted","Data":"e8d81f9b0f1f307ce1a725ee3b91b22bf70fd1a879495a602207c4d3510e3d18"} Mar 19 17:46:03 crc kubenswrapper[4918]: I0319 17:46:03.019599 4918 generic.go:334] "Generic (PLEG): container finished" podID="ba98ddb7-0e6c-44f9-b203-b691d84af9fd" containerID="c42e0672631f620869a39334b8075c0886a01bf6d5353f3b356bf7b8fd7d280c" exitCode=0 Mar 19 17:46:03 crc kubenswrapper[4918]: I0319 17:46:03.019701 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565706-glw7c" event={"ID":"ba98ddb7-0e6c-44f9-b203-b691d84af9fd","Type":"ContainerDied","Data":"c42e0672631f620869a39334b8075c0886a01bf6d5353f3b356bf7b8fd7d280c"} Mar 19 17:46:04 crc kubenswrapper[4918]: I0319 17:46:04.445923 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565706-glw7c" Mar 19 17:46:04 crc kubenswrapper[4918]: I0319 17:46:04.556287 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkkfz\" (UniqueName: \"kubernetes.io/projected/ba98ddb7-0e6c-44f9-b203-b691d84af9fd-kube-api-access-jkkfz\") pod \"ba98ddb7-0e6c-44f9-b203-b691d84af9fd\" (UID: \"ba98ddb7-0e6c-44f9-b203-b691d84af9fd\") " Mar 19 17:46:04 crc kubenswrapper[4918]: I0319 17:46:04.566472 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba98ddb7-0e6c-44f9-b203-b691d84af9fd-kube-api-access-jkkfz" (OuterVolumeSpecName: "kube-api-access-jkkfz") pod "ba98ddb7-0e6c-44f9-b203-b691d84af9fd" (UID: "ba98ddb7-0e6c-44f9-b203-b691d84af9fd"). InnerVolumeSpecName "kube-api-access-jkkfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:46:04 crc kubenswrapper[4918]: I0319 17:46:04.658986 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkkfz\" (UniqueName: \"kubernetes.io/projected/ba98ddb7-0e6c-44f9-b203-b691d84af9fd-kube-api-access-jkkfz\") on node \"crc\" DevicePath \"\"" Mar 19 17:46:05 crc kubenswrapper[4918]: I0319 17:46:05.048570 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565706-glw7c" event={"ID":"ba98ddb7-0e6c-44f9-b203-b691d84af9fd","Type":"ContainerDied","Data":"e8d81f9b0f1f307ce1a725ee3b91b22bf70fd1a879495a602207c4d3510e3d18"} Mar 19 17:46:05 crc kubenswrapper[4918]: I0319 17:46:05.048639 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8d81f9b0f1f307ce1a725ee3b91b22bf70fd1a879495a602207c4d3510e3d18" Mar 19 17:46:05 crc kubenswrapper[4918]: I0319 17:46:05.048661 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565706-glw7c" Mar 19 17:46:05 crc kubenswrapper[4918]: I0319 17:46:05.548696 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565700-kvkq9"] Mar 19 17:46:05 crc kubenswrapper[4918]: I0319 17:46:05.556804 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565700-kvkq9"] Mar 19 17:46:06 crc kubenswrapper[4918]: I0319 17:46:06.602007 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8356a087-24b9-42c3-8ce1-7513399baf96" path="/var/lib/kubelet/pods/8356a087-24b9-42c3-8ce1-7513399baf96/volumes" Mar 19 17:46:07 crc kubenswrapper[4918]: I0319 17:46:07.562080 4918 scope.go:117] "RemoveContainer" containerID="e748dd98e92e05cd699650d27df5a4aa5598f3fee477657aec434ae8878a80e6" Mar 19 17:46:08 crc kubenswrapper[4918]: I0319 17:46:08.153572 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6qdvx" Mar 19 17:46:08 crc kubenswrapper[4918]: I0319 17:46:08.205429 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6qdvx" Mar 19 17:46:08 crc kubenswrapper[4918]: I0319 17:46:08.404473 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6qdvx"] Mar 19 17:46:08 crc kubenswrapper[4918]: I0319 17:46:08.599178 4918 scope.go:117] "RemoveContainer" containerID="9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e" Mar 19 17:46:09 crc kubenswrapper[4918]: I0319 17:46:09.108398 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerStarted","Data":"3bea884c178a9044a006f9f35f9e71bc3b7822aaa4688753f27be06ee609e25e"} Mar 19 17:46:10 crc kubenswrapper[4918]: I0319 17:46:10.123695 4918 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6qdvx" podUID="84bb0858-1033-4e17-8c4d-a76838ad968e" containerName="registry-server" containerID="cri-o://7772d7116da68c2766365828157d727147a6912914c03b32eadf9db88079f22b" gracePeriod=2 Mar 19 17:46:10 crc kubenswrapper[4918]: I0319 17:46:10.737849 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6qdvx" Mar 19 17:46:10 crc kubenswrapper[4918]: I0319 17:46:10.914698 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84bb0858-1033-4e17-8c4d-a76838ad968e-utilities\") pod \"84bb0858-1033-4e17-8c4d-a76838ad968e\" (UID: \"84bb0858-1033-4e17-8c4d-a76838ad968e\") " Mar 19 17:46:10 crc kubenswrapper[4918]: I0319 17:46:10.914928 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84bb0858-1033-4e17-8c4d-a76838ad968e-catalog-content\") pod \"84bb0858-1033-4e17-8c4d-a76838ad968e\" (UID: \"84bb0858-1033-4e17-8c4d-a76838ad968e\") " Mar 19 17:46:10 crc kubenswrapper[4918]: I0319 17:46:10.914981 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxt92\" (UniqueName: \"kubernetes.io/projected/84bb0858-1033-4e17-8c4d-a76838ad968e-kube-api-access-hxt92\") pod \"84bb0858-1033-4e17-8c4d-a76838ad968e\" (UID: \"84bb0858-1033-4e17-8c4d-a76838ad968e\") " Mar 19 17:46:10 crc kubenswrapper[4918]: I0319 17:46:10.918334 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84bb0858-1033-4e17-8c4d-a76838ad968e-utilities" (OuterVolumeSpecName: "utilities") pod "84bb0858-1033-4e17-8c4d-a76838ad968e" (UID: "84bb0858-1033-4e17-8c4d-a76838ad968e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:46:10 crc kubenswrapper[4918]: I0319 17:46:10.926691 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84bb0858-1033-4e17-8c4d-a76838ad968e-kube-api-access-hxt92" (OuterVolumeSpecName: "kube-api-access-hxt92") pod "84bb0858-1033-4e17-8c4d-a76838ad968e" (UID: "84bb0858-1033-4e17-8c4d-a76838ad968e"). InnerVolumeSpecName "kube-api-access-hxt92". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:46:10 crc kubenswrapper[4918]: I0319 17:46:10.956384 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84bb0858-1033-4e17-8c4d-a76838ad968e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "84bb0858-1033-4e17-8c4d-a76838ad968e" (UID: "84bb0858-1033-4e17-8c4d-a76838ad968e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:46:11 crc kubenswrapper[4918]: I0319 17:46:11.017977 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84bb0858-1033-4e17-8c4d-a76838ad968e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 17:46:11 crc kubenswrapper[4918]: I0319 17:46:11.018008 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxt92\" (UniqueName: \"kubernetes.io/projected/84bb0858-1033-4e17-8c4d-a76838ad968e-kube-api-access-hxt92\") on node \"crc\" DevicePath \"\"" Mar 19 17:46:11 crc kubenswrapper[4918]: I0319 17:46:11.018017 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84bb0858-1033-4e17-8c4d-a76838ad968e-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 17:46:11 crc kubenswrapper[4918]: I0319 17:46:11.152374 4918 generic.go:334] "Generic (PLEG): container finished" podID="84bb0858-1033-4e17-8c4d-a76838ad968e" 
containerID="7772d7116da68c2766365828157d727147a6912914c03b32eadf9db88079f22b" exitCode=0 Mar 19 17:46:11 crc kubenswrapper[4918]: I0319 17:46:11.152607 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qdvx" event={"ID":"84bb0858-1033-4e17-8c4d-a76838ad968e","Type":"ContainerDied","Data":"7772d7116da68c2766365828157d727147a6912914c03b32eadf9db88079f22b"} Mar 19 17:46:11 crc kubenswrapper[4918]: I0319 17:46:11.152885 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qdvx" event={"ID":"84bb0858-1033-4e17-8c4d-a76838ad968e","Type":"ContainerDied","Data":"7f9644b9f3c2afe5e4d98738ed02067e60a8b668be2bc102613fae8d6ca5a582"} Mar 19 17:46:11 crc kubenswrapper[4918]: I0319 17:46:11.152925 4918 scope.go:117] "RemoveContainer" containerID="7772d7116da68c2766365828157d727147a6912914c03b32eadf9db88079f22b" Mar 19 17:46:11 crc kubenswrapper[4918]: I0319 17:46:11.152711 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6qdvx" Mar 19 17:46:11 crc kubenswrapper[4918]: I0319 17:46:11.187881 4918 scope.go:117] "RemoveContainer" containerID="8c5829a9b448e25f880c73b1ea810c8ba6721a639ecefc3a69be314021049c9a" Mar 19 17:46:11 crc kubenswrapper[4918]: I0319 17:46:11.205590 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6qdvx"] Mar 19 17:46:11 crc kubenswrapper[4918]: I0319 17:46:11.216246 4918 scope.go:117] "RemoveContainer" containerID="7287b580e05ce057e66b2201312454846d03fb4e67d4bf47b63d0202c74b4d5e" Mar 19 17:46:11 crc kubenswrapper[4918]: I0319 17:46:11.226129 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6qdvx"] Mar 19 17:46:11 crc kubenswrapper[4918]: I0319 17:46:11.262834 4918 scope.go:117] "RemoveContainer" containerID="7772d7116da68c2766365828157d727147a6912914c03b32eadf9db88079f22b" Mar 19 17:46:11 crc kubenswrapper[4918]: E0319 17:46:11.270805 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7772d7116da68c2766365828157d727147a6912914c03b32eadf9db88079f22b\": container with ID starting with 7772d7116da68c2766365828157d727147a6912914c03b32eadf9db88079f22b not found: ID does not exist" containerID="7772d7116da68c2766365828157d727147a6912914c03b32eadf9db88079f22b" Mar 19 17:46:11 crc kubenswrapper[4918]: I0319 17:46:11.270849 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7772d7116da68c2766365828157d727147a6912914c03b32eadf9db88079f22b"} err="failed to get container status \"7772d7116da68c2766365828157d727147a6912914c03b32eadf9db88079f22b\": rpc error: code = NotFound desc = could not find container \"7772d7116da68c2766365828157d727147a6912914c03b32eadf9db88079f22b\": container with ID starting with 7772d7116da68c2766365828157d727147a6912914c03b32eadf9db88079f22b not found: 
ID does not exist" Mar 19 17:46:11 crc kubenswrapper[4918]: I0319 17:46:11.270879 4918 scope.go:117] "RemoveContainer" containerID="8c5829a9b448e25f880c73b1ea810c8ba6721a639ecefc3a69be314021049c9a" Mar 19 17:46:11 crc kubenswrapper[4918]: E0319 17:46:11.271172 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c5829a9b448e25f880c73b1ea810c8ba6721a639ecefc3a69be314021049c9a\": container with ID starting with 8c5829a9b448e25f880c73b1ea810c8ba6721a639ecefc3a69be314021049c9a not found: ID does not exist" containerID="8c5829a9b448e25f880c73b1ea810c8ba6721a639ecefc3a69be314021049c9a" Mar 19 17:46:11 crc kubenswrapper[4918]: I0319 17:46:11.271201 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c5829a9b448e25f880c73b1ea810c8ba6721a639ecefc3a69be314021049c9a"} err="failed to get container status \"8c5829a9b448e25f880c73b1ea810c8ba6721a639ecefc3a69be314021049c9a\": rpc error: code = NotFound desc = could not find container \"8c5829a9b448e25f880c73b1ea810c8ba6721a639ecefc3a69be314021049c9a\": container with ID starting with 8c5829a9b448e25f880c73b1ea810c8ba6721a639ecefc3a69be314021049c9a not found: ID does not exist" Mar 19 17:46:11 crc kubenswrapper[4918]: I0319 17:46:11.271218 4918 scope.go:117] "RemoveContainer" containerID="7287b580e05ce057e66b2201312454846d03fb4e67d4bf47b63d0202c74b4d5e" Mar 19 17:46:11 crc kubenswrapper[4918]: E0319 17:46:11.271927 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7287b580e05ce057e66b2201312454846d03fb4e67d4bf47b63d0202c74b4d5e\": container with ID starting with 7287b580e05ce057e66b2201312454846d03fb4e67d4bf47b63d0202c74b4d5e not found: ID does not exist" containerID="7287b580e05ce057e66b2201312454846d03fb4e67d4bf47b63d0202c74b4d5e" Mar 19 17:46:11 crc kubenswrapper[4918]: I0319 17:46:11.271967 4918 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7287b580e05ce057e66b2201312454846d03fb4e67d4bf47b63d0202c74b4d5e"} err="failed to get container status \"7287b580e05ce057e66b2201312454846d03fb4e67d4bf47b63d0202c74b4d5e\": rpc error: code = NotFound desc = could not find container \"7287b580e05ce057e66b2201312454846d03fb4e67d4bf47b63d0202c74b4d5e\": container with ID starting with 7287b580e05ce057e66b2201312454846d03fb4e67d4bf47b63d0202c74b4d5e not found: ID does not exist" Mar 19 17:46:12 crc kubenswrapper[4918]: I0319 17:46:12.610129 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84bb0858-1033-4e17-8c4d-a76838ad968e" path="/var/lib/kubelet/pods/84bb0858-1033-4e17-8c4d-a76838ad968e/volumes" Mar 19 17:48:00 crc kubenswrapper[4918]: I0319 17:48:00.159929 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565708-xl6gf"] Mar 19 17:48:00 crc kubenswrapper[4918]: E0319 17:48:00.161017 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba98ddb7-0e6c-44f9-b203-b691d84af9fd" containerName="oc" Mar 19 17:48:00 crc kubenswrapper[4918]: I0319 17:48:00.161034 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba98ddb7-0e6c-44f9-b203-b691d84af9fd" containerName="oc" Mar 19 17:48:00 crc kubenswrapper[4918]: E0319 17:48:00.161057 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84bb0858-1033-4e17-8c4d-a76838ad968e" containerName="extract-content" Mar 19 17:48:00 crc kubenswrapper[4918]: I0319 17:48:00.161065 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="84bb0858-1033-4e17-8c4d-a76838ad968e" containerName="extract-content" Mar 19 17:48:00 crc kubenswrapper[4918]: E0319 17:48:00.161079 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84bb0858-1033-4e17-8c4d-a76838ad968e" containerName="extract-utilities" Mar 19 17:48:00 crc kubenswrapper[4918]: I0319 17:48:00.161087 4918 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="84bb0858-1033-4e17-8c4d-a76838ad968e" containerName="extract-utilities" Mar 19 17:48:00 crc kubenswrapper[4918]: E0319 17:48:00.161123 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84bb0858-1033-4e17-8c4d-a76838ad968e" containerName="registry-server" Mar 19 17:48:00 crc kubenswrapper[4918]: I0319 17:48:00.161132 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="84bb0858-1033-4e17-8c4d-a76838ad968e" containerName="registry-server" Mar 19 17:48:00 crc kubenswrapper[4918]: I0319 17:48:00.161375 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba98ddb7-0e6c-44f9-b203-b691d84af9fd" containerName="oc" Mar 19 17:48:00 crc kubenswrapper[4918]: I0319 17:48:00.161393 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="84bb0858-1033-4e17-8c4d-a76838ad968e" containerName="registry-server" Mar 19 17:48:00 crc kubenswrapper[4918]: I0319 17:48:00.162249 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565708-xl6gf" Mar 19 17:48:00 crc kubenswrapper[4918]: I0319 17:48:00.166400 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:48:00 crc kubenswrapper[4918]: I0319 17:48:00.166832 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:48:00 crc kubenswrapper[4918]: I0319 17:48:00.167957 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:48:00 crc kubenswrapper[4918]: I0319 17:48:00.175175 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565708-xl6gf"] Mar 19 17:48:00 crc kubenswrapper[4918]: I0319 17:48:00.272080 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz8zr\" (UniqueName: 
\"kubernetes.io/projected/989e7ccb-0088-4f44-9cdf-e937bf3c99f9-kube-api-access-tz8zr\") pod \"auto-csr-approver-29565708-xl6gf\" (UID: \"989e7ccb-0088-4f44-9cdf-e937bf3c99f9\") " pod="openshift-infra/auto-csr-approver-29565708-xl6gf" Mar 19 17:48:00 crc kubenswrapper[4918]: I0319 17:48:00.373994 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz8zr\" (UniqueName: \"kubernetes.io/projected/989e7ccb-0088-4f44-9cdf-e937bf3c99f9-kube-api-access-tz8zr\") pod \"auto-csr-approver-29565708-xl6gf\" (UID: \"989e7ccb-0088-4f44-9cdf-e937bf3c99f9\") " pod="openshift-infra/auto-csr-approver-29565708-xl6gf" Mar 19 17:48:00 crc kubenswrapper[4918]: I0319 17:48:00.396369 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz8zr\" (UniqueName: \"kubernetes.io/projected/989e7ccb-0088-4f44-9cdf-e937bf3c99f9-kube-api-access-tz8zr\") pod \"auto-csr-approver-29565708-xl6gf\" (UID: \"989e7ccb-0088-4f44-9cdf-e937bf3c99f9\") " pod="openshift-infra/auto-csr-approver-29565708-xl6gf" Mar 19 17:48:00 crc kubenswrapper[4918]: I0319 17:48:00.500488 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565708-xl6gf" Mar 19 17:48:01 crc kubenswrapper[4918]: I0319 17:48:01.029740 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565708-xl6gf"] Mar 19 17:48:01 crc kubenswrapper[4918]: W0319 17:48:01.037930 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod989e7ccb_0088_4f44_9cdf_e937bf3c99f9.slice/crio-bbdf3e4ee519db2d3dbd4e4867f6697bd9925993be538cf4e7272157b27c0001 WatchSource:0}: Error finding container bbdf3e4ee519db2d3dbd4e4867f6697bd9925993be538cf4e7272157b27c0001: Status 404 returned error can't find the container with id bbdf3e4ee519db2d3dbd4e4867f6697bd9925993be538cf4e7272157b27c0001 Mar 19 17:48:01 crc kubenswrapper[4918]: I0319 17:48:01.041560 4918 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 17:48:01 crc kubenswrapper[4918]: I0319 17:48:01.538256 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565708-xl6gf" event={"ID":"989e7ccb-0088-4f44-9cdf-e937bf3c99f9","Type":"ContainerStarted","Data":"bbdf3e4ee519db2d3dbd4e4867f6697bd9925993be538cf4e7272157b27c0001"} Mar 19 17:48:02 crc kubenswrapper[4918]: I0319 17:48:02.551030 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565708-xl6gf" event={"ID":"989e7ccb-0088-4f44-9cdf-e937bf3c99f9","Type":"ContainerStarted","Data":"df39aac482f4ccaf8bfc18be924ff795f381833e6327b1f20be4a703e54beca1"} Mar 19 17:48:03 crc kubenswrapper[4918]: I0319 17:48:03.565094 4918 generic.go:334] "Generic (PLEG): container finished" podID="989e7ccb-0088-4f44-9cdf-e937bf3c99f9" containerID="df39aac482f4ccaf8bfc18be924ff795f381833e6327b1f20be4a703e54beca1" exitCode=0 Mar 19 17:48:03 crc kubenswrapper[4918]: I0319 17:48:03.565267 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29565708-xl6gf" event={"ID":"989e7ccb-0088-4f44-9cdf-e937bf3c99f9","Type":"ContainerDied","Data":"df39aac482f4ccaf8bfc18be924ff795f381833e6327b1f20be4a703e54beca1"} Mar 19 17:48:05 crc kubenswrapper[4918]: I0319 17:48:05.033545 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565708-xl6gf" Mar 19 17:48:05 crc kubenswrapper[4918]: I0319 17:48:05.181919 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz8zr\" (UniqueName: \"kubernetes.io/projected/989e7ccb-0088-4f44-9cdf-e937bf3c99f9-kube-api-access-tz8zr\") pod \"989e7ccb-0088-4f44-9cdf-e937bf3c99f9\" (UID: \"989e7ccb-0088-4f44-9cdf-e937bf3c99f9\") " Mar 19 17:48:05 crc kubenswrapper[4918]: I0319 17:48:05.189032 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/989e7ccb-0088-4f44-9cdf-e937bf3c99f9-kube-api-access-tz8zr" (OuterVolumeSpecName: "kube-api-access-tz8zr") pod "989e7ccb-0088-4f44-9cdf-e937bf3c99f9" (UID: "989e7ccb-0088-4f44-9cdf-e937bf3c99f9"). InnerVolumeSpecName "kube-api-access-tz8zr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:48:05 crc kubenswrapper[4918]: I0319 17:48:05.284432 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tz8zr\" (UniqueName: \"kubernetes.io/projected/989e7ccb-0088-4f44-9cdf-e937bf3c99f9-kube-api-access-tz8zr\") on node \"crc\" DevicePath \"\"" Mar 19 17:48:05 crc kubenswrapper[4918]: I0319 17:48:05.584913 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565708-xl6gf" event={"ID":"989e7ccb-0088-4f44-9cdf-e937bf3c99f9","Type":"ContainerDied","Data":"bbdf3e4ee519db2d3dbd4e4867f6697bd9925993be538cf4e7272157b27c0001"} Mar 19 17:48:05 crc kubenswrapper[4918]: I0319 17:48:05.584948 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbdf3e4ee519db2d3dbd4e4867f6697bd9925993be538cf4e7272157b27c0001" Mar 19 17:48:05 crc kubenswrapper[4918]: I0319 17:48:05.585293 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565708-xl6gf" Mar 19 17:48:05 crc kubenswrapper[4918]: I0319 17:48:05.644620 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565702-bs44m"] Mar 19 17:48:05 crc kubenswrapper[4918]: I0319 17:48:05.652477 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565702-bs44m"] Mar 19 17:48:06 crc kubenswrapper[4918]: I0319 17:48:06.606852 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a54cf28-1fdf-4640-a48d-16ecfe834e38" path="/var/lib/kubelet/pods/0a54cf28-1fdf-4640-a48d-16ecfe834e38/volumes" Mar 19 17:48:07 crc kubenswrapper[4918]: I0319 17:48:07.722694 4918 scope.go:117] "RemoveContainer" containerID="123733cdca667a2cd1384999e1a0daae27b9ceac075cfbfb9dd26382dad1f7a9" Mar 19 17:48:10 crc kubenswrapper[4918]: I0319 17:48:10.035599 4918 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-9gcdh"] Mar 19 17:48:10 crc kubenswrapper[4918]: E0319 17:48:10.037176 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="989e7ccb-0088-4f44-9cdf-e937bf3c99f9" containerName="oc" Mar 19 17:48:10 crc kubenswrapper[4918]: I0319 17:48:10.037207 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="989e7ccb-0088-4f44-9cdf-e937bf3c99f9" containerName="oc" Mar 19 17:48:10 crc kubenswrapper[4918]: I0319 17:48:10.037721 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="989e7ccb-0088-4f44-9cdf-e937bf3c99f9" containerName="oc" Mar 19 17:48:10 crc kubenswrapper[4918]: I0319 17:48:10.042561 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9gcdh" Mar 19 17:48:10 crc kubenswrapper[4918]: I0319 17:48:10.049787 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9gcdh"] Mar 19 17:48:10 crc kubenswrapper[4918]: I0319 17:48:10.100287 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c29kr\" (UniqueName: \"kubernetes.io/projected/1b613460-d25d-4895-87ef-2f1ba9c22bf9-kube-api-access-c29kr\") pod \"certified-operators-9gcdh\" (UID: \"1b613460-d25d-4895-87ef-2f1ba9c22bf9\") " pod="openshift-marketplace/certified-operators-9gcdh" Mar 19 17:48:10 crc kubenswrapper[4918]: I0319 17:48:10.100577 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b613460-d25d-4895-87ef-2f1ba9c22bf9-catalog-content\") pod \"certified-operators-9gcdh\" (UID: \"1b613460-d25d-4895-87ef-2f1ba9c22bf9\") " pod="openshift-marketplace/certified-operators-9gcdh" Mar 19 17:48:10 crc kubenswrapper[4918]: I0319 17:48:10.100633 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b613460-d25d-4895-87ef-2f1ba9c22bf9-utilities\") pod \"certified-operators-9gcdh\" (UID: \"1b613460-d25d-4895-87ef-2f1ba9c22bf9\") " pod="openshift-marketplace/certified-operators-9gcdh" Mar 19 17:48:10 crc kubenswrapper[4918]: I0319 17:48:10.202631 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c29kr\" (UniqueName: \"kubernetes.io/projected/1b613460-d25d-4895-87ef-2f1ba9c22bf9-kube-api-access-c29kr\") pod \"certified-operators-9gcdh\" (UID: \"1b613460-d25d-4895-87ef-2f1ba9c22bf9\") " pod="openshift-marketplace/certified-operators-9gcdh" Mar 19 17:48:10 crc kubenswrapper[4918]: I0319 17:48:10.203242 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b613460-d25d-4895-87ef-2f1ba9c22bf9-catalog-content\") pod \"certified-operators-9gcdh\" (UID: \"1b613460-d25d-4895-87ef-2f1ba9c22bf9\") " pod="openshift-marketplace/certified-operators-9gcdh" Mar 19 17:48:10 crc kubenswrapper[4918]: I0319 17:48:10.203430 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b613460-d25d-4895-87ef-2f1ba9c22bf9-utilities\") pod \"certified-operators-9gcdh\" (UID: \"1b613460-d25d-4895-87ef-2f1ba9c22bf9\") " pod="openshift-marketplace/certified-operators-9gcdh" Mar 19 17:48:10 crc kubenswrapper[4918]: I0319 17:48:10.204223 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b613460-d25d-4895-87ef-2f1ba9c22bf9-utilities\") pod \"certified-operators-9gcdh\" (UID: \"1b613460-d25d-4895-87ef-2f1ba9c22bf9\") " pod="openshift-marketplace/certified-operators-9gcdh" Mar 19 17:48:10 crc kubenswrapper[4918]: I0319 17:48:10.205128 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1b613460-d25d-4895-87ef-2f1ba9c22bf9-catalog-content\") pod \"certified-operators-9gcdh\" (UID: \"1b613460-d25d-4895-87ef-2f1ba9c22bf9\") " pod="openshift-marketplace/certified-operators-9gcdh" Mar 19 17:48:10 crc kubenswrapper[4918]: I0319 17:48:10.228786 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c29kr\" (UniqueName: \"kubernetes.io/projected/1b613460-d25d-4895-87ef-2f1ba9c22bf9-kube-api-access-c29kr\") pod \"certified-operators-9gcdh\" (UID: \"1b613460-d25d-4895-87ef-2f1ba9c22bf9\") " pod="openshift-marketplace/certified-operators-9gcdh" Mar 19 17:48:10 crc kubenswrapper[4918]: I0319 17:48:10.389053 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9gcdh" Mar 19 17:48:10 crc kubenswrapper[4918]: I0319 17:48:10.906892 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9gcdh"] Mar 19 17:48:11 crc kubenswrapper[4918]: I0319 17:48:11.652200 4918 generic.go:334] "Generic (PLEG): container finished" podID="1b613460-d25d-4895-87ef-2f1ba9c22bf9" containerID="6d3c7d4dba844a97c5b03fb1eaa8a6ef0cd68077463c66c74b16da784e7f5e50" exitCode=0 Mar 19 17:48:11 crc kubenswrapper[4918]: I0319 17:48:11.652274 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9gcdh" event={"ID":"1b613460-d25d-4895-87ef-2f1ba9c22bf9","Type":"ContainerDied","Data":"6d3c7d4dba844a97c5b03fb1eaa8a6ef0cd68077463c66c74b16da784e7f5e50"} Mar 19 17:48:11 crc kubenswrapper[4918]: I0319 17:48:11.652501 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9gcdh" event={"ID":"1b613460-d25d-4895-87ef-2f1ba9c22bf9","Type":"ContainerStarted","Data":"76a2a1b45f181f30a6eeb1e1a89d40946a2b319241ba26527c39352f5eeb1962"} Mar 19 17:48:12 crc kubenswrapper[4918]: I0319 17:48:12.666877 4918 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-9gcdh" event={"ID":"1b613460-d25d-4895-87ef-2f1ba9c22bf9","Type":"ContainerStarted","Data":"771eedcd8a8d813ce5d294fbc1d66f99313597770acfaaf2e3c150f7300aa18a"} Mar 19 17:48:14 crc kubenswrapper[4918]: I0319 17:48:14.696772 4918 generic.go:334] "Generic (PLEG): container finished" podID="1b613460-d25d-4895-87ef-2f1ba9c22bf9" containerID="771eedcd8a8d813ce5d294fbc1d66f99313597770acfaaf2e3c150f7300aa18a" exitCode=0 Mar 19 17:48:14 crc kubenswrapper[4918]: I0319 17:48:14.697016 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9gcdh" event={"ID":"1b613460-d25d-4895-87ef-2f1ba9c22bf9","Type":"ContainerDied","Data":"771eedcd8a8d813ce5d294fbc1d66f99313597770acfaaf2e3c150f7300aa18a"} Mar 19 17:48:15 crc kubenswrapper[4918]: I0319 17:48:15.716082 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9gcdh" event={"ID":"1b613460-d25d-4895-87ef-2f1ba9c22bf9","Type":"ContainerStarted","Data":"990e743bf4993e45a7a68d4c44f6aeb9c01be47e2be22ac2f2001837ef9fd50c"} Mar 19 17:48:15 crc kubenswrapper[4918]: I0319 17:48:15.739953 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9gcdh" podStartSLOduration=3.275369379 podStartE2EDuration="6.739935013s" podCreationTimestamp="2026-03-19 17:48:09 +0000 UTC" firstStartedPulling="2026-03-19 17:48:11.654069533 +0000 UTC m=+4103.776268791" lastFinishedPulling="2026-03-19 17:48:15.118635137 +0000 UTC m=+4107.240834425" observedRunningTime="2026-03-19 17:48:15.735595074 +0000 UTC m=+4107.857794332" watchObservedRunningTime="2026-03-19 17:48:15.739935013 +0000 UTC m=+4107.862134271" Mar 19 17:48:20 crc kubenswrapper[4918]: I0319 17:48:20.389754 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9gcdh" Mar 19 17:48:20 crc kubenswrapper[4918]: I0319 
17:48:20.390671 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9gcdh" Mar 19 17:48:20 crc kubenswrapper[4918]: I0319 17:48:20.463908 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9gcdh" Mar 19 17:48:20 crc kubenswrapper[4918]: I0319 17:48:20.860074 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9gcdh" Mar 19 17:48:20 crc kubenswrapper[4918]: I0319 17:48:20.938939 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9gcdh"] Mar 19 17:48:22 crc kubenswrapper[4918]: I0319 17:48:22.787415 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9gcdh" podUID="1b613460-d25d-4895-87ef-2f1ba9c22bf9" containerName="registry-server" containerID="cri-o://990e743bf4993e45a7a68d4c44f6aeb9c01be47e2be22ac2f2001837ef9fd50c" gracePeriod=2 Mar 19 17:48:23 crc kubenswrapper[4918]: I0319 17:48:23.801013 4918 generic.go:334] "Generic (PLEG): container finished" podID="1b613460-d25d-4895-87ef-2f1ba9c22bf9" containerID="990e743bf4993e45a7a68d4c44f6aeb9c01be47e2be22ac2f2001837ef9fd50c" exitCode=0 Mar 19 17:48:23 crc kubenswrapper[4918]: I0319 17:48:23.801394 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9gcdh" event={"ID":"1b613460-d25d-4895-87ef-2f1ba9c22bf9","Type":"ContainerDied","Data":"990e743bf4993e45a7a68d4c44f6aeb9c01be47e2be22ac2f2001837ef9fd50c"} Mar 19 17:48:23 crc kubenswrapper[4918]: I0319 17:48:23.916841 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9gcdh" Mar 19 17:48:24 crc kubenswrapper[4918]: I0319 17:48:24.047049 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b613460-d25d-4895-87ef-2f1ba9c22bf9-utilities\") pod \"1b613460-d25d-4895-87ef-2f1ba9c22bf9\" (UID: \"1b613460-d25d-4895-87ef-2f1ba9c22bf9\") " Mar 19 17:48:24 crc kubenswrapper[4918]: I0319 17:48:24.047118 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b613460-d25d-4895-87ef-2f1ba9c22bf9-catalog-content\") pod \"1b613460-d25d-4895-87ef-2f1ba9c22bf9\" (UID: \"1b613460-d25d-4895-87ef-2f1ba9c22bf9\") " Mar 19 17:48:24 crc kubenswrapper[4918]: I0319 17:48:24.047189 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c29kr\" (UniqueName: \"kubernetes.io/projected/1b613460-d25d-4895-87ef-2f1ba9c22bf9-kube-api-access-c29kr\") pod \"1b613460-d25d-4895-87ef-2f1ba9c22bf9\" (UID: \"1b613460-d25d-4895-87ef-2f1ba9c22bf9\") " Mar 19 17:48:24 crc kubenswrapper[4918]: I0319 17:48:24.048701 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b613460-d25d-4895-87ef-2f1ba9c22bf9-utilities" (OuterVolumeSpecName: "utilities") pod "1b613460-d25d-4895-87ef-2f1ba9c22bf9" (UID: "1b613460-d25d-4895-87ef-2f1ba9c22bf9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:48:24 crc kubenswrapper[4918]: I0319 17:48:24.066023 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b613460-d25d-4895-87ef-2f1ba9c22bf9-kube-api-access-c29kr" (OuterVolumeSpecName: "kube-api-access-c29kr") pod "1b613460-d25d-4895-87ef-2f1ba9c22bf9" (UID: "1b613460-d25d-4895-87ef-2f1ba9c22bf9"). InnerVolumeSpecName "kube-api-access-c29kr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:48:24 crc kubenswrapper[4918]: I0319 17:48:24.111964 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b613460-d25d-4895-87ef-2f1ba9c22bf9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b613460-d25d-4895-87ef-2f1ba9c22bf9" (UID: "1b613460-d25d-4895-87ef-2f1ba9c22bf9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:48:24 crc kubenswrapper[4918]: I0319 17:48:24.150346 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b613460-d25d-4895-87ef-2f1ba9c22bf9-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 17:48:24 crc kubenswrapper[4918]: I0319 17:48:24.150381 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b613460-d25d-4895-87ef-2f1ba9c22bf9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 17:48:24 crc kubenswrapper[4918]: I0319 17:48:24.150392 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c29kr\" (UniqueName: \"kubernetes.io/projected/1b613460-d25d-4895-87ef-2f1ba9c22bf9-kube-api-access-c29kr\") on node \"crc\" DevicePath \"\"" Mar 19 17:48:24 crc kubenswrapper[4918]: I0319 17:48:24.818249 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9gcdh" event={"ID":"1b613460-d25d-4895-87ef-2f1ba9c22bf9","Type":"ContainerDied","Data":"76a2a1b45f181f30a6eeb1e1a89d40946a2b319241ba26527c39352f5eeb1962"} Mar 19 17:48:24 crc kubenswrapper[4918]: I0319 17:48:24.818319 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9gcdh" Mar 19 17:48:24 crc kubenswrapper[4918]: I0319 17:48:24.818673 4918 scope.go:117] "RemoveContainer" containerID="990e743bf4993e45a7a68d4c44f6aeb9c01be47e2be22ac2f2001837ef9fd50c" Mar 19 17:48:24 crc kubenswrapper[4918]: I0319 17:48:24.846539 4918 scope.go:117] "RemoveContainer" containerID="771eedcd8a8d813ce5d294fbc1d66f99313597770acfaaf2e3c150f7300aa18a" Mar 19 17:48:24 crc kubenswrapper[4918]: I0319 17:48:24.854942 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9gcdh"] Mar 19 17:48:24 crc kubenswrapper[4918]: I0319 17:48:24.868719 4918 scope.go:117] "RemoveContainer" containerID="6d3c7d4dba844a97c5b03fb1eaa8a6ef0cd68077463c66c74b16da784e7f5e50" Mar 19 17:48:24 crc kubenswrapper[4918]: I0319 17:48:24.874972 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9gcdh"] Mar 19 17:48:26 crc kubenswrapper[4918]: I0319 17:48:26.607322 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b613460-d25d-4895-87ef-2f1ba9c22bf9" path="/var/lib/kubelet/pods/1b613460-d25d-4895-87ef-2f1ba9c22bf9/volumes" Mar 19 17:48:28 crc kubenswrapper[4918]: I0319 17:48:28.212511 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:48:28 crc kubenswrapper[4918]: I0319 17:48:28.213048 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:48:31 crc kubenswrapper[4918]: 
I0319 17:48:31.507718 4918 trace.go:236] Trace[431183293]: "Calculate volume metrics of swift for pod openstack/swift-storage-0" (19-Mar-2026 17:48:30.314) (total time: 1193ms): Mar 19 17:48:31 crc kubenswrapper[4918]: Trace[431183293]: [1.193308878s] [1.193308878s] END Mar 19 17:48:58 crc kubenswrapper[4918]: I0319 17:48:58.211965 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:48:58 crc kubenswrapper[4918]: I0319 17:48:58.212790 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:49:28 crc kubenswrapper[4918]: I0319 17:49:28.211681 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:49:28 crc kubenswrapper[4918]: I0319 17:49:28.213732 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:49:28 crc kubenswrapper[4918]: I0319 17:49:28.214199 4918 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 17:49:28 crc 
kubenswrapper[4918]: I0319 17:49:28.215775 4918 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3bea884c178a9044a006f9f35f9e71bc3b7822aaa4688753f27be06ee609e25e"} pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 17:49:28 crc kubenswrapper[4918]: I0319 17:49:28.215915 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" containerID="cri-o://3bea884c178a9044a006f9f35f9e71bc3b7822aaa4688753f27be06ee609e25e" gracePeriod=600 Mar 19 17:49:28 crc kubenswrapper[4918]: I0319 17:49:28.651113 4918 generic.go:334] "Generic (PLEG): container finished" podID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerID="3bea884c178a9044a006f9f35f9e71bc3b7822aaa4688753f27be06ee609e25e" exitCode=0 Mar 19 17:49:28 crc kubenswrapper[4918]: I0319 17:49:28.651221 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerDied","Data":"3bea884c178a9044a006f9f35f9e71bc3b7822aaa4688753f27be06ee609e25e"} Mar 19 17:49:28 crc kubenswrapper[4918]: I0319 17:49:28.651572 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerStarted","Data":"3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3"} Mar 19 17:49:28 crc kubenswrapper[4918]: I0319 17:49:28.651596 4918 scope.go:117] "RemoveContainer" containerID="9350dbfd0f9f15f9a574e97ce7be1787e3bea60e463cd3bd8e7d95e03d45113e" Mar 19 17:50:00 crc kubenswrapper[4918]: I0319 17:50:00.176575 4918 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565710-skpr2"] Mar 19 17:50:00 crc kubenswrapper[4918]: E0319 17:50:00.177509 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b613460-d25d-4895-87ef-2f1ba9c22bf9" containerName="registry-server" Mar 19 17:50:00 crc kubenswrapper[4918]: I0319 17:50:00.177543 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b613460-d25d-4895-87ef-2f1ba9c22bf9" containerName="registry-server" Mar 19 17:50:00 crc kubenswrapper[4918]: E0319 17:50:00.177590 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b613460-d25d-4895-87ef-2f1ba9c22bf9" containerName="extract-content" Mar 19 17:50:00 crc kubenswrapper[4918]: I0319 17:50:00.177598 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b613460-d25d-4895-87ef-2f1ba9c22bf9" containerName="extract-content" Mar 19 17:50:00 crc kubenswrapper[4918]: E0319 17:50:00.177621 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b613460-d25d-4895-87ef-2f1ba9c22bf9" containerName="extract-utilities" Mar 19 17:50:00 crc kubenswrapper[4918]: I0319 17:50:00.177630 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b613460-d25d-4895-87ef-2f1ba9c22bf9" containerName="extract-utilities" Mar 19 17:50:00 crc kubenswrapper[4918]: I0319 17:50:00.177872 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b613460-d25d-4895-87ef-2f1ba9c22bf9" containerName="registry-server" Mar 19 17:50:00 crc kubenswrapper[4918]: I0319 17:50:00.178780 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565710-skpr2" Mar 19 17:50:00 crc kubenswrapper[4918]: I0319 17:50:00.181076 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:50:00 crc kubenswrapper[4918]: I0319 17:50:00.181127 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:50:00 crc kubenswrapper[4918]: I0319 17:50:00.182291 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:50:00 crc kubenswrapper[4918]: I0319 17:50:00.192463 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565710-skpr2"] Mar 19 17:50:00 crc kubenswrapper[4918]: I0319 17:50:00.358324 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svptb\" (UniqueName: \"kubernetes.io/projected/93ea78e4-d96a-4487-893e-1a2dcc8c5fe0-kube-api-access-svptb\") pod \"auto-csr-approver-29565710-skpr2\" (UID: \"93ea78e4-d96a-4487-893e-1a2dcc8c5fe0\") " pod="openshift-infra/auto-csr-approver-29565710-skpr2" Mar 19 17:50:00 crc kubenswrapper[4918]: I0319 17:50:00.459968 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svptb\" (UniqueName: \"kubernetes.io/projected/93ea78e4-d96a-4487-893e-1a2dcc8c5fe0-kube-api-access-svptb\") pod \"auto-csr-approver-29565710-skpr2\" (UID: \"93ea78e4-d96a-4487-893e-1a2dcc8c5fe0\") " pod="openshift-infra/auto-csr-approver-29565710-skpr2" Mar 19 17:50:00 crc kubenswrapper[4918]: I0319 17:50:00.481643 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svptb\" (UniqueName: \"kubernetes.io/projected/93ea78e4-d96a-4487-893e-1a2dcc8c5fe0-kube-api-access-svptb\") pod \"auto-csr-approver-29565710-skpr2\" (UID: \"93ea78e4-d96a-4487-893e-1a2dcc8c5fe0\") " 
pod="openshift-infra/auto-csr-approver-29565710-skpr2" Mar 19 17:50:00 crc kubenswrapper[4918]: I0319 17:50:00.509283 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565710-skpr2" Mar 19 17:50:01 crc kubenswrapper[4918]: I0319 17:50:01.005744 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565710-skpr2"] Mar 19 17:50:01 crc kubenswrapper[4918]: W0319 17:50:01.016366 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93ea78e4_d96a_4487_893e_1a2dcc8c5fe0.slice/crio-9508e6a30023ab27a114e55f4cb11c18915aa783b5b6bbb43d198528231bfc31 WatchSource:0}: Error finding container 9508e6a30023ab27a114e55f4cb11c18915aa783b5b6bbb43d198528231bfc31: Status 404 returned error can't find the container with id 9508e6a30023ab27a114e55f4cb11c18915aa783b5b6bbb43d198528231bfc31 Mar 19 17:50:01 crc kubenswrapper[4918]: I0319 17:50:01.080384 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565710-skpr2" event={"ID":"93ea78e4-d96a-4487-893e-1a2dcc8c5fe0","Type":"ContainerStarted","Data":"9508e6a30023ab27a114e55f4cb11c18915aa783b5b6bbb43d198528231bfc31"} Mar 19 17:50:03 crc kubenswrapper[4918]: I0319 17:50:03.108000 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565710-skpr2" event={"ID":"93ea78e4-d96a-4487-893e-1a2dcc8c5fe0","Type":"ContainerStarted","Data":"e4b187e93ab45989c128c3451c9f38867130a064d968a1e7c68a4d6446079a8d"} Mar 19 17:50:03 crc kubenswrapper[4918]: I0319 17:50:03.128156 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565710-skpr2" podStartSLOduration=1.54620114 podStartE2EDuration="3.128137335s" podCreationTimestamp="2026-03-19 17:50:00 +0000 UTC" firstStartedPulling="2026-03-19 17:50:01.019364584 +0000 UTC 
m=+4213.141563832" lastFinishedPulling="2026-03-19 17:50:02.601300769 +0000 UTC m=+4214.723500027" observedRunningTime="2026-03-19 17:50:03.122791958 +0000 UTC m=+4215.244991206" watchObservedRunningTime="2026-03-19 17:50:03.128137335 +0000 UTC m=+4215.250336583" Mar 19 17:50:04 crc kubenswrapper[4918]: I0319 17:50:04.134769 4918 generic.go:334] "Generic (PLEG): container finished" podID="93ea78e4-d96a-4487-893e-1a2dcc8c5fe0" containerID="e4b187e93ab45989c128c3451c9f38867130a064d968a1e7c68a4d6446079a8d" exitCode=0 Mar 19 17:50:04 crc kubenswrapper[4918]: I0319 17:50:04.135083 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565710-skpr2" event={"ID":"93ea78e4-d96a-4487-893e-1a2dcc8c5fe0","Type":"ContainerDied","Data":"e4b187e93ab45989c128c3451c9f38867130a064d968a1e7c68a4d6446079a8d"} Mar 19 17:50:05 crc kubenswrapper[4918]: I0319 17:50:05.569499 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565710-skpr2" Mar 19 17:50:05 crc kubenswrapper[4918]: I0319 17:50:05.682353 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svptb\" (UniqueName: \"kubernetes.io/projected/93ea78e4-d96a-4487-893e-1a2dcc8c5fe0-kube-api-access-svptb\") pod \"93ea78e4-d96a-4487-893e-1a2dcc8c5fe0\" (UID: \"93ea78e4-d96a-4487-893e-1a2dcc8c5fe0\") " Mar 19 17:50:05 crc kubenswrapper[4918]: I0319 17:50:05.689744 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93ea78e4-d96a-4487-893e-1a2dcc8c5fe0-kube-api-access-svptb" (OuterVolumeSpecName: "kube-api-access-svptb") pod "93ea78e4-d96a-4487-893e-1a2dcc8c5fe0" (UID: "93ea78e4-d96a-4487-893e-1a2dcc8c5fe0"). InnerVolumeSpecName "kube-api-access-svptb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:50:05 crc kubenswrapper[4918]: I0319 17:50:05.784458 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svptb\" (UniqueName: \"kubernetes.io/projected/93ea78e4-d96a-4487-893e-1a2dcc8c5fe0-kube-api-access-svptb\") on node \"crc\" DevicePath \"\"" Mar 19 17:50:06 crc kubenswrapper[4918]: I0319 17:50:06.160211 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565710-skpr2" event={"ID":"93ea78e4-d96a-4487-893e-1a2dcc8c5fe0","Type":"ContainerDied","Data":"9508e6a30023ab27a114e55f4cb11c18915aa783b5b6bbb43d198528231bfc31"} Mar 19 17:50:06 crc kubenswrapper[4918]: I0319 17:50:06.160592 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9508e6a30023ab27a114e55f4cb11c18915aa783b5b6bbb43d198528231bfc31" Mar 19 17:50:06 crc kubenswrapper[4918]: I0319 17:50:06.160288 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565710-skpr2" Mar 19 17:50:06 crc kubenswrapper[4918]: I0319 17:50:06.223516 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565704-b999m"] Mar 19 17:50:06 crc kubenswrapper[4918]: I0319 17:50:06.235583 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565704-b999m"] Mar 19 17:50:06 crc kubenswrapper[4918]: I0319 17:50:06.604945 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7376d4b2-418d-4217-bd64-a46ff8030af0" path="/var/lib/kubelet/pods/7376d4b2-418d-4217-bd64-a46ff8030af0/volumes" Mar 19 17:50:07 crc kubenswrapper[4918]: I0319 17:50:07.908063 4918 scope.go:117] "RemoveContainer" containerID="2e64c96800e312884ac45848710462a078171eb84b0a130a3aea5e4c9a5e77d9" Mar 19 17:51:28 crc kubenswrapper[4918]: I0319 17:51:28.211626 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:51:28 crc kubenswrapper[4918]: I0319 17:51:28.212149 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:51:58 crc kubenswrapper[4918]: I0319 17:51:58.212577 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:51:58 crc kubenswrapper[4918]: I0319 17:51:58.213330 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:52:00 crc kubenswrapper[4918]: I0319 17:52:00.158548 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565712-6mmf7"] Mar 19 17:52:00 crc kubenswrapper[4918]: E0319 17:52:00.160040 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ea78e4-d96a-4487-893e-1a2dcc8c5fe0" containerName="oc" Mar 19 17:52:00 crc kubenswrapper[4918]: I0319 17:52:00.160078 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ea78e4-d96a-4487-893e-1a2dcc8c5fe0" containerName="oc" Mar 19 17:52:00 crc kubenswrapper[4918]: I0319 17:52:00.160726 4918 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="93ea78e4-d96a-4487-893e-1a2dcc8c5fe0" containerName="oc" Mar 19 17:52:00 crc kubenswrapper[4918]: I0319 17:52:00.162613 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565712-6mmf7" Mar 19 17:52:00 crc kubenswrapper[4918]: I0319 17:52:00.165782 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:52:00 crc kubenswrapper[4918]: I0319 17:52:00.167133 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:52:00 crc kubenswrapper[4918]: I0319 17:52:00.167754 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:52:00 crc kubenswrapper[4918]: I0319 17:52:00.171156 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565712-6mmf7"] Mar 19 17:52:00 crc kubenswrapper[4918]: I0319 17:52:00.199582 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmjrs\" (UniqueName: \"kubernetes.io/projected/d930d5a5-8b98-4d00-91b1-9010cd70d92a-kube-api-access-cmjrs\") pod \"auto-csr-approver-29565712-6mmf7\" (UID: \"d930d5a5-8b98-4d00-91b1-9010cd70d92a\") " pod="openshift-infra/auto-csr-approver-29565712-6mmf7" Mar 19 17:52:00 crc kubenswrapper[4918]: I0319 17:52:00.301615 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmjrs\" (UniqueName: \"kubernetes.io/projected/d930d5a5-8b98-4d00-91b1-9010cd70d92a-kube-api-access-cmjrs\") pod \"auto-csr-approver-29565712-6mmf7\" (UID: \"d930d5a5-8b98-4d00-91b1-9010cd70d92a\") " pod="openshift-infra/auto-csr-approver-29565712-6mmf7" Mar 19 17:52:00 crc kubenswrapper[4918]: I0319 17:52:00.331512 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cmjrs\" (UniqueName: \"kubernetes.io/projected/d930d5a5-8b98-4d00-91b1-9010cd70d92a-kube-api-access-cmjrs\") pod \"auto-csr-approver-29565712-6mmf7\" (UID: \"d930d5a5-8b98-4d00-91b1-9010cd70d92a\") " pod="openshift-infra/auto-csr-approver-29565712-6mmf7" Mar 19 17:52:00 crc kubenswrapper[4918]: I0319 17:52:00.491816 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565712-6mmf7" Mar 19 17:52:01 crc kubenswrapper[4918]: I0319 17:52:01.047838 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565712-6mmf7"] Mar 19 17:52:01 crc kubenswrapper[4918]: W0319 17:52:01.054845 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd930d5a5_8b98_4d00_91b1_9010cd70d92a.slice/crio-155e5450cf8bb6abbabac5944ed2c8412a0ca91e419326486717b5712b4cb974 WatchSource:0}: Error finding container 155e5450cf8bb6abbabac5944ed2c8412a0ca91e419326486717b5712b4cb974: Status 404 returned error can't find the container with id 155e5450cf8bb6abbabac5944ed2c8412a0ca91e419326486717b5712b4cb974 Mar 19 17:52:01 crc kubenswrapper[4918]: I0319 17:52:01.629950 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565712-6mmf7" event={"ID":"d930d5a5-8b98-4d00-91b1-9010cd70d92a","Type":"ContainerStarted","Data":"155e5450cf8bb6abbabac5944ed2c8412a0ca91e419326486717b5712b4cb974"} Mar 19 17:52:03 crc kubenswrapper[4918]: I0319 17:52:03.669159 4918 generic.go:334] "Generic (PLEG): container finished" podID="d930d5a5-8b98-4d00-91b1-9010cd70d92a" containerID="fcc836b9043aa89cdd0e32658af25010ba8be06578bd924576762e8f074b787f" exitCode=0 Mar 19 17:52:03 crc kubenswrapper[4918]: I0319 17:52:03.669240 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565712-6mmf7" 
event={"ID":"d930d5a5-8b98-4d00-91b1-9010cd70d92a","Type":"ContainerDied","Data":"fcc836b9043aa89cdd0e32658af25010ba8be06578bd924576762e8f074b787f"} Mar 19 17:52:05 crc kubenswrapper[4918]: I0319 17:52:05.206089 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565712-6mmf7" Mar 19 17:52:05 crc kubenswrapper[4918]: I0319 17:52:05.229265 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmjrs\" (UniqueName: \"kubernetes.io/projected/d930d5a5-8b98-4d00-91b1-9010cd70d92a-kube-api-access-cmjrs\") pod \"d930d5a5-8b98-4d00-91b1-9010cd70d92a\" (UID: \"d930d5a5-8b98-4d00-91b1-9010cd70d92a\") " Mar 19 17:52:05 crc kubenswrapper[4918]: I0319 17:52:05.235872 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d930d5a5-8b98-4d00-91b1-9010cd70d92a-kube-api-access-cmjrs" (OuterVolumeSpecName: "kube-api-access-cmjrs") pod "d930d5a5-8b98-4d00-91b1-9010cd70d92a" (UID: "d930d5a5-8b98-4d00-91b1-9010cd70d92a"). InnerVolumeSpecName "kube-api-access-cmjrs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:52:05 crc kubenswrapper[4918]: I0319 17:52:05.333175 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmjrs\" (UniqueName: \"kubernetes.io/projected/d930d5a5-8b98-4d00-91b1-9010cd70d92a-kube-api-access-cmjrs\") on node \"crc\" DevicePath \"\"" Mar 19 17:52:05 crc kubenswrapper[4918]: I0319 17:52:05.704724 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565712-6mmf7" event={"ID":"d930d5a5-8b98-4d00-91b1-9010cd70d92a","Type":"ContainerDied","Data":"155e5450cf8bb6abbabac5944ed2c8412a0ca91e419326486717b5712b4cb974"} Mar 19 17:52:05 crc kubenswrapper[4918]: I0319 17:52:05.705206 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="155e5450cf8bb6abbabac5944ed2c8412a0ca91e419326486717b5712b4cb974" Mar 19 17:52:05 crc kubenswrapper[4918]: I0319 17:52:05.704904 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565712-6mmf7" Mar 19 17:52:06 crc kubenswrapper[4918]: I0319 17:52:06.302953 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565706-glw7c"] Mar 19 17:52:06 crc kubenswrapper[4918]: I0319 17:52:06.319656 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565706-glw7c"] Mar 19 17:52:06 crc kubenswrapper[4918]: I0319 17:52:06.597207 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba98ddb7-0e6c-44f9-b203-b691d84af9fd" path="/var/lib/kubelet/pods/ba98ddb7-0e6c-44f9-b203-b691d84af9fd/volumes" Mar 19 17:52:08 crc kubenswrapper[4918]: I0319 17:52:08.046191 4918 scope.go:117] "RemoveContainer" containerID="c42e0672631f620869a39334b8075c0886a01bf6d5353f3b356bf7b8fd7d280c" Mar 19 17:52:28 crc kubenswrapper[4918]: I0319 17:52:28.212066 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:52:28 crc kubenswrapper[4918]: I0319 17:52:28.212927 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 17:52:28 crc kubenswrapper[4918]: I0319 17:52:28.213007 4918 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 17:52:28 crc kubenswrapper[4918]: I0319 17:52:28.214384 4918 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3"} pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 17:52:28 crc kubenswrapper[4918]: I0319 17:52:28.214566 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" containerID="cri-o://3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3" gracePeriod=600 Mar 19 17:52:28 crc kubenswrapper[4918]: E0319 17:52:28.735812 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:52:28 crc kubenswrapper[4918]: I0319 17:52:28.989075 4918 generic.go:334] "Generic (PLEG): container finished" podID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerID="3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3" exitCode=0 Mar 19 17:52:28 crc kubenswrapper[4918]: I0319 17:52:28.989124 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerDied","Data":"3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3"} Mar 19 17:52:28 crc kubenswrapper[4918]: I0319 17:52:28.989161 4918 scope.go:117] "RemoveContainer" containerID="3bea884c178a9044a006f9f35f9e71bc3b7822aaa4688753f27be06ee609e25e" Mar 19 17:52:28 crc kubenswrapper[4918]: I0319 17:52:28.990688 4918 scope.go:117] "RemoveContainer" containerID="3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3" Mar 19 17:52:28 crc kubenswrapper[4918]: E0319 17:52:28.991512 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:52:44 crc kubenswrapper[4918]: I0319 17:52:44.586259 4918 scope.go:117] "RemoveContainer" containerID="3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3" Mar 19 17:52:44 crc kubenswrapper[4918]: E0319 17:52:44.586961 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:52:48 crc kubenswrapper[4918]: I0319 17:52:48.309567 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6v9vr"] Mar 19 17:52:48 crc kubenswrapper[4918]: E0319 17:52:48.310678 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d930d5a5-8b98-4d00-91b1-9010cd70d92a" containerName="oc" Mar 19 17:52:48 crc kubenswrapper[4918]: I0319 17:52:48.310694 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="d930d5a5-8b98-4d00-91b1-9010cd70d92a" containerName="oc" Mar 19 17:52:48 crc kubenswrapper[4918]: I0319 17:52:48.310965 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="d930d5a5-8b98-4d00-91b1-9010cd70d92a" containerName="oc" Mar 19 17:52:48 crc kubenswrapper[4918]: I0319 17:52:48.314257 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6v9vr" Mar 19 17:52:48 crc kubenswrapper[4918]: I0319 17:52:48.359823 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6v9vr"] Mar 19 17:52:48 crc kubenswrapper[4918]: I0319 17:52:48.416247 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtjxg\" (UniqueName: \"kubernetes.io/projected/66d4baeb-b874-4b85-acde-ad0e33f846aa-kube-api-access-rtjxg\") pod \"redhat-operators-6v9vr\" (UID: \"66d4baeb-b874-4b85-acde-ad0e33f846aa\") " pod="openshift-marketplace/redhat-operators-6v9vr" Mar 19 17:52:48 crc kubenswrapper[4918]: I0319 17:52:48.416618 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66d4baeb-b874-4b85-acde-ad0e33f846aa-catalog-content\") pod \"redhat-operators-6v9vr\" (UID: \"66d4baeb-b874-4b85-acde-ad0e33f846aa\") " pod="openshift-marketplace/redhat-operators-6v9vr" Mar 19 17:52:48 crc kubenswrapper[4918]: I0319 17:52:48.417033 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66d4baeb-b874-4b85-acde-ad0e33f846aa-utilities\") pod \"redhat-operators-6v9vr\" (UID: \"66d4baeb-b874-4b85-acde-ad0e33f846aa\") " pod="openshift-marketplace/redhat-operators-6v9vr" Mar 19 17:52:48 crc kubenswrapper[4918]: I0319 17:52:48.519650 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66d4baeb-b874-4b85-acde-ad0e33f846aa-catalog-content\") pod \"redhat-operators-6v9vr\" (UID: \"66d4baeb-b874-4b85-acde-ad0e33f846aa\") " pod="openshift-marketplace/redhat-operators-6v9vr" Mar 19 17:52:48 crc kubenswrapper[4918]: I0319 17:52:48.519783 4918 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66d4baeb-b874-4b85-acde-ad0e33f846aa-utilities\") pod \"redhat-operators-6v9vr\" (UID: \"66d4baeb-b874-4b85-acde-ad0e33f846aa\") " pod="openshift-marketplace/redhat-operators-6v9vr" Mar 19 17:52:48 crc kubenswrapper[4918]: I0319 17:52:48.519917 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtjxg\" (UniqueName: \"kubernetes.io/projected/66d4baeb-b874-4b85-acde-ad0e33f846aa-kube-api-access-rtjxg\") pod \"redhat-operators-6v9vr\" (UID: \"66d4baeb-b874-4b85-acde-ad0e33f846aa\") " pod="openshift-marketplace/redhat-operators-6v9vr" Mar 19 17:52:48 crc kubenswrapper[4918]: I0319 17:52:48.520461 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66d4baeb-b874-4b85-acde-ad0e33f846aa-utilities\") pod \"redhat-operators-6v9vr\" (UID: \"66d4baeb-b874-4b85-acde-ad0e33f846aa\") " pod="openshift-marketplace/redhat-operators-6v9vr" Mar 19 17:52:48 crc kubenswrapper[4918]: I0319 17:52:48.520470 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66d4baeb-b874-4b85-acde-ad0e33f846aa-catalog-content\") pod \"redhat-operators-6v9vr\" (UID: \"66d4baeb-b874-4b85-acde-ad0e33f846aa\") " pod="openshift-marketplace/redhat-operators-6v9vr" Mar 19 17:52:48 crc kubenswrapper[4918]: I0319 17:52:48.541125 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtjxg\" (UniqueName: \"kubernetes.io/projected/66d4baeb-b874-4b85-acde-ad0e33f846aa-kube-api-access-rtjxg\") pod \"redhat-operators-6v9vr\" (UID: \"66d4baeb-b874-4b85-acde-ad0e33f846aa\") " pod="openshift-marketplace/redhat-operators-6v9vr" Mar 19 17:52:48 crc kubenswrapper[4918]: I0319 17:52:48.667606 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6v9vr" Mar 19 17:52:49 crc kubenswrapper[4918]: I0319 17:52:49.154374 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6v9vr"] Mar 19 17:52:49 crc kubenswrapper[4918]: I0319 17:52:49.229685 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6v9vr" event={"ID":"66d4baeb-b874-4b85-acde-ad0e33f846aa","Type":"ContainerStarted","Data":"c764ec1f566f70c6bb95c93c62d79e4509c3e8387d9154958e2b54ba5bdf2598"} Mar 19 17:52:50 crc kubenswrapper[4918]: I0319 17:52:50.242458 4918 generic.go:334] "Generic (PLEG): container finished" podID="66d4baeb-b874-4b85-acde-ad0e33f846aa" containerID="704654a13c5bedf93f72ea01a0ee629d1becfda61bb418d945297abddaafc58f" exitCode=0 Mar 19 17:52:50 crc kubenswrapper[4918]: I0319 17:52:50.242550 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6v9vr" event={"ID":"66d4baeb-b874-4b85-acde-ad0e33f846aa","Type":"ContainerDied","Data":"704654a13c5bedf93f72ea01a0ee629d1becfda61bb418d945297abddaafc58f"} Mar 19 17:52:52 crc kubenswrapper[4918]: I0319 17:52:52.262282 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6v9vr" event={"ID":"66d4baeb-b874-4b85-acde-ad0e33f846aa","Type":"ContainerStarted","Data":"139adff8ce0a9dff070b9638dedc510ba6b25a30843af1d8c891e6e3fbfa8fc4"} Mar 19 17:52:59 crc kubenswrapper[4918]: I0319 17:52:59.006206 4918 generic.go:334] "Generic (PLEG): container finished" podID="66d4baeb-b874-4b85-acde-ad0e33f846aa" containerID="139adff8ce0a9dff070b9638dedc510ba6b25a30843af1d8c891e6e3fbfa8fc4" exitCode=0 Mar 19 17:52:59 crc kubenswrapper[4918]: I0319 17:52:59.007606 4918 scope.go:117] "RemoveContainer" containerID="3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3" Mar 19 17:52:59 crc kubenswrapper[4918]: E0319 17:52:59.007873 4918 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:52:59 crc kubenswrapper[4918]: I0319 17:52:59.016236 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6v9vr" event={"ID":"66d4baeb-b874-4b85-acde-ad0e33f846aa","Type":"ContainerDied","Data":"139adff8ce0a9dff070b9638dedc510ba6b25a30843af1d8c891e6e3fbfa8fc4"} Mar 19 17:53:01 crc kubenswrapper[4918]: I0319 17:53:01.039514 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6v9vr" event={"ID":"66d4baeb-b874-4b85-acde-ad0e33f846aa","Type":"ContainerStarted","Data":"072817654cdc507556d40d7f2ecaf47d9e669bac46d620269838300783c0d3da"} Mar 19 17:53:01 crc kubenswrapper[4918]: I0319 17:53:01.070630 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6v9vr" podStartSLOduration=3.829487161 podStartE2EDuration="13.07061047s" podCreationTimestamp="2026-03-19 17:52:48 +0000 UTC" firstStartedPulling="2026-03-19 17:52:50.245274285 +0000 UTC m=+4382.367473543" lastFinishedPulling="2026-03-19 17:52:59.486397594 +0000 UTC m=+4391.608596852" observedRunningTime="2026-03-19 17:53:01.064433991 +0000 UTC m=+4393.186633249" watchObservedRunningTime="2026-03-19 17:53:01.07061047 +0000 UTC m=+4393.192809718" Mar 19 17:53:08 crc kubenswrapper[4918]: I0319 17:53:08.668367 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6v9vr" Mar 19 17:53:08 crc kubenswrapper[4918]: I0319 17:53:08.670731 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-6v9vr" Mar 19 17:53:08 crc kubenswrapper[4918]: I0319 17:53:08.731780 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6v9vr" Mar 19 17:53:09 crc kubenswrapper[4918]: I0319 17:53:09.185231 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6v9vr" Mar 19 17:53:09 crc kubenswrapper[4918]: I0319 17:53:09.271565 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6v9vr"] Mar 19 17:53:10 crc kubenswrapper[4918]: I0319 17:53:10.587337 4918 scope.go:117] "RemoveContainer" containerID="3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3" Mar 19 17:53:10 crc kubenswrapper[4918]: E0319 17:53:10.588133 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:53:11 crc kubenswrapper[4918]: I0319 17:53:11.137149 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6v9vr" podUID="66d4baeb-b874-4b85-acde-ad0e33f846aa" containerName="registry-server" containerID="cri-o://072817654cdc507556d40d7f2ecaf47d9e669bac46d620269838300783c0d3da" gracePeriod=2 Mar 19 17:53:12 crc kubenswrapper[4918]: I0319 17:53:12.150021 4918 generic.go:334] "Generic (PLEG): container finished" podID="66d4baeb-b874-4b85-acde-ad0e33f846aa" containerID="072817654cdc507556d40d7f2ecaf47d9e669bac46d620269838300783c0d3da" exitCode=0 Mar 19 17:53:12 crc kubenswrapper[4918]: I0319 17:53:12.150124 4918 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-6v9vr" event={"ID":"66d4baeb-b874-4b85-acde-ad0e33f846aa","Type":"ContainerDied","Data":"072817654cdc507556d40d7f2ecaf47d9e669bac46d620269838300783c0d3da"} Mar 19 17:53:12 crc kubenswrapper[4918]: I0319 17:53:12.460288 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6v9vr" Mar 19 17:53:12 crc kubenswrapper[4918]: I0319 17:53:12.591258 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66d4baeb-b874-4b85-acde-ad0e33f846aa-utilities\") pod \"66d4baeb-b874-4b85-acde-ad0e33f846aa\" (UID: \"66d4baeb-b874-4b85-acde-ad0e33f846aa\") " Mar 19 17:53:12 crc kubenswrapper[4918]: I0319 17:53:12.591333 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66d4baeb-b874-4b85-acde-ad0e33f846aa-catalog-content\") pod \"66d4baeb-b874-4b85-acde-ad0e33f846aa\" (UID: \"66d4baeb-b874-4b85-acde-ad0e33f846aa\") " Mar 19 17:53:12 crc kubenswrapper[4918]: I0319 17:53:12.591542 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtjxg\" (UniqueName: \"kubernetes.io/projected/66d4baeb-b874-4b85-acde-ad0e33f846aa-kube-api-access-rtjxg\") pod \"66d4baeb-b874-4b85-acde-ad0e33f846aa\" (UID: \"66d4baeb-b874-4b85-acde-ad0e33f846aa\") " Mar 19 17:53:12 crc kubenswrapper[4918]: I0319 17:53:12.592305 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66d4baeb-b874-4b85-acde-ad0e33f846aa-utilities" (OuterVolumeSpecName: "utilities") pod "66d4baeb-b874-4b85-acde-ad0e33f846aa" (UID: "66d4baeb-b874-4b85-acde-ad0e33f846aa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:53:12 crc kubenswrapper[4918]: I0319 17:53:12.609048 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66d4baeb-b874-4b85-acde-ad0e33f846aa-kube-api-access-rtjxg" (OuterVolumeSpecName: "kube-api-access-rtjxg") pod "66d4baeb-b874-4b85-acde-ad0e33f846aa" (UID: "66d4baeb-b874-4b85-acde-ad0e33f846aa"). InnerVolumeSpecName "kube-api-access-rtjxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:53:12 crc kubenswrapper[4918]: I0319 17:53:12.694281 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtjxg\" (UniqueName: \"kubernetes.io/projected/66d4baeb-b874-4b85-acde-ad0e33f846aa-kube-api-access-rtjxg\") on node \"crc\" DevicePath \"\"" Mar 19 17:53:12 crc kubenswrapper[4918]: I0319 17:53:12.694966 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66d4baeb-b874-4b85-acde-ad0e33f846aa-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 17:53:12 crc kubenswrapper[4918]: I0319 17:53:12.792260 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66d4baeb-b874-4b85-acde-ad0e33f846aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66d4baeb-b874-4b85-acde-ad0e33f846aa" (UID: "66d4baeb-b874-4b85-acde-ad0e33f846aa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:53:12 crc kubenswrapper[4918]: I0319 17:53:12.796391 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66d4baeb-b874-4b85-acde-ad0e33f846aa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 17:53:13 crc kubenswrapper[4918]: I0319 17:53:13.167171 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6v9vr" event={"ID":"66d4baeb-b874-4b85-acde-ad0e33f846aa","Type":"ContainerDied","Data":"c764ec1f566f70c6bb95c93c62d79e4509c3e8387d9154958e2b54ba5bdf2598"} Mar 19 17:53:13 crc kubenswrapper[4918]: I0319 17:53:13.167257 4918 scope.go:117] "RemoveContainer" containerID="072817654cdc507556d40d7f2ecaf47d9e669bac46d620269838300783c0d3da" Mar 19 17:53:13 crc kubenswrapper[4918]: I0319 17:53:13.167260 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6v9vr" Mar 19 17:53:13 crc kubenswrapper[4918]: I0319 17:53:13.213713 4918 scope.go:117] "RemoveContainer" containerID="139adff8ce0a9dff070b9638dedc510ba6b25a30843af1d8c891e6e3fbfa8fc4" Mar 19 17:53:13 crc kubenswrapper[4918]: I0319 17:53:13.223161 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6v9vr"] Mar 19 17:53:13 crc kubenswrapper[4918]: I0319 17:53:13.238717 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6v9vr"] Mar 19 17:53:14 crc kubenswrapper[4918]: I0319 17:53:14.136725 4918 scope.go:117] "RemoveContainer" containerID="704654a13c5bedf93f72ea01a0ee629d1becfda61bb418d945297abddaafc58f" Mar 19 17:53:14 crc kubenswrapper[4918]: I0319 17:53:14.608209 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66d4baeb-b874-4b85-acde-ad0e33f846aa" path="/var/lib/kubelet/pods/66d4baeb-b874-4b85-acde-ad0e33f846aa/volumes" Mar 19 17:53:22 crc 
kubenswrapper[4918]: I0319 17:53:22.587668 4918 scope.go:117] "RemoveContainer" containerID="3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3" Mar 19 17:53:22 crc kubenswrapper[4918]: E0319 17:53:22.588310 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:53:35 crc kubenswrapper[4918]: I0319 17:53:35.586438 4918 scope.go:117] "RemoveContainer" containerID="3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3" Mar 19 17:53:35 crc kubenswrapper[4918]: E0319 17:53:35.587194 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:53:47 crc kubenswrapper[4918]: I0319 17:53:47.586823 4918 scope.go:117] "RemoveContainer" containerID="3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3" Mar 19 17:53:47 crc kubenswrapper[4918]: E0319 17:53:47.588047 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 
19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.641010 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 19 17:53:48 crc kubenswrapper[4918]: E0319 17:53:48.641932 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d4baeb-b874-4b85-acde-ad0e33f846aa" containerName="extract-content" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.641945 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d4baeb-b874-4b85-acde-ad0e33f846aa" containerName="extract-content" Mar 19 17:53:48 crc kubenswrapper[4918]: E0319 17:53:48.641969 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d4baeb-b874-4b85-acde-ad0e33f846aa" containerName="extract-utilities" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.641975 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d4baeb-b874-4b85-acde-ad0e33f846aa" containerName="extract-utilities" Mar 19 17:53:48 crc kubenswrapper[4918]: E0319 17:53:48.642000 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d4baeb-b874-4b85-acde-ad0e33f846aa" containerName="registry-server" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.642008 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d4baeb-b874-4b85-acde-ad0e33f846aa" containerName="registry-server" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.642331 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="66d4baeb-b874-4b85-acde-ad0e33f846aa" containerName="registry-server" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.643193 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.645897 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.646216 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.646740 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-cbnk9" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.647016 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.660171 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.738063 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/26f468b4-9955-436c-810a-cff9e17a1063-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.738117 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26f468b4-9955-436c-810a-cff9e17a1063-config-data\") pod \"tempest-tests-tempest\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.738176 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod 
\"tempest-tests-tempest\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.738217 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwgtk\" (UniqueName: \"kubernetes.io/projected/26f468b4-9955-436c-810a-cff9e17a1063-kube-api-access-rwgtk\") pod \"tempest-tests-tempest\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.738271 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/26f468b4-9955-436c-810a-cff9e17a1063-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.738313 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/26f468b4-9955-436c-810a-cff9e17a1063-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.738591 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/26f468b4-9955-436c-810a-cff9e17a1063-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.738812 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/26f468b4-9955-436c-810a-cff9e17a1063-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.738970 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/26f468b4-9955-436c-810a-cff9e17a1063-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.840764 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/26f468b4-9955-436c-810a-cff9e17a1063-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.840839 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/26f468b4-9955-436c-810a-cff9e17a1063-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.840862 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26f468b4-9955-436c-810a-cff9e17a1063-config-data\") pod \"tempest-tests-tempest\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.840889 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " 
pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.840918 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwgtk\" (UniqueName: \"kubernetes.io/projected/26f468b4-9955-436c-810a-cff9e17a1063-kube-api-access-rwgtk\") pod \"tempest-tests-tempest\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.840957 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/26f468b4-9955-436c-810a-cff9e17a1063-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.841106 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/26f468b4-9955-436c-810a-cff9e17a1063-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.841340 4918 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.841558 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/26f468b4-9955-436c-810a-cff9e17a1063-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " 
pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.842196 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26f468b4-9955-436c-810a-cff9e17a1063-config-data\") pod \"tempest-tests-tempest\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.842453 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/26f468b4-9955-436c-810a-cff9e17a1063-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.842633 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26f468b4-9955-436c-810a-cff9e17a1063-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.842978 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/26f468b4-9955-436c-810a-cff9e17a1063-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.843047 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/26f468b4-9955-436c-810a-cff9e17a1063-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 
17:53:48.848853 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26f468b4-9955-436c-810a-cff9e17a1063-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.854157 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/26f468b4-9955-436c-810a-cff9e17a1063-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.854341 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/26f468b4-9955-436c-810a-cff9e17a1063-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.858212 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwgtk\" (UniqueName: \"kubernetes.io/projected/26f468b4-9955-436c-810a-cff9e17a1063-kube-api-access-rwgtk\") pod \"tempest-tests-tempest\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.875702 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " pod="openstack/tempest-tests-tempest" Mar 19 17:53:48 crc kubenswrapper[4918]: I0319 17:53:48.969103 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 19 17:53:49 crc kubenswrapper[4918]: I0319 17:53:49.441071 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 19 17:53:49 crc kubenswrapper[4918]: I0319 17:53:49.450786 4918 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 17:53:49 crc kubenswrapper[4918]: I0319 17:53:49.602223 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"26f468b4-9955-436c-810a-cff9e17a1063","Type":"ContainerStarted","Data":"70363227ac59cd1e4a9347a98b92241cb8a471b3fb23e87285eddd2cf4528f17"} Mar 19 17:53:58 crc kubenswrapper[4918]: I0319 17:53:58.595450 4918 scope.go:117] "RemoveContainer" containerID="3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3" Mar 19 17:53:58 crc kubenswrapper[4918]: E0319 17:53:58.596487 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:54:00 crc kubenswrapper[4918]: I0319 17:54:00.149409 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565714-q44t8"] Mar 19 17:54:00 crc kubenswrapper[4918]: I0319 17:54:00.151109 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565714-q44t8" Mar 19 17:54:00 crc kubenswrapper[4918]: I0319 17:54:00.153634 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:54:00 crc kubenswrapper[4918]: I0319 17:54:00.154558 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:54:00 crc kubenswrapper[4918]: I0319 17:54:00.163934 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:54:00 crc kubenswrapper[4918]: I0319 17:54:00.175289 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565714-q44t8"] Mar 19 17:54:00 crc kubenswrapper[4918]: I0319 17:54:00.216057 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw9mr\" (UniqueName: \"kubernetes.io/projected/0d084b21-34e1-4a69-ab09-1bb2778a2cf6-kube-api-access-kw9mr\") pod \"auto-csr-approver-29565714-q44t8\" (UID: \"0d084b21-34e1-4a69-ab09-1bb2778a2cf6\") " pod="openshift-infra/auto-csr-approver-29565714-q44t8" Mar 19 17:54:00 crc kubenswrapper[4918]: I0319 17:54:00.318079 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw9mr\" (UniqueName: \"kubernetes.io/projected/0d084b21-34e1-4a69-ab09-1bb2778a2cf6-kube-api-access-kw9mr\") pod \"auto-csr-approver-29565714-q44t8\" (UID: \"0d084b21-34e1-4a69-ab09-1bb2778a2cf6\") " pod="openshift-infra/auto-csr-approver-29565714-q44t8" Mar 19 17:54:00 crc kubenswrapper[4918]: I0319 17:54:00.345306 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw9mr\" (UniqueName: \"kubernetes.io/projected/0d084b21-34e1-4a69-ab09-1bb2778a2cf6-kube-api-access-kw9mr\") pod \"auto-csr-approver-29565714-q44t8\" (UID: \"0d084b21-34e1-4a69-ab09-1bb2778a2cf6\") " 
pod="openshift-infra/auto-csr-approver-29565714-q44t8" Mar 19 17:54:00 crc kubenswrapper[4918]: I0319 17:54:00.480407 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565714-q44t8" Mar 19 17:54:10 crc kubenswrapper[4918]: I0319 17:54:10.588469 4918 scope.go:117] "RemoveContainer" containerID="3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3" Mar 19 17:54:10 crc kubenswrapper[4918]: E0319 17:54:10.589323 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:54:15 crc kubenswrapper[4918]: E0319 17:54:15.553183 4918 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 19 17:54:15 crc kubenswrapper[4918]: E0319 17:54:15.554025 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rwgtk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(26f468b4-9955-436c-810a-cff9e17a1063): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 17:54:15 crc kubenswrapper[4918]: E0319 17:54:15.555596 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="26f468b4-9955-436c-810a-cff9e17a1063" Mar 19 17:54:15 crc kubenswrapper[4918]: E0319 17:54:15.872452 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="26f468b4-9955-436c-810a-cff9e17a1063" Mar 19 17:54:15 crc 
kubenswrapper[4918]: I0319 17:54:15.984726 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565714-q44t8"] Mar 19 17:54:16 crc kubenswrapper[4918]: I0319 17:54:16.885678 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565714-q44t8" event={"ID":"0d084b21-34e1-4a69-ab09-1bb2778a2cf6","Type":"ContainerStarted","Data":"6e10e58437331544e7366807030dcc0b524ad7f41b0488ed11938742d98eb406"} Mar 19 17:54:17 crc kubenswrapper[4918]: I0319 17:54:17.899164 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565714-q44t8" event={"ID":"0d084b21-34e1-4a69-ab09-1bb2778a2cf6","Type":"ContainerStarted","Data":"40ca7fcfedcfc6fcc7542b39ec7973432000929fafe730fdcbf3684185baef40"} Mar 19 17:54:17 crc kubenswrapper[4918]: I0319 17:54:17.922204 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565714-q44t8" podStartSLOduration=16.964714995 podStartE2EDuration="17.922179018s" podCreationTimestamp="2026-03-19 17:54:00 +0000 UTC" firstStartedPulling="2026-03-19 17:54:15.986259378 +0000 UTC m=+4468.108458656" lastFinishedPulling="2026-03-19 17:54:16.943723421 +0000 UTC m=+4469.065922679" observedRunningTime="2026-03-19 17:54:17.914015785 +0000 UTC m=+4470.036215103" watchObservedRunningTime="2026-03-19 17:54:17.922179018 +0000 UTC m=+4470.044378276" Mar 19 17:54:18 crc kubenswrapper[4918]: I0319 17:54:18.912384 4918 generic.go:334] "Generic (PLEG): container finished" podID="0d084b21-34e1-4a69-ab09-1bb2778a2cf6" containerID="40ca7fcfedcfc6fcc7542b39ec7973432000929fafe730fdcbf3684185baef40" exitCode=0 Mar 19 17:54:18 crc kubenswrapper[4918]: I0319 17:54:18.912797 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565714-q44t8" 
event={"ID":"0d084b21-34e1-4a69-ab09-1bb2778a2cf6","Type":"ContainerDied","Data":"40ca7fcfedcfc6fcc7542b39ec7973432000929fafe730fdcbf3684185baef40"} Mar 19 17:54:20 crc kubenswrapper[4918]: I0319 17:54:20.323086 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565714-q44t8" Mar 19 17:54:20 crc kubenswrapper[4918]: I0319 17:54:20.488649 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kw9mr\" (UniqueName: \"kubernetes.io/projected/0d084b21-34e1-4a69-ab09-1bb2778a2cf6-kube-api-access-kw9mr\") pod \"0d084b21-34e1-4a69-ab09-1bb2778a2cf6\" (UID: \"0d084b21-34e1-4a69-ab09-1bb2778a2cf6\") " Mar 19 17:54:20 crc kubenswrapper[4918]: I0319 17:54:20.496974 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d084b21-34e1-4a69-ab09-1bb2778a2cf6-kube-api-access-kw9mr" (OuterVolumeSpecName: "kube-api-access-kw9mr") pod "0d084b21-34e1-4a69-ab09-1bb2778a2cf6" (UID: "0d084b21-34e1-4a69-ab09-1bb2778a2cf6"). InnerVolumeSpecName "kube-api-access-kw9mr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:54:20 crc kubenswrapper[4918]: I0319 17:54:20.592039 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kw9mr\" (UniqueName: \"kubernetes.io/projected/0d084b21-34e1-4a69-ab09-1bb2778a2cf6-kube-api-access-kw9mr\") on node \"crc\" DevicePath \"\"" Mar 19 17:54:20 crc kubenswrapper[4918]: I0319 17:54:20.931327 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565714-q44t8" event={"ID":"0d084b21-34e1-4a69-ab09-1bb2778a2cf6","Type":"ContainerDied","Data":"6e10e58437331544e7366807030dcc0b524ad7f41b0488ed11938742d98eb406"} Mar 19 17:54:20 crc kubenswrapper[4918]: I0319 17:54:20.931375 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e10e58437331544e7366807030dcc0b524ad7f41b0488ed11938742d98eb406" Mar 19 17:54:20 crc kubenswrapper[4918]: I0319 17:54:20.931398 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565714-q44t8" Mar 19 17:54:20 crc kubenswrapper[4918]: I0319 17:54:20.994565 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565708-xl6gf"] Mar 19 17:54:21 crc kubenswrapper[4918]: I0319 17:54:21.005655 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565708-xl6gf"] Mar 19 17:54:22 crc kubenswrapper[4918]: I0319 17:54:22.600439 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="989e7ccb-0088-4f44-9cdf-e937bf3c99f9" path="/var/lib/kubelet/pods/989e7ccb-0088-4f44-9cdf-e937bf3c99f9/volumes" Mar 19 17:54:23 crc kubenswrapper[4918]: I0319 17:54:23.587352 4918 scope.go:117] "RemoveContainer" containerID="3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3" Mar 19 17:54:23 crc kubenswrapper[4918]: E0319 17:54:23.588041 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:54:33 crc kubenswrapper[4918]: I0319 17:54:33.080891 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"26f468b4-9955-436c-810a-cff9e17a1063","Type":"ContainerStarted","Data":"91f53fc26d2d00f2d4b0da8d89d3bc9378dffb313b6837b71b186f8c75dd5881"} Mar 19 17:54:33 crc kubenswrapper[4918]: I0319 17:54:33.120472 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.344440692 podStartE2EDuration="46.120453714s" podCreationTimestamp="2026-03-19 17:53:47 +0000 UTC" firstStartedPulling="2026-03-19 17:53:49.450565015 +0000 UTC m=+4441.572764263" lastFinishedPulling="2026-03-19 17:54:31.226578027 +0000 UTC m=+4483.348777285" observedRunningTime="2026-03-19 17:54:33.110585744 +0000 UTC m=+4485.232784992" watchObservedRunningTime="2026-03-19 17:54:33.120453714 +0000 UTC m=+4485.242652972" Mar 19 17:54:34 crc kubenswrapper[4918]: I0319 17:54:34.587429 4918 scope.go:117] "RemoveContainer" containerID="3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3" Mar 19 17:54:34 crc kubenswrapper[4918]: E0319 17:54:34.588075 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:54:49 crc kubenswrapper[4918]: 
I0319 17:54:49.587107 4918 scope.go:117] "RemoveContainer" containerID="3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3" Mar 19 17:54:49 crc kubenswrapper[4918]: E0319 17:54:49.587924 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:55:00 crc kubenswrapper[4918]: I0319 17:55:00.586804 4918 scope.go:117] "RemoveContainer" containerID="3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3" Mar 19 17:55:00 crc kubenswrapper[4918]: E0319 17:55:00.587751 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:55:11 crc kubenswrapper[4918]: I0319 17:55:11.587096 4918 scope.go:117] "RemoveContainer" containerID="3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3" Mar 19 17:55:11 crc kubenswrapper[4918]: E0319 17:55:11.588718 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:55:15 crc 
kubenswrapper[4918]: I0319 17:55:15.490108 4918 scope.go:117] "RemoveContainer" containerID="df39aac482f4ccaf8bfc18be924ff795f381833e6327b1f20be4a703e54beca1" Mar 19 17:55:22 crc kubenswrapper[4918]: I0319 17:55:22.588127 4918 scope.go:117] "RemoveContainer" containerID="3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3" Mar 19 17:55:22 crc kubenswrapper[4918]: E0319 17:55:22.589121 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:55:34 crc kubenswrapper[4918]: I0319 17:55:34.594015 4918 scope.go:117] "RemoveContainer" containerID="3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3" Mar 19 17:55:34 crc kubenswrapper[4918]: E0319 17:55:34.594994 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:55:43 crc kubenswrapper[4918]: I0319 17:55:43.727017 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xhfsr"] Mar 19 17:55:43 crc kubenswrapper[4918]: E0319 17:55:43.729371 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d084b21-34e1-4a69-ab09-1bb2778a2cf6" containerName="oc" Mar 19 17:55:43 crc kubenswrapper[4918]: I0319 17:55:43.729498 4918 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0d084b21-34e1-4a69-ab09-1bb2778a2cf6" containerName="oc" Mar 19 17:55:43 crc kubenswrapper[4918]: I0319 17:55:43.729960 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d084b21-34e1-4a69-ab09-1bb2778a2cf6" containerName="oc" Mar 19 17:55:43 crc kubenswrapper[4918]: I0319 17:55:43.732320 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xhfsr" Mar 19 17:55:43 crc kubenswrapper[4918]: I0319 17:55:43.790179 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xhfsr"] Mar 19 17:55:43 crc kubenswrapper[4918]: I0319 17:55:43.907819 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04f9b0c1-e5a0-4211-ab03-1cd87f844489-utilities\") pod \"community-operators-xhfsr\" (UID: \"04f9b0c1-e5a0-4211-ab03-1cd87f844489\") " pod="openshift-marketplace/community-operators-xhfsr" Mar 19 17:55:43 crc kubenswrapper[4918]: I0319 17:55:43.907878 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44vjz\" (UniqueName: \"kubernetes.io/projected/04f9b0c1-e5a0-4211-ab03-1cd87f844489-kube-api-access-44vjz\") pod \"community-operators-xhfsr\" (UID: \"04f9b0c1-e5a0-4211-ab03-1cd87f844489\") " pod="openshift-marketplace/community-operators-xhfsr" Mar 19 17:55:43 crc kubenswrapper[4918]: I0319 17:55:43.907916 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04f9b0c1-e5a0-4211-ab03-1cd87f844489-catalog-content\") pod \"community-operators-xhfsr\" (UID: \"04f9b0c1-e5a0-4211-ab03-1cd87f844489\") " pod="openshift-marketplace/community-operators-xhfsr" Mar 19 17:55:44 crc kubenswrapper[4918]: I0319 17:55:44.009838 4918 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04f9b0c1-e5a0-4211-ab03-1cd87f844489-utilities\") pod \"community-operators-xhfsr\" (UID: \"04f9b0c1-e5a0-4211-ab03-1cd87f844489\") " pod="openshift-marketplace/community-operators-xhfsr" Mar 19 17:55:44 crc kubenswrapper[4918]: I0319 17:55:44.009933 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44vjz\" (UniqueName: \"kubernetes.io/projected/04f9b0c1-e5a0-4211-ab03-1cd87f844489-kube-api-access-44vjz\") pod \"community-operators-xhfsr\" (UID: \"04f9b0c1-e5a0-4211-ab03-1cd87f844489\") " pod="openshift-marketplace/community-operators-xhfsr" Mar 19 17:55:44 crc kubenswrapper[4918]: I0319 17:55:44.009990 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04f9b0c1-e5a0-4211-ab03-1cd87f844489-catalog-content\") pod \"community-operators-xhfsr\" (UID: \"04f9b0c1-e5a0-4211-ab03-1cd87f844489\") " pod="openshift-marketplace/community-operators-xhfsr" Mar 19 17:55:44 crc kubenswrapper[4918]: I0319 17:55:44.010433 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04f9b0c1-e5a0-4211-ab03-1cd87f844489-utilities\") pod \"community-operators-xhfsr\" (UID: \"04f9b0c1-e5a0-4211-ab03-1cd87f844489\") " pod="openshift-marketplace/community-operators-xhfsr" Mar 19 17:55:44 crc kubenswrapper[4918]: I0319 17:55:44.010642 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04f9b0c1-e5a0-4211-ab03-1cd87f844489-catalog-content\") pod \"community-operators-xhfsr\" (UID: \"04f9b0c1-e5a0-4211-ab03-1cd87f844489\") " pod="openshift-marketplace/community-operators-xhfsr" Mar 19 17:55:44 crc kubenswrapper[4918]: I0319 17:55:44.043102 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44vjz\" 
(UniqueName: \"kubernetes.io/projected/04f9b0c1-e5a0-4211-ab03-1cd87f844489-kube-api-access-44vjz\") pod \"community-operators-xhfsr\" (UID: \"04f9b0c1-e5a0-4211-ab03-1cd87f844489\") " pod="openshift-marketplace/community-operators-xhfsr" Mar 19 17:55:44 crc kubenswrapper[4918]: I0319 17:55:44.084168 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xhfsr" Mar 19 17:55:44 crc kubenswrapper[4918]: I0319 17:55:44.812278 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xhfsr"] Mar 19 17:55:45 crc kubenswrapper[4918]: I0319 17:55:45.587015 4918 scope.go:117] "RemoveContainer" containerID="3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3" Mar 19 17:55:45 crc kubenswrapper[4918]: E0319 17:55:45.587470 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:55:45 crc kubenswrapper[4918]: I0319 17:55:45.704311 4918 generic.go:334] "Generic (PLEG): container finished" podID="04f9b0c1-e5a0-4211-ab03-1cd87f844489" containerID="58baae42772a91e78286608fa8512ed51b1a81b1978130e4e0d1f8da5d7582f7" exitCode=0 Mar 19 17:55:45 crc kubenswrapper[4918]: I0319 17:55:45.704357 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xhfsr" event={"ID":"04f9b0c1-e5a0-4211-ab03-1cd87f844489","Type":"ContainerDied","Data":"58baae42772a91e78286608fa8512ed51b1a81b1978130e4e0d1f8da5d7582f7"} Mar 19 17:55:45 crc kubenswrapper[4918]: I0319 17:55:45.704382 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-xhfsr" event={"ID":"04f9b0c1-e5a0-4211-ab03-1cd87f844489","Type":"ContainerStarted","Data":"8792adacdf707ef889fb3a94c6379126a61129e91d647816252cf61d810666c6"} Mar 19 17:55:46 crc kubenswrapper[4918]: I0319 17:55:46.714988 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xhfsr" event={"ID":"04f9b0c1-e5a0-4211-ab03-1cd87f844489","Type":"ContainerStarted","Data":"81c4cb7a02dacdad54921bdecf920a81196e3d04cadff513ab7554b1e8d8c305"} Mar 19 17:55:48 crc kubenswrapper[4918]: I0319 17:55:48.735168 4918 generic.go:334] "Generic (PLEG): container finished" podID="04f9b0c1-e5a0-4211-ab03-1cd87f844489" containerID="81c4cb7a02dacdad54921bdecf920a81196e3d04cadff513ab7554b1e8d8c305" exitCode=0 Mar 19 17:55:48 crc kubenswrapper[4918]: I0319 17:55:48.735293 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xhfsr" event={"ID":"04f9b0c1-e5a0-4211-ab03-1cd87f844489","Type":"ContainerDied","Data":"81c4cb7a02dacdad54921bdecf920a81196e3d04cadff513ab7554b1e8d8c305"} Mar 19 17:55:49 crc kubenswrapper[4918]: I0319 17:55:49.745898 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xhfsr" event={"ID":"04f9b0c1-e5a0-4211-ab03-1cd87f844489","Type":"ContainerStarted","Data":"4400cb0e687fd6ae024ae531200548df5b74e8f085da3c1cb8a6249a32b795c5"} Mar 19 17:55:49 crc kubenswrapper[4918]: I0319 17:55:49.767103 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xhfsr" podStartSLOduration=3.344598344 podStartE2EDuration="6.767069543s" podCreationTimestamp="2026-03-19 17:55:43 +0000 UTC" firstStartedPulling="2026-03-19 17:55:45.70620481 +0000 UTC m=+4557.828404058" lastFinishedPulling="2026-03-19 17:55:49.128676009 +0000 UTC m=+4561.250875257" observedRunningTime="2026-03-19 17:55:49.764966286 +0000 UTC m=+4561.887165534" 
watchObservedRunningTime="2026-03-19 17:55:49.767069543 +0000 UTC m=+4561.889268791" Mar 19 17:55:54 crc kubenswrapper[4918]: I0319 17:55:54.084761 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xhfsr" Mar 19 17:55:54 crc kubenswrapper[4918]: I0319 17:55:54.086879 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xhfsr" Mar 19 17:55:54 crc kubenswrapper[4918]: I0319 17:55:54.153421 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xhfsr" Mar 19 17:55:54 crc kubenswrapper[4918]: I0319 17:55:54.859714 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xhfsr" Mar 19 17:55:54 crc kubenswrapper[4918]: I0319 17:55:54.916401 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xhfsr"] Mar 19 17:55:56 crc kubenswrapper[4918]: I0319 17:55:56.814079 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xhfsr" podUID="04f9b0c1-e5a0-4211-ab03-1cd87f844489" containerName="registry-server" containerID="cri-o://4400cb0e687fd6ae024ae531200548df5b74e8f085da3c1cb8a6249a32b795c5" gracePeriod=2 Mar 19 17:55:57 crc kubenswrapper[4918]: I0319 17:55:57.772315 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xhfsr" Mar 19 17:55:57 crc kubenswrapper[4918]: I0319 17:55:57.830315 4918 generic.go:334] "Generic (PLEG): container finished" podID="04f9b0c1-e5a0-4211-ab03-1cd87f844489" containerID="4400cb0e687fd6ae024ae531200548df5b74e8f085da3c1cb8a6249a32b795c5" exitCode=0 Mar 19 17:55:57 crc kubenswrapper[4918]: I0319 17:55:57.830365 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xhfsr" event={"ID":"04f9b0c1-e5a0-4211-ab03-1cd87f844489","Type":"ContainerDied","Data":"4400cb0e687fd6ae024ae531200548df5b74e8f085da3c1cb8a6249a32b795c5"} Mar 19 17:55:57 crc kubenswrapper[4918]: I0319 17:55:57.830381 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xhfsr" Mar 19 17:55:57 crc kubenswrapper[4918]: I0319 17:55:57.830396 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xhfsr" event={"ID":"04f9b0c1-e5a0-4211-ab03-1cd87f844489","Type":"ContainerDied","Data":"8792adacdf707ef889fb3a94c6379126a61129e91d647816252cf61d810666c6"} Mar 19 17:55:57 crc kubenswrapper[4918]: I0319 17:55:57.830417 4918 scope.go:117] "RemoveContainer" containerID="4400cb0e687fd6ae024ae531200548df5b74e8f085da3c1cb8a6249a32b795c5" Mar 19 17:55:57 crc kubenswrapper[4918]: I0319 17:55:57.852177 4918 scope.go:117] "RemoveContainer" containerID="81c4cb7a02dacdad54921bdecf920a81196e3d04cadff513ab7554b1e8d8c305" Mar 19 17:55:57 crc kubenswrapper[4918]: I0319 17:55:57.889642 4918 scope.go:117] "RemoveContainer" containerID="58baae42772a91e78286608fa8512ed51b1a81b1978130e4e0d1f8da5d7582f7" Mar 19 17:55:57 crc kubenswrapper[4918]: I0319 17:55:57.949407 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44vjz\" (UniqueName: \"kubernetes.io/projected/04f9b0c1-e5a0-4211-ab03-1cd87f844489-kube-api-access-44vjz\") pod 
\"04f9b0c1-e5a0-4211-ab03-1cd87f844489\" (UID: \"04f9b0c1-e5a0-4211-ab03-1cd87f844489\") " Mar 19 17:55:57 crc kubenswrapper[4918]: I0319 17:55:57.949499 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04f9b0c1-e5a0-4211-ab03-1cd87f844489-catalog-content\") pod \"04f9b0c1-e5a0-4211-ab03-1cd87f844489\" (UID: \"04f9b0c1-e5a0-4211-ab03-1cd87f844489\") " Mar 19 17:55:57 crc kubenswrapper[4918]: I0319 17:55:57.949646 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04f9b0c1-e5a0-4211-ab03-1cd87f844489-utilities\") pod \"04f9b0c1-e5a0-4211-ab03-1cd87f844489\" (UID: \"04f9b0c1-e5a0-4211-ab03-1cd87f844489\") " Mar 19 17:55:57 crc kubenswrapper[4918]: I0319 17:55:57.952921 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04f9b0c1-e5a0-4211-ab03-1cd87f844489-utilities" (OuterVolumeSpecName: "utilities") pod "04f9b0c1-e5a0-4211-ab03-1cd87f844489" (UID: "04f9b0c1-e5a0-4211-ab03-1cd87f844489"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:55:57 crc kubenswrapper[4918]: I0319 17:55:57.955819 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04f9b0c1-e5a0-4211-ab03-1cd87f844489-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 17:55:57 crc kubenswrapper[4918]: I0319 17:55:57.957724 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04f9b0c1-e5a0-4211-ab03-1cd87f844489-kube-api-access-44vjz" (OuterVolumeSpecName: "kube-api-access-44vjz") pod "04f9b0c1-e5a0-4211-ab03-1cd87f844489" (UID: "04f9b0c1-e5a0-4211-ab03-1cd87f844489"). InnerVolumeSpecName "kube-api-access-44vjz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:55:57 crc kubenswrapper[4918]: I0319 17:55:57.960747 4918 scope.go:117] "RemoveContainer" containerID="4400cb0e687fd6ae024ae531200548df5b74e8f085da3c1cb8a6249a32b795c5" Mar 19 17:55:57 crc kubenswrapper[4918]: E0319 17:55:57.962905 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4400cb0e687fd6ae024ae531200548df5b74e8f085da3c1cb8a6249a32b795c5\": container with ID starting with 4400cb0e687fd6ae024ae531200548df5b74e8f085da3c1cb8a6249a32b795c5 not found: ID does not exist" containerID="4400cb0e687fd6ae024ae531200548df5b74e8f085da3c1cb8a6249a32b795c5" Mar 19 17:55:57 crc kubenswrapper[4918]: I0319 17:55:57.962943 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4400cb0e687fd6ae024ae531200548df5b74e8f085da3c1cb8a6249a32b795c5"} err="failed to get container status \"4400cb0e687fd6ae024ae531200548df5b74e8f085da3c1cb8a6249a32b795c5\": rpc error: code = NotFound desc = could not find container \"4400cb0e687fd6ae024ae531200548df5b74e8f085da3c1cb8a6249a32b795c5\": container with ID starting with 4400cb0e687fd6ae024ae531200548df5b74e8f085da3c1cb8a6249a32b795c5 not found: ID does not exist" Mar 19 17:55:57 crc kubenswrapper[4918]: I0319 17:55:57.962970 4918 scope.go:117] "RemoveContainer" containerID="81c4cb7a02dacdad54921bdecf920a81196e3d04cadff513ab7554b1e8d8c305" Mar 19 17:55:57 crc kubenswrapper[4918]: E0319 17:55:57.966987 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81c4cb7a02dacdad54921bdecf920a81196e3d04cadff513ab7554b1e8d8c305\": container with ID starting with 81c4cb7a02dacdad54921bdecf920a81196e3d04cadff513ab7554b1e8d8c305 not found: ID does not exist" containerID="81c4cb7a02dacdad54921bdecf920a81196e3d04cadff513ab7554b1e8d8c305" Mar 19 17:55:57 crc kubenswrapper[4918]: I0319 17:55:57.967032 
4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81c4cb7a02dacdad54921bdecf920a81196e3d04cadff513ab7554b1e8d8c305"} err="failed to get container status \"81c4cb7a02dacdad54921bdecf920a81196e3d04cadff513ab7554b1e8d8c305\": rpc error: code = NotFound desc = could not find container \"81c4cb7a02dacdad54921bdecf920a81196e3d04cadff513ab7554b1e8d8c305\": container with ID starting with 81c4cb7a02dacdad54921bdecf920a81196e3d04cadff513ab7554b1e8d8c305 not found: ID does not exist" Mar 19 17:55:57 crc kubenswrapper[4918]: I0319 17:55:57.967056 4918 scope.go:117] "RemoveContainer" containerID="58baae42772a91e78286608fa8512ed51b1a81b1978130e4e0d1f8da5d7582f7" Mar 19 17:55:57 crc kubenswrapper[4918]: E0319 17:55:57.971467 4918 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58baae42772a91e78286608fa8512ed51b1a81b1978130e4e0d1f8da5d7582f7\": container with ID starting with 58baae42772a91e78286608fa8512ed51b1a81b1978130e4e0d1f8da5d7582f7 not found: ID does not exist" containerID="58baae42772a91e78286608fa8512ed51b1a81b1978130e4e0d1f8da5d7582f7" Mar 19 17:55:57 crc kubenswrapper[4918]: I0319 17:55:57.971495 4918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58baae42772a91e78286608fa8512ed51b1a81b1978130e4e0d1f8da5d7582f7"} err="failed to get container status \"58baae42772a91e78286608fa8512ed51b1a81b1978130e4e0d1f8da5d7582f7\": rpc error: code = NotFound desc = could not find container \"58baae42772a91e78286608fa8512ed51b1a81b1978130e4e0d1f8da5d7582f7\": container with ID starting with 58baae42772a91e78286608fa8512ed51b1a81b1978130e4e0d1f8da5d7582f7 not found: ID does not exist" Mar 19 17:55:58 crc kubenswrapper[4918]: I0319 17:55:58.012662 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04f9b0c1-e5a0-4211-ab03-1cd87f844489-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "04f9b0c1-e5a0-4211-ab03-1cd87f844489" (UID: "04f9b0c1-e5a0-4211-ab03-1cd87f844489"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:55:58 crc kubenswrapper[4918]: I0319 17:55:58.058243 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04f9b0c1-e5a0-4211-ab03-1cd87f844489-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 17:55:58 crc kubenswrapper[4918]: I0319 17:55:58.058289 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44vjz\" (UniqueName: \"kubernetes.io/projected/04f9b0c1-e5a0-4211-ab03-1cd87f844489-kube-api-access-44vjz\") on node \"crc\" DevicePath \"\"" Mar 19 17:55:58 crc kubenswrapper[4918]: I0319 17:55:58.166244 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xhfsr"] Mar 19 17:55:58 crc kubenswrapper[4918]: I0319 17:55:58.178956 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xhfsr"] Mar 19 17:55:58 crc kubenswrapper[4918]: I0319 17:55:58.601499 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04f9b0c1-e5a0-4211-ab03-1cd87f844489" path="/var/lib/kubelet/pods/04f9b0c1-e5a0-4211-ab03-1cd87f844489/volumes" Mar 19 17:55:59 crc kubenswrapper[4918]: I0319 17:55:59.587213 4918 scope.go:117] "RemoveContainer" containerID="3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3" Mar 19 17:55:59 crc kubenswrapper[4918]: E0319 17:55:59.587437 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" 
podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:56:00 crc kubenswrapper[4918]: I0319 17:56:00.145340 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565716-mjv7f"] Mar 19 17:56:00 crc kubenswrapper[4918]: E0319 17:56:00.146058 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04f9b0c1-e5a0-4211-ab03-1cd87f844489" containerName="extract-utilities" Mar 19 17:56:00 crc kubenswrapper[4918]: I0319 17:56:00.146076 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="04f9b0c1-e5a0-4211-ab03-1cd87f844489" containerName="extract-utilities" Mar 19 17:56:00 crc kubenswrapper[4918]: E0319 17:56:00.146092 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04f9b0c1-e5a0-4211-ab03-1cd87f844489" containerName="registry-server" Mar 19 17:56:00 crc kubenswrapper[4918]: I0319 17:56:00.146098 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="04f9b0c1-e5a0-4211-ab03-1cd87f844489" containerName="registry-server" Mar 19 17:56:00 crc kubenswrapper[4918]: E0319 17:56:00.146111 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04f9b0c1-e5a0-4211-ab03-1cd87f844489" containerName="extract-content" Mar 19 17:56:00 crc kubenswrapper[4918]: I0319 17:56:00.146116 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="04f9b0c1-e5a0-4211-ab03-1cd87f844489" containerName="extract-content" Mar 19 17:56:00 crc kubenswrapper[4918]: I0319 17:56:00.146322 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="04f9b0c1-e5a0-4211-ab03-1cd87f844489" containerName="registry-server" Mar 19 17:56:00 crc kubenswrapper[4918]: I0319 17:56:00.147057 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565716-mjv7f" Mar 19 17:56:00 crc kubenswrapper[4918]: I0319 17:56:00.150535 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:56:00 crc kubenswrapper[4918]: I0319 17:56:00.150729 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:56:00 crc kubenswrapper[4918]: I0319 17:56:00.165323 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:56:00 crc kubenswrapper[4918]: I0319 17:56:00.165639 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565716-mjv7f"] Mar 19 17:56:00 crc kubenswrapper[4918]: I0319 17:56:00.334507 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjw4f\" (UniqueName: \"kubernetes.io/projected/a048bf2d-db47-4755-9ad4-309a86be6f8c-kube-api-access-qjw4f\") pod \"auto-csr-approver-29565716-mjv7f\" (UID: \"a048bf2d-db47-4755-9ad4-309a86be6f8c\") " pod="openshift-infra/auto-csr-approver-29565716-mjv7f" Mar 19 17:56:00 crc kubenswrapper[4918]: I0319 17:56:00.436336 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjw4f\" (UniqueName: \"kubernetes.io/projected/a048bf2d-db47-4755-9ad4-309a86be6f8c-kube-api-access-qjw4f\") pod \"auto-csr-approver-29565716-mjv7f\" (UID: \"a048bf2d-db47-4755-9ad4-309a86be6f8c\") " pod="openshift-infra/auto-csr-approver-29565716-mjv7f" Mar 19 17:56:00 crc kubenswrapper[4918]: I0319 17:56:00.894830 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjw4f\" (UniqueName: \"kubernetes.io/projected/a048bf2d-db47-4755-9ad4-309a86be6f8c-kube-api-access-qjw4f\") pod \"auto-csr-approver-29565716-mjv7f\" (UID: \"a048bf2d-db47-4755-9ad4-309a86be6f8c\") " 
pod="openshift-infra/auto-csr-approver-29565716-mjv7f" Mar 19 17:56:01 crc kubenswrapper[4918]: I0319 17:56:01.068016 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565716-mjv7f" Mar 19 17:56:01 crc kubenswrapper[4918]: I0319 17:56:01.698318 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565716-mjv7f"] Mar 19 17:56:01 crc kubenswrapper[4918]: I0319 17:56:01.897004 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565716-mjv7f" event={"ID":"a048bf2d-db47-4755-9ad4-309a86be6f8c","Type":"ContainerStarted","Data":"94539f55c290139aafebe0226fcf83532cb84c070e49a8b11c3fa03d29c1cb0e"} Mar 19 17:56:04 crc kubenswrapper[4918]: I0319 17:56:04.938431 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565716-mjv7f" event={"ID":"a048bf2d-db47-4755-9ad4-309a86be6f8c","Type":"ContainerStarted","Data":"e2af64416893745b6ec19835b53582ed72518c6dbd8a18832bdc6451ce16c946"} Mar 19 17:56:05 crc kubenswrapper[4918]: I0319 17:56:05.948253 4918 generic.go:334] "Generic (PLEG): container finished" podID="a048bf2d-db47-4755-9ad4-309a86be6f8c" containerID="e2af64416893745b6ec19835b53582ed72518c6dbd8a18832bdc6451ce16c946" exitCode=0 Mar 19 17:56:05 crc kubenswrapper[4918]: I0319 17:56:05.948454 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565716-mjv7f" event={"ID":"a048bf2d-db47-4755-9ad4-309a86be6f8c","Type":"ContainerDied","Data":"e2af64416893745b6ec19835b53582ed72518c6dbd8a18832bdc6451ce16c946"} Mar 19 17:56:06 crc kubenswrapper[4918]: I0319 17:56:06.644161 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565716-mjv7f" Mar 19 17:56:06 crc kubenswrapper[4918]: I0319 17:56:06.766386 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjw4f\" (UniqueName: \"kubernetes.io/projected/a048bf2d-db47-4755-9ad4-309a86be6f8c-kube-api-access-qjw4f\") pod \"a048bf2d-db47-4755-9ad4-309a86be6f8c\" (UID: \"a048bf2d-db47-4755-9ad4-309a86be6f8c\") " Mar 19 17:56:06 crc kubenswrapper[4918]: I0319 17:56:06.794695 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a048bf2d-db47-4755-9ad4-309a86be6f8c-kube-api-access-qjw4f" (OuterVolumeSpecName: "kube-api-access-qjw4f") pod "a048bf2d-db47-4755-9ad4-309a86be6f8c" (UID: "a048bf2d-db47-4755-9ad4-309a86be6f8c"). InnerVolumeSpecName "kube-api-access-qjw4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:56:06 crc kubenswrapper[4918]: I0319 17:56:06.868673 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjw4f\" (UniqueName: \"kubernetes.io/projected/a048bf2d-db47-4755-9ad4-309a86be6f8c-kube-api-access-qjw4f\") on node \"crc\" DevicePath \"\"" Mar 19 17:56:06 crc kubenswrapper[4918]: I0319 17:56:06.962211 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565716-mjv7f" event={"ID":"a048bf2d-db47-4755-9ad4-309a86be6f8c","Type":"ContainerDied","Data":"94539f55c290139aafebe0226fcf83532cb84c070e49a8b11c3fa03d29c1cb0e"} Mar 19 17:56:06 crc kubenswrapper[4918]: I0319 17:56:06.962251 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94539f55c290139aafebe0226fcf83532cb84c070e49a8b11c3fa03d29c1cb0e" Mar 19 17:56:06 crc kubenswrapper[4918]: I0319 17:56:06.962304 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565716-mjv7f" Mar 19 17:56:07 crc kubenswrapper[4918]: I0319 17:56:07.710355 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565710-skpr2"] Mar 19 17:56:07 crc kubenswrapper[4918]: I0319 17:56:07.718826 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565710-skpr2"] Mar 19 17:56:08 crc kubenswrapper[4918]: I0319 17:56:08.597126 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93ea78e4-d96a-4487-893e-1a2dcc8c5fe0" path="/var/lib/kubelet/pods/93ea78e4-d96a-4487-893e-1a2dcc8c5fe0/volumes" Mar 19 17:56:11 crc kubenswrapper[4918]: I0319 17:56:11.588293 4918 scope.go:117] "RemoveContainer" containerID="3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3" Mar 19 17:56:11 crc kubenswrapper[4918]: E0319 17:56:11.588789 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:56:15 crc kubenswrapper[4918]: I0319 17:56:15.589850 4918 scope.go:117] "RemoveContainer" containerID="e4b187e93ab45989c128c3451c9f38867130a064d968a1e7c68a4d6446079a8d" Mar 19 17:56:24 crc kubenswrapper[4918]: I0319 17:56:24.586385 4918 scope.go:117] "RemoveContainer" containerID="3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3" Mar 19 17:56:24 crc kubenswrapper[4918]: E0319 17:56:24.587066 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:56:36 crc kubenswrapper[4918]: I0319 17:56:36.587091 4918 scope.go:117] "RemoveContainer" containerID="3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3" Mar 19 17:56:36 crc kubenswrapper[4918]: E0319 17:56:36.588608 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:56:37 crc kubenswrapper[4918]: I0319 17:56:37.374042 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mbkzc"] Mar 19 17:56:37 crc kubenswrapper[4918]: E0319 17:56:37.374693 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a048bf2d-db47-4755-9ad4-309a86be6f8c" containerName="oc" Mar 19 17:56:37 crc kubenswrapper[4918]: I0319 17:56:37.374709 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="a048bf2d-db47-4755-9ad4-309a86be6f8c" containerName="oc" Mar 19 17:56:37 crc kubenswrapper[4918]: I0319 17:56:37.374914 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="a048bf2d-db47-4755-9ad4-309a86be6f8c" containerName="oc" Mar 19 17:56:37 crc kubenswrapper[4918]: I0319 17:56:37.376572 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mbkzc" Mar 19 17:56:37 crc kubenswrapper[4918]: I0319 17:56:37.390108 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mbkzc"] Mar 19 17:56:37 crc kubenswrapper[4918]: I0319 17:56:37.517367 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a27c801-49bb-455d-ba87-bcabc906491f-catalog-content\") pod \"redhat-marketplace-mbkzc\" (UID: \"0a27c801-49bb-455d-ba87-bcabc906491f\") " pod="openshift-marketplace/redhat-marketplace-mbkzc" Mar 19 17:56:37 crc kubenswrapper[4918]: I0319 17:56:37.517420 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a27c801-49bb-455d-ba87-bcabc906491f-utilities\") pod \"redhat-marketplace-mbkzc\" (UID: \"0a27c801-49bb-455d-ba87-bcabc906491f\") " pod="openshift-marketplace/redhat-marketplace-mbkzc" Mar 19 17:56:37 crc kubenswrapper[4918]: I0319 17:56:37.517453 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfsqn\" (UniqueName: \"kubernetes.io/projected/0a27c801-49bb-455d-ba87-bcabc906491f-kube-api-access-gfsqn\") pod \"redhat-marketplace-mbkzc\" (UID: \"0a27c801-49bb-455d-ba87-bcabc906491f\") " pod="openshift-marketplace/redhat-marketplace-mbkzc" Mar 19 17:56:37 crc kubenswrapper[4918]: I0319 17:56:37.619645 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a27c801-49bb-455d-ba87-bcabc906491f-catalog-content\") pod \"redhat-marketplace-mbkzc\" (UID: \"0a27c801-49bb-455d-ba87-bcabc906491f\") " pod="openshift-marketplace/redhat-marketplace-mbkzc" Mar 19 17:56:37 crc kubenswrapper[4918]: I0319 17:56:37.619710 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a27c801-49bb-455d-ba87-bcabc906491f-utilities\") pod \"redhat-marketplace-mbkzc\" (UID: \"0a27c801-49bb-455d-ba87-bcabc906491f\") " pod="openshift-marketplace/redhat-marketplace-mbkzc" Mar 19 17:56:37 crc kubenswrapper[4918]: I0319 17:56:37.619744 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfsqn\" (UniqueName: \"kubernetes.io/projected/0a27c801-49bb-455d-ba87-bcabc906491f-kube-api-access-gfsqn\") pod \"redhat-marketplace-mbkzc\" (UID: \"0a27c801-49bb-455d-ba87-bcabc906491f\") " pod="openshift-marketplace/redhat-marketplace-mbkzc" Mar 19 17:56:37 crc kubenswrapper[4918]: I0319 17:56:37.620204 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a27c801-49bb-455d-ba87-bcabc906491f-catalog-content\") pod \"redhat-marketplace-mbkzc\" (UID: \"0a27c801-49bb-455d-ba87-bcabc906491f\") " pod="openshift-marketplace/redhat-marketplace-mbkzc" Mar 19 17:56:37 crc kubenswrapper[4918]: I0319 17:56:37.620219 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a27c801-49bb-455d-ba87-bcabc906491f-utilities\") pod \"redhat-marketplace-mbkzc\" (UID: \"0a27c801-49bb-455d-ba87-bcabc906491f\") " pod="openshift-marketplace/redhat-marketplace-mbkzc" Mar 19 17:56:37 crc kubenswrapper[4918]: I0319 17:56:37.994376 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfsqn\" (UniqueName: \"kubernetes.io/projected/0a27c801-49bb-455d-ba87-bcabc906491f-kube-api-access-gfsqn\") pod \"redhat-marketplace-mbkzc\" (UID: \"0a27c801-49bb-455d-ba87-bcabc906491f\") " pod="openshift-marketplace/redhat-marketplace-mbkzc" Mar 19 17:56:38 crc kubenswrapper[4918]: I0319 17:56:38.009199 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mbkzc" Mar 19 17:56:38 crc kubenswrapper[4918]: I0319 17:56:38.842182 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mbkzc"] Mar 19 17:56:39 crc kubenswrapper[4918]: I0319 17:56:39.290079 4918 generic.go:334] "Generic (PLEG): container finished" podID="0a27c801-49bb-455d-ba87-bcabc906491f" containerID="df11e55e4c8343f22a0808bfbdef3041fc40a0c750b869823f4881b01ee8f7d8" exitCode=0 Mar 19 17:56:39 crc kubenswrapper[4918]: I0319 17:56:39.290191 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mbkzc" event={"ID":"0a27c801-49bb-455d-ba87-bcabc906491f","Type":"ContainerDied","Data":"df11e55e4c8343f22a0808bfbdef3041fc40a0c750b869823f4881b01ee8f7d8"} Mar 19 17:56:39 crc kubenswrapper[4918]: I0319 17:56:39.290323 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mbkzc" event={"ID":"0a27c801-49bb-455d-ba87-bcabc906491f","Type":"ContainerStarted","Data":"3655eab33eab86ef3fbf807cdf0563d4890355e681123ba8a256dd7db631abb1"} Mar 19 17:56:41 crc kubenswrapper[4918]: I0319 17:56:41.315130 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mbkzc" event={"ID":"0a27c801-49bb-455d-ba87-bcabc906491f","Type":"ContainerStarted","Data":"67f668f013bdb61588650858d4f55688a1d51fd3881e78cff2db890942b35d1a"} Mar 19 17:56:42 crc kubenswrapper[4918]: I0319 17:56:42.325815 4918 generic.go:334] "Generic (PLEG): container finished" podID="0a27c801-49bb-455d-ba87-bcabc906491f" containerID="67f668f013bdb61588650858d4f55688a1d51fd3881e78cff2db890942b35d1a" exitCode=0 Mar 19 17:56:42 crc kubenswrapper[4918]: I0319 17:56:42.325859 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mbkzc" 
event={"ID":"0a27c801-49bb-455d-ba87-bcabc906491f","Type":"ContainerDied","Data":"67f668f013bdb61588650858d4f55688a1d51fd3881e78cff2db890942b35d1a"} Mar 19 17:56:43 crc kubenswrapper[4918]: I0319 17:56:43.350046 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mbkzc" event={"ID":"0a27c801-49bb-455d-ba87-bcabc906491f","Type":"ContainerStarted","Data":"33a31230fa92aa843a4946be12bbcaefc4ce2c45dbc6a99f816d03a849552705"} Mar 19 17:56:43 crc kubenswrapper[4918]: I0319 17:56:43.392325 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mbkzc" podStartSLOduration=2.932228936 podStartE2EDuration="6.392302076s" podCreationTimestamp="2026-03-19 17:56:37 +0000 UTC" firstStartedPulling="2026-03-19 17:56:39.291695412 +0000 UTC m=+4611.413894660" lastFinishedPulling="2026-03-19 17:56:42.751768562 +0000 UTC m=+4614.873967800" observedRunningTime="2026-03-19 17:56:43.378777504 +0000 UTC m=+4615.500976752" watchObservedRunningTime="2026-03-19 17:56:43.392302076 +0000 UTC m=+4615.514501324" Mar 19 17:56:48 crc kubenswrapper[4918]: I0319 17:56:48.009990 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mbkzc" Mar 19 17:56:48 crc kubenswrapper[4918]: I0319 17:56:48.010578 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mbkzc" Mar 19 17:56:49 crc kubenswrapper[4918]: I0319 17:56:49.062114 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-mbkzc" podUID="0a27c801-49bb-455d-ba87-bcabc906491f" containerName="registry-server" probeResult="failure" output=< Mar 19 17:56:49 crc kubenswrapper[4918]: timeout: failed to connect service ":50051" within 1s Mar 19 17:56:49 crc kubenswrapper[4918]: > Mar 19 17:56:50 crc kubenswrapper[4918]: I0319 17:56:50.586391 4918 scope.go:117] 
"RemoveContainer" containerID="3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3" Mar 19 17:56:50 crc kubenswrapper[4918]: E0319 17:56:50.586879 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:56:58 crc kubenswrapper[4918]: I0319 17:56:58.055488 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mbkzc" Mar 19 17:56:58 crc kubenswrapper[4918]: I0319 17:56:58.108544 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mbkzc" Mar 19 17:56:58 crc kubenswrapper[4918]: I0319 17:56:58.297824 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mbkzc"] Mar 19 17:56:59 crc kubenswrapper[4918]: I0319 17:56:59.507224 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mbkzc" podUID="0a27c801-49bb-455d-ba87-bcabc906491f" containerName="registry-server" containerID="cri-o://33a31230fa92aa843a4946be12bbcaefc4ce2c45dbc6a99f816d03a849552705" gracePeriod=2 Mar 19 17:57:00 crc kubenswrapper[4918]: I0319 17:57:00.518536 4918 generic.go:334] "Generic (PLEG): container finished" podID="0a27c801-49bb-455d-ba87-bcabc906491f" containerID="33a31230fa92aa843a4946be12bbcaefc4ce2c45dbc6a99f816d03a849552705" exitCode=0 Mar 19 17:57:00 crc kubenswrapper[4918]: I0319 17:57:00.518557 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mbkzc" 
event={"ID":"0a27c801-49bb-455d-ba87-bcabc906491f","Type":"ContainerDied","Data":"33a31230fa92aa843a4946be12bbcaefc4ce2c45dbc6a99f816d03a849552705"} Mar 19 17:57:00 crc kubenswrapper[4918]: I0319 17:57:00.711202 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mbkzc" Mar 19 17:57:00 crc kubenswrapper[4918]: I0319 17:57:00.799433 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a27c801-49bb-455d-ba87-bcabc906491f-catalog-content\") pod \"0a27c801-49bb-455d-ba87-bcabc906491f\" (UID: \"0a27c801-49bb-455d-ba87-bcabc906491f\") " Mar 19 17:57:00 crc kubenswrapper[4918]: I0319 17:57:00.799490 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfsqn\" (UniqueName: \"kubernetes.io/projected/0a27c801-49bb-455d-ba87-bcabc906491f-kube-api-access-gfsqn\") pod \"0a27c801-49bb-455d-ba87-bcabc906491f\" (UID: \"0a27c801-49bb-455d-ba87-bcabc906491f\") " Mar 19 17:57:00 crc kubenswrapper[4918]: I0319 17:57:00.799774 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a27c801-49bb-455d-ba87-bcabc906491f-utilities\") pod \"0a27c801-49bb-455d-ba87-bcabc906491f\" (UID: \"0a27c801-49bb-455d-ba87-bcabc906491f\") " Mar 19 17:57:00 crc kubenswrapper[4918]: I0319 17:57:00.801566 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a27c801-49bb-455d-ba87-bcabc906491f-utilities" (OuterVolumeSpecName: "utilities") pod "0a27c801-49bb-455d-ba87-bcabc906491f" (UID: "0a27c801-49bb-455d-ba87-bcabc906491f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:57:00 crc kubenswrapper[4918]: I0319 17:57:00.806037 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a27c801-49bb-455d-ba87-bcabc906491f-kube-api-access-gfsqn" (OuterVolumeSpecName: "kube-api-access-gfsqn") pod "0a27c801-49bb-455d-ba87-bcabc906491f" (UID: "0a27c801-49bb-455d-ba87-bcabc906491f"). InnerVolumeSpecName "kube-api-access-gfsqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:57:00 crc kubenswrapper[4918]: I0319 17:57:00.827170 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a27c801-49bb-455d-ba87-bcabc906491f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0a27c801-49bb-455d-ba87-bcabc906491f" (UID: "0a27c801-49bb-455d-ba87-bcabc906491f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:57:00 crc kubenswrapper[4918]: I0319 17:57:00.902209 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a27c801-49bb-455d-ba87-bcabc906491f-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 17:57:00 crc kubenswrapper[4918]: I0319 17:57:00.902244 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a27c801-49bb-455d-ba87-bcabc906491f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 17:57:00 crc kubenswrapper[4918]: I0319 17:57:00.902263 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfsqn\" (UniqueName: \"kubernetes.io/projected/0a27c801-49bb-455d-ba87-bcabc906491f-kube-api-access-gfsqn\") on node \"crc\" DevicePath \"\"" Mar 19 17:57:01 crc kubenswrapper[4918]: I0319 17:57:01.531884 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mbkzc" 
event={"ID":"0a27c801-49bb-455d-ba87-bcabc906491f","Type":"ContainerDied","Data":"3655eab33eab86ef3fbf807cdf0563d4890355e681123ba8a256dd7db631abb1"} Mar 19 17:57:01 crc kubenswrapper[4918]: I0319 17:57:01.531935 4918 scope.go:117] "RemoveContainer" containerID="33a31230fa92aa843a4946be12bbcaefc4ce2c45dbc6a99f816d03a849552705" Mar 19 17:57:01 crc kubenswrapper[4918]: I0319 17:57:01.532050 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mbkzc" Mar 19 17:57:01 crc kubenswrapper[4918]: I0319 17:57:01.564795 4918 scope.go:117] "RemoveContainer" containerID="67f668f013bdb61588650858d4f55688a1d51fd3881e78cff2db890942b35d1a" Mar 19 17:57:01 crc kubenswrapper[4918]: I0319 17:57:01.587658 4918 scope.go:117] "RemoveContainer" containerID="df11e55e4c8343f22a0808bfbdef3041fc40a0c750b869823f4881b01ee8f7d8" Mar 19 17:57:01 crc kubenswrapper[4918]: I0319 17:57:01.630781 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mbkzc"] Mar 19 17:57:01 crc kubenswrapper[4918]: I0319 17:57:01.659857 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mbkzc"] Mar 19 17:57:02 crc kubenswrapper[4918]: I0319 17:57:02.597986 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a27c801-49bb-455d-ba87-bcabc906491f" path="/var/lib/kubelet/pods/0a27c801-49bb-455d-ba87-bcabc906491f/volumes" Mar 19 17:57:04 crc kubenswrapper[4918]: I0319 17:57:04.586683 4918 scope.go:117] "RemoveContainer" containerID="3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3" Mar 19 17:57:04 crc kubenswrapper[4918]: E0319 17:57:04.587651 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:57:19 crc kubenswrapper[4918]: I0319 17:57:19.586637 4918 scope.go:117] "RemoveContainer" containerID="3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3" Mar 19 17:57:19 crc kubenswrapper[4918]: E0319 17:57:19.587384 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 17:57:31 crc kubenswrapper[4918]: I0319 17:57:31.586091 4918 scope.go:117] "RemoveContainer" containerID="3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3" Mar 19 17:57:31 crc kubenswrapper[4918]: I0319 17:57:31.819685 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerStarted","Data":"683e2842523828a0b01acdee58435737f1109527ab15ad1352239626a83f2287"} Mar 19 17:58:00 crc kubenswrapper[4918]: I0319 17:58:00.142031 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565718-cwrrf"] Mar 19 17:58:00 crc kubenswrapper[4918]: E0319 17:58:00.143927 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a27c801-49bb-455d-ba87-bcabc906491f" containerName="registry-server" Mar 19 17:58:00 crc kubenswrapper[4918]: I0319 17:58:00.143998 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a27c801-49bb-455d-ba87-bcabc906491f" containerName="registry-server" Mar 19 17:58:00 crc 
kubenswrapper[4918]: E0319 17:58:00.144067 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a27c801-49bb-455d-ba87-bcabc906491f" containerName="extract-utilities" Mar 19 17:58:00 crc kubenswrapper[4918]: I0319 17:58:00.144124 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a27c801-49bb-455d-ba87-bcabc906491f" containerName="extract-utilities" Mar 19 17:58:00 crc kubenswrapper[4918]: E0319 17:58:00.144206 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a27c801-49bb-455d-ba87-bcabc906491f" containerName="extract-content" Mar 19 17:58:00 crc kubenswrapper[4918]: I0319 17:58:00.144264 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a27c801-49bb-455d-ba87-bcabc906491f" containerName="extract-content" Mar 19 17:58:00 crc kubenswrapper[4918]: I0319 17:58:00.144508 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a27c801-49bb-455d-ba87-bcabc906491f" containerName="registry-server" Mar 19 17:58:00 crc kubenswrapper[4918]: I0319 17:58:00.145294 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565718-cwrrf" Mar 19 17:58:00 crc kubenswrapper[4918]: I0319 17:58:00.148273 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 17:58:00 crc kubenswrapper[4918]: I0319 17:58:00.149119 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 17:58:00 crc kubenswrapper[4918]: I0319 17:58:00.152555 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 17:58:00 crc kubenswrapper[4918]: I0319 17:58:00.155034 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565718-cwrrf"] Mar 19 17:58:00 crc kubenswrapper[4918]: I0319 17:58:00.184489 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p4pz\" (UniqueName: \"kubernetes.io/projected/315ac8a3-68b8-445f-9ed2-6c1e7c541456-kube-api-access-7p4pz\") pod \"auto-csr-approver-29565718-cwrrf\" (UID: \"315ac8a3-68b8-445f-9ed2-6c1e7c541456\") " pod="openshift-infra/auto-csr-approver-29565718-cwrrf" Mar 19 17:58:00 crc kubenswrapper[4918]: I0319 17:58:00.286872 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p4pz\" (UniqueName: \"kubernetes.io/projected/315ac8a3-68b8-445f-9ed2-6c1e7c541456-kube-api-access-7p4pz\") pod \"auto-csr-approver-29565718-cwrrf\" (UID: \"315ac8a3-68b8-445f-9ed2-6c1e7c541456\") " pod="openshift-infra/auto-csr-approver-29565718-cwrrf" Mar 19 17:58:00 crc kubenswrapper[4918]: I0319 17:58:00.315290 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p4pz\" (UniqueName: \"kubernetes.io/projected/315ac8a3-68b8-445f-9ed2-6c1e7c541456-kube-api-access-7p4pz\") pod \"auto-csr-approver-29565718-cwrrf\" (UID: \"315ac8a3-68b8-445f-9ed2-6c1e7c541456\") " 
pod="openshift-infra/auto-csr-approver-29565718-cwrrf" Mar 19 17:58:00 crc kubenswrapper[4918]: I0319 17:58:00.465462 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565718-cwrrf" Mar 19 17:58:01 crc kubenswrapper[4918]: I0319 17:58:01.335058 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565718-cwrrf"] Mar 19 17:58:02 crc kubenswrapper[4918]: I0319 17:58:02.194556 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565718-cwrrf" event={"ID":"315ac8a3-68b8-445f-9ed2-6c1e7c541456","Type":"ContainerStarted","Data":"152dee7e9d98577e61d2e263afc5f8103f18543965539a9ed9612a3b4f768a64"} Mar 19 17:58:03 crc kubenswrapper[4918]: I0319 17:58:03.203738 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565718-cwrrf" event={"ID":"315ac8a3-68b8-445f-9ed2-6c1e7c541456","Type":"ContainerStarted","Data":"514ab7679ede40f3ca2e4a469d4ca1f340e298ee7139e8907d13edf484ee5989"} Mar 19 17:58:03 crc kubenswrapper[4918]: I0319 17:58:03.222547 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565718-cwrrf" podStartSLOduration=1.841780617 podStartE2EDuration="3.222532425s" podCreationTimestamp="2026-03-19 17:58:00 +0000 UTC" firstStartedPulling="2026-03-19 17:58:01.378870894 +0000 UTC m=+4693.501070142" lastFinishedPulling="2026-03-19 17:58:02.759622702 +0000 UTC m=+4694.881821950" observedRunningTime="2026-03-19 17:58:03.215156952 +0000 UTC m=+4695.337356200" watchObservedRunningTime="2026-03-19 17:58:03.222532425 +0000 UTC m=+4695.344731673" Mar 19 17:58:05 crc kubenswrapper[4918]: I0319 17:58:05.221182 4918 generic.go:334] "Generic (PLEG): container finished" podID="315ac8a3-68b8-445f-9ed2-6c1e7c541456" containerID="514ab7679ede40f3ca2e4a469d4ca1f340e298ee7139e8907d13edf484ee5989" exitCode=0 Mar 19 17:58:05 crc 
kubenswrapper[4918]: I0319 17:58:05.221265 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565718-cwrrf" event={"ID":"315ac8a3-68b8-445f-9ed2-6c1e7c541456","Type":"ContainerDied","Data":"514ab7679ede40f3ca2e4a469d4ca1f340e298ee7139e8907d13edf484ee5989"} Mar 19 17:58:07 crc kubenswrapper[4918]: I0319 17:58:07.538848 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565718-cwrrf" Mar 19 17:58:07 crc kubenswrapper[4918]: I0319 17:58:07.640421 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7p4pz\" (UniqueName: \"kubernetes.io/projected/315ac8a3-68b8-445f-9ed2-6c1e7c541456-kube-api-access-7p4pz\") pod \"315ac8a3-68b8-445f-9ed2-6c1e7c541456\" (UID: \"315ac8a3-68b8-445f-9ed2-6c1e7c541456\") " Mar 19 17:58:07 crc kubenswrapper[4918]: I0319 17:58:07.648691 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/315ac8a3-68b8-445f-9ed2-6c1e7c541456-kube-api-access-7p4pz" (OuterVolumeSpecName: "kube-api-access-7p4pz") pod "315ac8a3-68b8-445f-9ed2-6c1e7c541456" (UID: "315ac8a3-68b8-445f-9ed2-6c1e7c541456"). InnerVolumeSpecName "kube-api-access-7p4pz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:58:07 crc kubenswrapper[4918]: I0319 17:58:07.744196 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7p4pz\" (UniqueName: \"kubernetes.io/projected/315ac8a3-68b8-445f-9ed2-6c1e7c541456-kube-api-access-7p4pz\") on node \"crc\" DevicePath \"\"" Mar 19 17:58:08 crc kubenswrapper[4918]: I0319 17:58:08.245921 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565718-cwrrf" event={"ID":"315ac8a3-68b8-445f-9ed2-6c1e7c541456","Type":"ContainerDied","Data":"152dee7e9d98577e61d2e263afc5f8103f18543965539a9ed9612a3b4f768a64"} Mar 19 17:58:08 crc kubenswrapper[4918]: I0319 17:58:08.246180 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="152dee7e9d98577e61d2e263afc5f8103f18543965539a9ed9612a3b4f768a64" Mar 19 17:58:08 crc kubenswrapper[4918]: I0319 17:58:08.245988 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565718-cwrrf" Mar 19 17:58:08 crc kubenswrapper[4918]: I0319 17:58:08.616021 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565712-6mmf7"] Mar 19 17:58:08 crc kubenswrapper[4918]: I0319 17:58:08.625937 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565712-6mmf7"] Mar 19 17:58:10 crc kubenswrapper[4918]: I0319 17:58:10.597933 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d930d5a5-8b98-4d00-91b1-9010cd70d92a" path="/var/lib/kubelet/pods/d930d5a5-8b98-4d00-91b1-9010cd70d92a/volumes" Mar 19 17:58:13 crc kubenswrapper[4918]: I0319 17:58:13.245225 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4h2gh"] Mar 19 17:58:13 crc kubenswrapper[4918]: E0319 17:58:13.246113 4918 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="315ac8a3-68b8-445f-9ed2-6c1e7c541456" containerName="oc" Mar 19 17:58:13 crc kubenswrapper[4918]: I0319 17:58:13.246130 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="315ac8a3-68b8-445f-9ed2-6c1e7c541456" containerName="oc" Mar 19 17:58:13 crc kubenswrapper[4918]: I0319 17:58:13.246414 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="315ac8a3-68b8-445f-9ed2-6c1e7c541456" containerName="oc" Mar 19 17:58:13 crc kubenswrapper[4918]: I0319 17:58:13.248944 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4h2gh" Mar 19 17:58:13 crc kubenswrapper[4918]: I0319 17:58:13.262403 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4h2gh"] Mar 19 17:58:13 crc kubenswrapper[4918]: I0319 17:58:13.350782 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b78f376-5737-477f-8c02-e9b47627cd52-utilities\") pod \"certified-operators-4h2gh\" (UID: \"0b78f376-5737-477f-8c02-e9b47627cd52\") " pod="openshift-marketplace/certified-operators-4h2gh" Mar 19 17:58:13 crc kubenswrapper[4918]: I0319 17:58:13.350868 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvgss\" (UniqueName: \"kubernetes.io/projected/0b78f376-5737-477f-8c02-e9b47627cd52-kube-api-access-lvgss\") pod \"certified-operators-4h2gh\" (UID: \"0b78f376-5737-477f-8c02-e9b47627cd52\") " pod="openshift-marketplace/certified-operators-4h2gh" Mar 19 17:58:13 crc kubenswrapper[4918]: I0319 17:58:13.350917 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b78f376-5737-477f-8c02-e9b47627cd52-catalog-content\") pod \"certified-operators-4h2gh\" (UID: \"0b78f376-5737-477f-8c02-e9b47627cd52\") " 
pod="openshift-marketplace/certified-operators-4h2gh" Mar 19 17:58:13 crc kubenswrapper[4918]: I0319 17:58:13.453006 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b78f376-5737-477f-8c02-e9b47627cd52-utilities\") pod \"certified-operators-4h2gh\" (UID: \"0b78f376-5737-477f-8c02-e9b47627cd52\") " pod="openshift-marketplace/certified-operators-4h2gh" Mar 19 17:58:13 crc kubenswrapper[4918]: I0319 17:58:13.453091 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvgss\" (UniqueName: \"kubernetes.io/projected/0b78f376-5737-477f-8c02-e9b47627cd52-kube-api-access-lvgss\") pod \"certified-operators-4h2gh\" (UID: \"0b78f376-5737-477f-8c02-e9b47627cd52\") " pod="openshift-marketplace/certified-operators-4h2gh" Mar 19 17:58:13 crc kubenswrapper[4918]: I0319 17:58:13.453140 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b78f376-5737-477f-8c02-e9b47627cd52-catalog-content\") pod \"certified-operators-4h2gh\" (UID: \"0b78f376-5737-477f-8c02-e9b47627cd52\") " pod="openshift-marketplace/certified-operators-4h2gh" Mar 19 17:58:13 crc kubenswrapper[4918]: I0319 17:58:13.453685 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b78f376-5737-477f-8c02-e9b47627cd52-utilities\") pod \"certified-operators-4h2gh\" (UID: \"0b78f376-5737-477f-8c02-e9b47627cd52\") " pod="openshift-marketplace/certified-operators-4h2gh" Mar 19 17:58:13 crc kubenswrapper[4918]: I0319 17:58:13.454084 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b78f376-5737-477f-8c02-e9b47627cd52-catalog-content\") pod \"certified-operators-4h2gh\" (UID: \"0b78f376-5737-477f-8c02-e9b47627cd52\") " 
pod="openshift-marketplace/certified-operators-4h2gh" Mar 19 17:58:13 crc kubenswrapper[4918]: I0319 17:58:13.472893 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvgss\" (UniqueName: \"kubernetes.io/projected/0b78f376-5737-477f-8c02-e9b47627cd52-kube-api-access-lvgss\") pod \"certified-operators-4h2gh\" (UID: \"0b78f376-5737-477f-8c02-e9b47627cd52\") " pod="openshift-marketplace/certified-operators-4h2gh" Mar 19 17:58:13 crc kubenswrapper[4918]: I0319 17:58:13.590072 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4h2gh" Mar 19 17:58:14 crc kubenswrapper[4918]: I0319 17:58:14.464230 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4h2gh"] Mar 19 17:58:15 crc kubenswrapper[4918]: I0319 17:58:15.323450 4918 generic.go:334] "Generic (PLEG): container finished" podID="0b78f376-5737-477f-8c02-e9b47627cd52" containerID="7494b8a0a89b2f9dace2f660dd9b4fbad6aff52a21f9d7efeb7190ab4e5a47cc" exitCode=0 Mar 19 17:58:15 crc kubenswrapper[4918]: I0319 17:58:15.323699 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4h2gh" event={"ID":"0b78f376-5737-477f-8c02-e9b47627cd52","Type":"ContainerDied","Data":"7494b8a0a89b2f9dace2f660dd9b4fbad6aff52a21f9d7efeb7190ab4e5a47cc"} Mar 19 17:58:15 crc kubenswrapper[4918]: I0319 17:58:15.323725 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4h2gh" event={"ID":"0b78f376-5737-477f-8c02-e9b47627cd52","Type":"ContainerStarted","Data":"cb398d3bc26fb15f23475697fe4136a14b7cb4573789a381b07fad873f39f243"} Mar 19 17:58:15 crc kubenswrapper[4918]: I0319 17:58:15.721259 4918 scope.go:117] "RemoveContainer" containerID="fcc836b9043aa89cdd0e32658af25010ba8be06578bd924576762e8f074b787f" Mar 19 17:58:17 crc kubenswrapper[4918]: I0319 17:58:17.365057 4918 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-4h2gh" event={"ID":"0b78f376-5737-477f-8c02-e9b47627cd52","Type":"ContainerStarted","Data":"93a09da66b1bb639d47ea614c5ab09ccb5281aa044a0cd383944045228695649"} Mar 19 17:58:19 crc kubenswrapper[4918]: I0319 17:58:19.383324 4918 generic.go:334] "Generic (PLEG): container finished" podID="0b78f376-5737-477f-8c02-e9b47627cd52" containerID="93a09da66b1bb639d47ea614c5ab09ccb5281aa044a0cd383944045228695649" exitCode=0 Mar 19 17:58:19 crc kubenswrapper[4918]: I0319 17:58:19.383375 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4h2gh" event={"ID":"0b78f376-5737-477f-8c02-e9b47627cd52","Type":"ContainerDied","Data":"93a09da66b1bb639d47ea614c5ab09ccb5281aa044a0cd383944045228695649"} Mar 19 17:58:20 crc kubenswrapper[4918]: I0319 17:58:20.395602 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4h2gh" event={"ID":"0b78f376-5737-477f-8c02-e9b47627cd52","Type":"ContainerStarted","Data":"bbd91791f58f38a8fca3b2d7f2b01dc537194ae0d173852c909dfe4087786ef5"} Mar 19 17:58:20 crc kubenswrapper[4918]: I0319 17:58:20.412721 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4h2gh" podStartSLOduration=2.931739713 podStartE2EDuration="7.412699596s" podCreationTimestamp="2026-03-19 17:58:13 +0000 UTC" firstStartedPulling="2026-03-19 17:58:15.325207523 +0000 UTC m=+4707.447406771" lastFinishedPulling="2026-03-19 17:58:19.806167406 +0000 UTC m=+4711.928366654" observedRunningTime="2026-03-19 17:58:20.409464957 +0000 UTC m=+4712.531664225" watchObservedRunningTime="2026-03-19 17:58:20.412699596 +0000 UTC m=+4712.534898844" Mar 19 17:58:23 crc kubenswrapper[4918]: I0319 17:58:23.591630 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4h2gh" Mar 19 17:58:23 crc kubenswrapper[4918]: I0319 
17:58:23.593183 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4h2gh" Mar 19 17:58:23 crc kubenswrapper[4918]: I0319 17:58:23.646105 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4h2gh" Mar 19 17:58:25 crc kubenswrapper[4918]: I0319 17:58:25.494216 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4h2gh" Mar 19 17:58:25 crc kubenswrapper[4918]: I0319 17:58:25.568261 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4h2gh"] Mar 19 17:58:27 crc kubenswrapper[4918]: I0319 17:58:27.449278 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4h2gh" podUID="0b78f376-5737-477f-8c02-e9b47627cd52" containerName="registry-server" containerID="cri-o://bbd91791f58f38a8fca3b2d7f2b01dc537194ae0d173852c909dfe4087786ef5" gracePeriod=2 Mar 19 17:58:28 crc kubenswrapper[4918]: I0319 17:58:28.458706 4918 generic.go:334] "Generic (PLEG): container finished" podID="0b78f376-5737-477f-8c02-e9b47627cd52" containerID="bbd91791f58f38a8fca3b2d7f2b01dc537194ae0d173852c909dfe4087786ef5" exitCode=0 Mar 19 17:58:28 crc kubenswrapper[4918]: I0319 17:58:28.458784 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4h2gh" event={"ID":"0b78f376-5737-477f-8c02-e9b47627cd52","Type":"ContainerDied","Data":"bbd91791f58f38a8fca3b2d7f2b01dc537194ae0d173852c909dfe4087786ef5"} Mar 19 17:58:28 crc kubenswrapper[4918]: I0319 17:58:28.796630 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4h2gh" Mar 19 17:58:28 crc kubenswrapper[4918]: I0319 17:58:28.882920 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvgss\" (UniqueName: \"kubernetes.io/projected/0b78f376-5737-477f-8c02-e9b47627cd52-kube-api-access-lvgss\") pod \"0b78f376-5737-477f-8c02-e9b47627cd52\" (UID: \"0b78f376-5737-477f-8c02-e9b47627cd52\") " Mar 19 17:58:28 crc kubenswrapper[4918]: I0319 17:58:28.882969 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b78f376-5737-477f-8c02-e9b47627cd52-catalog-content\") pod \"0b78f376-5737-477f-8c02-e9b47627cd52\" (UID: \"0b78f376-5737-477f-8c02-e9b47627cd52\") " Mar 19 17:58:28 crc kubenswrapper[4918]: I0319 17:58:28.883264 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b78f376-5737-477f-8c02-e9b47627cd52-utilities\") pod \"0b78f376-5737-477f-8c02-e9b47627cd52\" (UID: \"0b78f376-5737-477f-8c02-e9b47627cd52\") " Mar 19 17:58:28 crc kubenswrapper[4918]: I0319 17:58:28.884140 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b78f376-5737-477f-8c02-e9b47627cd52-utilities" (OuterVolumeSpecName: "utilities") pod "0b78f376-5737-477f-8c02-e9b47627cd52" (UID: "0b78f376-5737-477f-8c02-e9b47627cd52"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:58:28 crc kubenswrapper[4918]: I0319 17:58:28.884668 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b78f376-5737-477f-8c02-e9b47627cd52-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 17:58:28 crc kubenswrapper[4918]: I0319 17:58:28.908414 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78f376-5737-477f-8c02-e9b47627cd52-kube-api-access-lvgss" (OuterVolumeSpecName: "kube-api-access-lvgss") pod "0b78f376-5737-477f-8c02-e9b47627cd52" (UID: "0b78f376-5737-477f-8c02-e9b47627cd52"). InnerVolumeSpecName "kube-api-access-lvgss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 17:58:28 crc kubenswrapper[4918]: I0319 17:58:28.955057 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b78f376-5737-477f-8c02-e9b47627cd52-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b78f376-5737-477f-8c02-e9b47627cd52" (UID: "0b78f376-5737-477f-8c02-e9b47627cd52"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 17:58:28 crc kubenswrapper[4918]: I0319 17:58:28.987091 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvgss\" (UniqueName: \"kubernetes.io/projected/0b78f376-5737-477f-8c02-e9b47627cd52-kube-api-access-lvgss\") on node \"crc\" DevicePath \"\"" Mar 19 17:58:28 crc kubenswrapper[4918]: I0319 17:58:28.987349 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b78f376-5737-477f-8c02-e9b47627cd52-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 17:58:29 crc kubenswrapper[4918]: I0319 17:58:29.468731 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4h2gh" event={"ID":"0b78f376-5737-477f-8c02-e9b47627cd52","Type":"ContainerDied","Data":"cb398d3bc26fb15f23475697fe4136a14b7cb4573789a381b07fad873f39f243"} Mar 19 17:58:29 crc kubenswrapper[4918]: I0319 17:58:29.468787 4918 scope.go:117] "RemoveContainer" containerID="bbd91791f58f38a8fca3b2d7f2b01dc537194ae0d173852c909dfe4087786ef5" Mar 19 17:58:29 crc kubenswrapper[4918]: I0319 17:58:29.468825 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4h2gh" Mar 19 17:58:29 crc kubenswrapper[4918]: I0319 17:58:29.511938 4918 scope.go:117] "RemoveContainer" containerID="93a09da66b1bb639d47ea614c5ab09ccb5281aa044a0cd383944045228695649" Mar 19 17:58:29 crc kubenswrapper[4918]: I0319 17:58:29.536549 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4h2gh"] Mar 19 17:58:29 crc kubenswrapper[4918]: I0319 17:58:29.551482 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4h2gh"] Mar 19 17:58:29 crc kubenswrapper[4918]: I0319 17:58:29.583357 4918 scope.go:117] "RemoveContainer" containerID="7494b8a0a89b2f9dace2f660dd9b4fbad6aff52a21f9d7efeb7190ab4e5a47cc" Mar 19 17:58:30 crc kubenswrapper[4918]: I0319 17:58:30.596348 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78f376-5737-477f-8c02-e9b47627cd52" path="/var/lib/kubelet/pods/0b78f376-5737-477f-8c02-e9b47627cd52/volumes" Mar 19 17:59:58 crc kubenswrapper[4918]: I0319 17:59:58.211762 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 17:59:58 crc kubenswrapper[4918]: I0319 17:59:58.212191 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.147226 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565720-4pr9h"] Mar 19 18:00:00 crc kubenswrapper[4918]: E0319 
18:00:00.148090 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b78f376-5737-477f-8c02-e9b47627cd52" containerName="extract-utilities" Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.148101 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b78f376-5737-477f-8c02-e9b47627cd52" containerName="extract-utilities" Mar 19 18:00:00 crc kubenswrapper[4918]: E0319 18:00:00.148128 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b78f376-5737-477f-8c02-e9b47627cd52" containerName="registry-server" Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.148135 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b78f376-5737-477f-8c02-e9b47627cd52" containerName="registry-server" Mar 19 18:00:00 crc kubenswrapper[4918]: E0319 18:00:00.148164 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b78f376-5737-477f-8c02-e9b47627cd52" containerName="extract-content" Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.148170 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b78f376-5737-477f-8c02-e9b47627cd52" containerName="extract-content" Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.148363 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b78f376-5737-477f-8c02-e9b47627cd52" containerName="registry-server" Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.149043 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565720-4pr9h" Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.153146 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.153301 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.153475 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.157080 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565720-4pr9h"] Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.270716 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565720-fz22m"] Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.272155 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565720-fz22m" Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.273689 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.274468 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.282750 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565720-fz22m"] Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.293607 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjht6\" (UniqueName: \"kubernetes.io/projected/d390c826-1356-447f-b1da-72f57e7accbc-kube-api-access-fjht6\") pod \"auto-csr-approver-29565720-4pr9h\" (UID: \"d390c826-1356-447f-b1da-72f57e7accbc\") " pod="openshift-infra/auto-csr-approver-29565720-4pr9h" Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.293710 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1d72809d-a071-401b-a44f-afdcddb39424-secret-volume\") pod \"collect-profiles-29565720-fz22m\" (UID: \"1d72809d-a071-401b-a44f-afdcddb39424\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565720-fz22m" Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.293737 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwwrk\" (UniqueName: \"kubernetes.io/projected/1d72809d-a071-401b-a44f-afdcddb39424-kube-api-access-fwwrk\") pod \"collect-profiles-29565720-fz22m\" (UID: \"1d72809d-a071-401b-a44f-afdcddb39424\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29565720-fz22m" Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.294243 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d72809d-a071-401b-a44f-afdcddb39424-config-volume\") pod \"collect-profiles-29565720-fz22m\" (UID: \"1d72809d-a071-401b-a44f-afdcddb39424\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565720-fz22m" Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.396028 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d72809d-a071-401b-a44f-afdcddb39424-config-volume\") pod \"collect-profiles-29565720-fz22m\" (UID: \"1d72809d-a071-401b-a44f-afdcddb39424\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565720-fz22m" Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.396091 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjht6\" (UniqueName: \"kubernetes.io/projected/d390c826-1356-447f-b1da-72f57e7accbc-kube-api-access-fjht6\") pod \"auto-csr-approver-29565720-4pr9h\" (UID: \"d390c826-1356-447f-b1da-72f57e7accbc\") " pod="openshift-infra/auto-csr-approver-29565720-4pr9h" Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.396161 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1d72809d-a071-401b-a44f-afdcddb39424-secret-volume\") pod \"collect-profiles-29565720-fz22m\" (UID: \"1d72809d-a071-401b-a44f-afdcddb39424\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565720-fz22m" Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.396185 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwwrk\" (UniqueName: 
\"kubernetes.io/projected/1d72809d-a071-401b-a44f-afdcddb39424-kube-api-access-fwwrk\") pod \"collect-profiles-29565720-fz22m\" (UID: \"1d72809d-a071-401b-a44f-afdcddb39424\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565720-fz22m" Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.397241 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d72809d-a071-401b-a44f-afdcddb39424-config-volume\") pod \"collect-profiles-29565720-fz22m\" (UID: \"1d72809d-a071-401b-a44f-afdcddb39424\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565720-fz22m" Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.414463 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1d72809d-a071-401b-a44f-afdcddb39424-secret-volume\") pod \"collect-profiles-29565720-fz22m\" (UID: \"1d72809d-a071-401b-a44f-afdcddb39424\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565720-fz22m" Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.416126 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjht6\" (UniqueName: \"kubernetes.io/projected/d390c826-1356-447f-b1da-72f57e7accbc-kube-api-access-fjht6\") pod \"auto-csr-approver-29565720-4pr9h\" (UID: \"d390c826-1356-447f-b1da-72f57e7accbc\") " pod="openshift-infra/auto-csr-approver-29565720-4pr9h" Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.429173 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwwrk\" (UniqueName: \"kubernetes.io/projected/1d72809d-a071-401b-a44f-afdcddb39424-kube-api-access-fwwrk\") pod \"collect-profiles-29565720-fz22m\" (UID: \"1d72809d-a071-401b-a44f-afdcddb39424\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565720-fz22m" Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.545829 4918 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565720-4pr9h" Mar 19 18:00:00 crc kubenswrapper[4918]: I0319 18:00:00.592062 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565720-fz22m" Mar 19 18:00:01 crc kubenswrapper[4918]: I0319 18:00:01.364339 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565720-4pr9h"] Mar 19 18:00:01 crc kubenswrapper[4918]: I0319 18:00:01.378987 4918 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 18:00:01 crc kubenswrapper[4918]: I0319 18:00:01.728656 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565720-fz22m"] Mar 19 18:00:02 crc kubenswrapper[4918]: I0319 18:00:02.379390 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565720-fz22m" event={"ID":"1d72809d-a071-401b-a44f-afdcddb39424","Type":"ContainerStarted","Data":"58dd627e978499892f29b0e426ff1eac0f671fe39ab6162066a7414f8014efce"} Mar 19 18:00:02 crc kubenswrapper[4918]: I0319 18:00:02.379671 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565720-fz22m" event={"ID":"1d72809d-a071-401b-a44f-afdcddb39424","Type":"ContainerStarted","Data":"62b81ec780785c8b59dadac37a18ddebc56d1ed1def151e4c3969b309528953f"} Mar 19 18:00:02 crc kubenswrapper[4918]: I0319 18:00:02.382571 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565720-4pr9h" event={"ID":"d390c826-1356-447f-b1da-72f57e7accbc","Type":"ContainerStarted","Data":"c0236fd77cb3bfc36d865dece026ee35e4c2588770aeda09a4f3fa90c1810c01"} Mar 19 18:00:02 crc kubenswrapper[4918]: I0319 18:00:02.407574 4918 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29565720-fz22m" podStartSLOduration=2.40755127 podStartE2EDuration="2.40755127s" podCreationTimestamp="2026-03-19 18:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:00:02.394852332 +0000 UTC m=+4814.517051580" watchObservedRunningTime="2026-03-19 18:00:02.40755127 +0000 UTC m=+4814.529750508" Mar 19 18:00:03 crc kubenswrapper[4918]: I0319 18:00:03.394359 4918 generic.go:334] "Generic (PLEG): container finished" podID="1d72809d-a071-401b-a44f-afdcddb39424" containerID="58dd627e978499892f29b0e426ff1eac0f671fe39ab6162066a7414f8014efce" exitCode=0 Mar 19 18:00:03 crc kubenswrapper[4918]: I0319 18:00:03.394515 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565720-fz22m" event={"ID":"1d72809d-a071-401b-a44f-afdcddb39424","Type":"ContainerDied","Data":"58dd627e978499892f29b0e426ff1eac0f671fe39ab6162066a7414f8014efce"} Mar 19 18:00:05 crc kubenswrapper[4918]: I0319 18:00:05.276483 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565720-fz22m" Mar 19 18:00:05 crc kubenswrapper[4918]: I0319 18:00:05.389226 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1d72809d-a071-401b-a44f-afdcddb39424-secret-volume\") pod \"1d72809d-a071-401b-a44f-afdcddb39424\" (UID: \"1d72809d-a071-401b-a44f-afdcddb39424\") " Mar 19 18:00:05 crc kubenswrapper[4918]: I0319 18:00:05.389352 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d72809d-a071-401b-a44f-afdcddb39424-config-volume\") pod \"1d72809d-a071-401b-a44f-afdcddb39424\" (UID: \"1d72809d-a071-401b-a44f-afdcddb39424\") " Mar 19 18:00:05 crc kubenswrapper[4918]: I0319 18:00:05.389444 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwwrk\" (UniqueName: \"kubernetes.io/projected/1d72809d-a071-401b-a44f-afdcddb39424-kube-api-access-fwwrk\") pod \"1d72809d-a071-401b-a44f-afdcddb39424\" (UID: \"1d72809d-a071-401b-a44f-afdcddb39424\") " Mar 19 18:00:05 crc kubenswrapper[4918]: I0319 18:00:05.390322 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d72809d-a071-401b-a44f-afdcddb39424-config-volume" (OuterVolumeSpecName: "config-volume") pod "1d72809d-a071-401b-a44f-afdcddb39424" (UID: "1d72809d-a071-401b-a44f-afdcddb39424"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:00:05 crc kubenswrapper[4918]: I0319 18:00:05.396279 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d72809d-a071-401b-a44f-afdcddb39424-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1d72809d-a071-401b-a44f-afdcddb39424" (UID: "1d72809d-a071-401b-a44f-afdcddb39424"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:00:05 crc kubenswrapper[4918]: I0319 18:00:05.411145 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565720-fz22m" event={"ID":"1d72809d-a071-401b-a44f-afdcddb39424","Type":"ContainerDied","Data":"62b81ec780785c8b59dadac37a18ddebc56d1ed1def151e4c3969b309528953f"} Mar 19 18:00:05 crc kubenswrapper[4918]: I0319 18:00:05.411195 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62b81ec780785c8b59dadac37a18ddebc56d1ed1def151e4c3969b309528953f" Mar 19 18:00:05 crc kubenswrapper[4918]: I0319 18:00:05.411258 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565720-fz22m" Mar 19 18:00:05 crc kubenswrapper[4918]: I0319 18:00:05.413027 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d72809d-a071-401b-a44f-afdcddb39424-kube-api-access-fwwrk" (OuterVolumeSpecName: "kube-api-access-fwwrk") pod "1d72809d-a071-401b-a44f-afdcddb39424" (UID: "1d72809d-a071-401b-a44f-afdcddb39424"). InnerVolumeSpecName "kube-api-access-fwwrk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:00:05 crc kubenswrapper[4918]: I0319 18:00:05.491823 4918 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1d72809d-a071-401b-a44f-afdcddb39424-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 18:00:05 crc kubenswrapper[4918]: I0319 18:00:05.491856 4918 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d72809d-a071-401b-a44f-afdcddb39424-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 18:00:05 crc kubenswrapper[4918]: I0319 18:00:05.491867 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwwrk\" (UniqueName: \"kubernetes.io/projected/1d72809d-a071-401b-a44f-afdcddb39424-kube-api-access-fwwrk\") on node \"crc\" DevicePath \"\"" Mar 19 18:00:06 crc kubenswrapper[4918]: I0319 18:00:06.355749 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565675-d2fpf"] Mar 19 18:00:06 crc kubenswrapper[4918]: I0319 18:00:06.366015 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565675-d2fpf"] Mar 19 18:00:06 crc kubenswrapper[4918]: I0319 18:00:06.597896 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22f4bb2c-9cf8-4750-99ea-9d86d180343f" path="/var/lib/kubelet/pods/22f4bb2c-9cf8-4750-99ea-9d86d180343f/volumes" Mar 19 18:00:09 crc kubenswrapper[4918]: I0319 18:00:09.445360 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565720-4pr9h" event={"ID":"d390c826-1356-447f-b1da-72f57e7accbc","Type":"ContainerStarted","Data":"e76f4b19dcc4df6683b8faa6aa2471c0b84b2c8416b224f2c04e6e26abfdba89"} Mar 19 18:00:09 crc kubenswrapper[4918]: I0319 18:00:09.460447 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29565720-4pr9h" podStartSLOduration=2.011535271 podStartE2EDuration="9.46042947s" podCreationTimestamp="2026-03-19 18:00:00 +0000 UTC" firstStartedPulling="2026-03-19 18:00:01.37869956 +0000 UTC m=+4813.500898808" lastFinishedPulling="2026-03-19 18:00:08.827593759 +0000 UTC m=+4820.949793007" observedRunningTime="2026-03-19 18:00:09.457663224 +0000 UTC m=+4821.579862482" watchObservedRunningTime="2026-03-19 18:00:09.46042947 +0000 UTC m=+4821.582628718" Mar 19 18:00:10 crc kubenswrapper[4918]: I0319 18:00:10.477894 4918 generic.go:334] "Generic (PLEG): container finished" podID="d390c826-1356-447f-b1da-72f57e7accbc" containerID="e76f4b19dcc4df6683b8faa6aa2471c0b84b2c8416b224f2c04e6e26abfdba89" exitCode=0 Mar 19 18:00:10 crc kubenswrapper[4918]: I0319 18:00:10.477973 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565720-4pr9h" event={"ID":"d390c826-1356-447f-b1da-72f57e7accbc","Type":"ContainerDied","Data":"e76f4b19dcc4df6683b8faa6aa2471c0b84b2c8416b224f2c04e6e26abfdba89"} Mar 19 18:00:12 crc kubenswrapper[4918]: I0319 18:00:12.842407 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565720-4pr9h" Mar 19 18:00:12 crc kubenswrapper[4918]: I0319 18:00:12.989570 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjht6\" (UniqueName: \"kubernetes.io/projected/d390c826-1356-447f-b1da-72f57e7accbc-kube-api-access-fjht6\") pod \"d390c826-1356-447f-b1da-72f57e7accbc\" (UID: \"d390c826-1356-447f-b1da-72f57e7accbc\") " Mar 19 18:00:13 crc kubenswrapper[4918]: I0319 18:00:13.011225 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d390c826-1356-447f-b1da-72f57e7accbc-kube-api-access-fjht6" (OuterVolumeSpecName: "kube-api-access-fjht6") pod "d390c826-1356-447f-b1da-72f57e7accbc" (UID: "d390c826-1356-447f-b1da-72f57e7accbc"). InnerVolumeSpecName "kube-api-access-fjht6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:00:13 crc kubenswrapper[4918]: I0319 18:00:13.093251 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjht6\" (UniqueName: \"kubernetes.io/projected/d390c826-1356-447f-b1da-72f57e7accbc-kube-api-access-fjht6\") on node \"crc\" DevicePath \"\"" Mar 19 18:00:13 crc kubenswrapper[4918]: I0319 18:00:13.502647 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565720-4pr9h" event={"ID":"d390c826-1356-447f-b1da-72f57e7accbc","Type":"ContainerDied","Data":"c0236fd77cb3bfc36d865dece026ee35e4c2588770aeda09a4f3fa90c1810c01"} Mar 19 18:00:13 crc kubenswrapper[4918]: I0319 18:00:13.502686 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0236fd77cb3bfc36d865dece026ee35e4c2588770aeda09a4f3fa90c1810c01" Mar 19 18:00:13 crc kubenswrapper[4918]: I0319 18:00:13.502734 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565720-4pr9h" Mar 19 18:00:13 crc kubenswrapper[4918]: I0319 18:00:13.899177 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565714-q44t8"] Mar 19 18:00:13 crc kubenswrapper[4918]: I0319 18:00:13.909051 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565714-q44t8"] Mar 19 18:00:14 crc kubenswrapper[4918]: I0319 18:00:14.599738 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d084b21-34e1-4a69-ab09-1bb2778a2cf6" path="/var/lib/kubelet/pods/0d084b21-34e1-4a69-ab09-1bb2778a2cf6/volumes" Mar 19 18:00:15 crc kubenswrapper[4918]: I0319 18:00:15.822820 4918 scope.go:117] "RemoveContainer" containerID="6da16dc3e579b80b8736e16a0fb2f1d544e642f1704274b7f45275a8008a45a9" Mar 19 18:00:28 crc kubenswrapper[4918]: I0319 18:00:28.211980 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 18:00:28 crc kubenswrapper[4918]: I0319 18:00:28.213599 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 18:00:58 crc kubenswrapper[4918]: I0319 18:00:58.212046 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 18:00:58 crc kubenswrapper[4918]: 
I0319 18:00:58.212560 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 18:00:58 crc kubenswrapper[4918]: I0319 18:00:58.212606 4918 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 18:00:58 crc kubenswrapper[4918]: I0319 18:00:58.213314 4918 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"683e2842523828a0b01acdee58435737f1109527ab15ad1352239626a83f2287"} pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 18:00:58 crc kubenswrapper[4918]: I0319 18:00:58.213357 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" containerID="cri-o://683e2842523828a0b01acdee58435737f1109527ab15ad1352239626a83f2287" gracePeriod=600 Mar 19 18:00:58 crc kubenswrapper[4918]: I0319 18:00:58.883660 4918 generic.go:334] "Generic (PLEG): container finished" podID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerID="683e2842523828a0b01acdee58435737f1109527ab15ad1352239626a83f2287" exitCode=0 Mar 19 18:00:58 crc kubenswrapper[4918]: I0319 18:00:58.883730 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerDied","Data":"683e2842523828a0b01acdee58435737f1109527ab15ad1352239626a83f2287"} Mar 19 18:00:58 crc 
kubenswrapper[4918]: I0319 18:00:58.884493 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerStarted","Data":"9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8"} Mar 19 18:00:58 crc kubenswrapper[4918]: I0319 18:00:58.884581 4918 scope.go:117] "RemoveContainer" containerID="3f04f0e3539ef1bb1dee1d8030ed343b58f75c24af70d98809e037cf2b2004d3" Mar 19 18:01:00 crc kubenswrapper[4918]: I0319 18:01:00.158208 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29565721-vvmng"] Mar 19 18:01:00 crc kubenswrapper[4918]: E0319 18:01:00.160090 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d72809d-a071-401b-a44f-afdcddb39424" containerName="collect-profiles" Mar 19 18:01:00 crc kubenswrapper[4918]: I0319 18:01:00.160110 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d72809d-a071-401b-a44f-afdcddb39424" containerName="collect-profiles" Mar 19 18:01:00 crc kubenswrapper[4918]: E0319 18:01:00.160128 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d390c826-1356-447f-b1da-72f57e7accbc" containerName="oc" Mar 19 18:01:00 crc kubenswrapper[4918]: I0319 18:01:00.160139 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="d390c826-1356-447f-b1da-72f57e7accbc" containerName="oc" Mar 19 18:01:00 crc kubenswrapper[4918]: I0319 18:01:00.160341 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="d390c826-1356-447f-b1da-72f57e7accbc" containerName="oc" Mar 19 18:01:00 crc kubenswrapper[4918]: I0319 18:01:00.160356 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d72809d-a071-401b-a44f-afdcddb39424" containerName="collect-profiles" Mar 19 18:01:00 crc kubenswrapper[4918]: I0319 18:01:00.161135 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29565721-vvmng" Mar 19 18:01:00 crc kubenswrapper[4918]: I0319 18:01:00.167663 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29565721-vvmng"] Mar 19 18:01:00 crc kubenswrapper[4918]: I0319 18:01:00.206871 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a52a09-c672-45bd-b5ed-f5794ac59da9-combined-ca-bundle\") pod \"keystone-cron-29565721-vvmng\" (UID: \"a4a52a09-c672-45bd-b5ed-f5794ac59da9\") " pod="openstack/keystone-cron-29565721-vvmng" Mar 19 18:01:00 crc kubenswrapper[4918]: I0319 18:01:00.206980 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a52a09-c672-45bd-b5ed-f5794ac59da9-config-data\") pod \"keystone-cron-29565721-vvmng\" (UID: \"a4a52a09-c672-45bd-b5ed-f5794ac59da9\") " pod="openstack/keystone-cron-29565721-vvmng" Mar 19 18:01:00 crc kubenswrapper[4918]: I0319 18:01:00.207009 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d72j\" (UniqueName: \"kubernetes.io/projected/a4a52a09-c672-45bd-b5ed-f5794ac59da9-kube-api-access-7d72j\") pod \"keystone-cron-29565721-vvmng\" (UID: \"a4a52a09-c672-45bd-b5ed-f5794ac59da9\") " pod="openstack/keystone-cron-29565721-vvmng" Mar 19 18:01:00 crc kubenswrapper[4918]: I0319 18:01:00.207041 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a4a52a09-c672-45bd-b5ed-f5794ac59da9-fernet-keys\") pod \"keystone-cron-29565721-vvmng\" (UID: \"a4a52a09-c672-45bd-b5ed-f5794ac59da9\") " pod="openstack/keystone-cron-29565721-vvmng" Mar 19 18:01:00 crc kubenswrapper[4918]: I0319 18:01:00.309593 4918 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a52a09-c672-45bd-b5ed-f5794ac59da9-combined-ca-bundle\") pod \"keystone-cron-29565721-vvmng\" (UID: \"a4a52a09-c672-45bd-b5ed-f5794ac59da9\") " pod="openstack/keystone-cron-29565721-vvmng" Mar 19 18:01:00 crc kubenswrapper[4918]: I0319 18:01:00.309682 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a52a09-c672-45bd-b5ed-f5794ac59da9-config-data\") pod \"keystone-cron-29565721-vvmng\" (UID: \"a4a52a09-c672-45bd-b5ed-f5794ac59da9\") " pod="openstack/keystone-cron-29565721-vvmng" Mar 19 18:01:00 crc kubenswrapper[4918]: I0319 18:01:00.309714 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d72j\" (UniqueName: \"kubernetes.io/projected/a4a52a09-c672-45bd-b5ed-f5794ac59da9-kube-api-access-7d72j\") pod \"keystone-cron-29565721-vvmng\" (UID: \"a4a52a09-c672-45bd-b5ed-f5794ac59da9\") " pod="openstack/keystone-cron-29565721-vvmng" Mar 19 18:01:00 crc kubenswrapper[4918]: I0319 18:01:00.309742 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a4a52a09-c672-45bd-b5ed-f5794ac59da9-fernet-keys\") pod \"keystone-cron-29565721-vvmng\" (UID: \"a4a52a09-c672-45bd-b5ed-f5794ac59da9\") " pod="openstack/keystone-cron-29565721-vvmng" Mar 19 18:01:00 crc kubenswrapper[4918]: I0319 18:01:00.314873 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a4a52a09-c672-45bd-b5ed-f5794ac59da9-fernet-keys\") pod \"keystone-cron-29565721-vvmng\" (UID: \"a4a52a09-c672-45bd-b5ed-f5794ac59da9\") " pod="openstack/keystone-cron-29565721-vvmng" Mar 19 18:01:00 crc kubenswrapper[4918]: I0319 18:01:00.316216 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a4a52a09-c672-45bd-b5ed-f5794ac59da9-combined-ca-bundle\") pod \"keystone-cron-29565721-vvmng\" (UID: \"a4a52a09-c672-45bd-b5ed-f5794ac59da9\") " pod="openstack/keystone-cron-29565721-vvmng" Mar 19 18:01:00 crc kubenswrapper[4918]: I0319 18:01:00.319952 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a52a09-c672-45bd-b5ed-f5794ac59da9-config-data\") pod \"keystone-cron-29565721-vvmng\" (UID: \"a4a52a09-c672-45bd-b5ed-f5794ac59da9\") " pod="openstack/keystone-cron-29565721-vvmng" Mar 19 18:01:00 crc kubenswrapper[4918]: I0319 18:01:00.331643 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d72j\" (UniqueName: \"kubernetes.io/projected/a4a52a09-c672-45bd-b5ed-f5794ac59da9-kube-api-access-7d72j\") pod \"keystone-cron-29565721-vvmng\" (UID: \"a4a52a09-c672-45bd-b5ed-f5794ac59da9\") " pod="openstack/keystone-cron-29565721-vvmng" Mar 19 18:01:00 crc kubenswrapper[4918]: I0319 18:01:00.481713 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29565721-vvmng" Mar 19 18:01:01 crc kubenswrapper[4918]: I0319 18:01:01.308727 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29565721-vvmng"] Mar 19 18:01:01 crc kubenswrapper[4918]: I0319 18:01:01.935208 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565721-vvmng" event={"ID":"a4a52a09-c672-45bd-b5ed-f5794ac59da9","Type":"ContainerStarted","Data":"22afbae421c8ad5a85f3fbe5aac1cc0a9ee3bb3cb3442f966af67de883f08e81"} Mar 19 18:01:01 crc kubenswrapper[4918]: I0319 18:01:01.935560 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565721-vvmng" event={"ID":"a4a52a09-c672-45bd-b5ed-f5794ac59da9","Type":"ContainerStarted","Data":"ba6e90611fac54dc20401e75eabc1667cff770e6e0cfea8c6d956d27d7c4132c"} Mar 19 18:01:01 crc kubenswrapper[4918]: I0319 18:01:01.975677 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29565721-vvmng" podStartSLOduration=1.975657318 podStartE2EDuration="1.975657318s" podCreationTimestamp="2026-03-19 18:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:01:01.965026967 +0000 UTC m=+4874.087226205" watchObservedRunningTime="2026-03-19 18:01:01.975657318 +0000 UTC m=+4874.097856566" Mar 19 18:01:06 crc kubenswrapper[4918]: I0319 18:01:06.985475 4918 generic.go:334] "Generic (PLEG): container finished" podID="a4a52a09-c672-45bd-b5ed-f5794ac59da9" containerID="22afbae421c8ad5a85f3fbe5aac1cc0a9ee3bb3cb3442f966af67de883f08e81" exitCode=0 Mar 19 18:01:06 crc kubenswrapper[4918]: I0319 18:01:06.985797 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565721-vvmng" 
event={"ID":"a4a52a09-c672-45bd-b5ed-f5794ac59da9","Type":"ContainerDied","Data":"22afbae421c8ad5a85f3fbe5aac1cc0a9ee3bb3cb3442f966af67de883f08e81"} Mar 19 18:01:09 crc kubenswrapper[4918]: I0319 18:01:09.106186 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29565721-vvmng" Mar 19 18:01:09 crc kubenswrapper[4918]: I0319 18:01:09.269088 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d72j\" (UniqueName: \"kubernetes.io/projected/a4a52a09-c672-45bd-b5ed-f5794ac59da9-kube-api-access-7d72j\") pod \"a4a52a09-c672-45bd-b5ed-f5794ac59da9\" (UID: \"a4a52a09-c672-45bd-b5ed-f5794ac59da9\") " Mar 19 18:01:09 crc kubenswrapper[4918]: I0319 18:01:09.269309 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a52a09-c672-45bd-b5ed-f5794ac59da9-combined-ca-bundle\") pod \"a4a52a09-c672-45bd-b5ed-f5794ac59da9\" (UID: \"a4a52a09-c672-45bd-b5ed-f5794ac59da9\") " Mar 19 18:01:09 crc kubenswrapper[4918]: I0319 18:01:09.269362 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a4a52a09-c672-45bd-b5ed-f5794ac59da9-fernet-keys\") pod \"a4a52a09-c672-45bd-b5ed-f5794ac59da9\" (UID: \"a4a52a09-c672-45bd-b5ed-f5794ac59da9\") " Mar 19 18:01:09 crc kubenswrapper[4918]: I0319 18:01:09.269395 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a52a09-c672-45bd-b5ed-f5794ac59da9-config-data\") pod \"a4a52a09-c672-45bd-b5ed-f5794ac59da9\" (UID: \"a4a52a09-c672-45bd-b5ed-f5794ac59da9\") " Mar 19 18:01:09 crc kubenswrapper[4918]: I0319 18:01:09.277376 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a52a09-c672-45bd-b5ed-f5794ac59da9-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "a4a52a09-c672-45bd-b5ed-f5794ac59da9" (UID: "a4a52a09-c672-45bd-b5ed-f5794ac59da9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:01:09 crc kubenswrapper[4918]: I0319 18:01:09.293758 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4a52a09-c672-45bd-b5ed-f5794ac59da9-kube-api-access-7d72j" (OuterVolumeSpecName: "kube-api-access-7d72j") pod "a4a52a09-c672-45bd-b5ed-f5794ac59da9" (UID: "a4a52a09-c672-45bd-b5ed-f5794ac59da9"). InnerVolumeSpecName "kube-api-access-7d72j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:01:09 crc kubenswrapper[4918]: I0319 18:01:09.326681 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a52a09-c672-45bd-b5ed-f5794ac59da9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4a52a09-c672-45bd-b5ed-f5794ac59da9" (UID: "a4a52a09-c672-45bd-b5ed-f5794ac59da9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:01:09 crc kubenswrapper[4918]: I0319 18:01:09.371855 4918 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a52a09-c672-45bd-b5ed-f5794ac59da9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 18:01:09 crc kubenswrapper[4918]: I0319 18:01:09.371889 4918 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a4a52a09-c672-45bd-b5ed-f5794ac59da9-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 19 18:01:09 crc kubenswrapper[4918]: I0319 18:01:09.371898 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d72j\" (UniqueName: \"kubernetes.io/projected/a4a52a09-c672-45bd-b5ed-f5794ac59da9-kube-api-access-7d72j\") on node \"crc\" DevicePath \"\"" Mar 19 18:01:09 crc kubenswrapper[4918]: I0319 18:01:09.378711 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a52a09-c672-45bd-b5ed-f5794ac59da9-config-data" (OuterVolumeSpecName: "config-data") pod "a4a52a09-c672-45bd-b5ed-f5794ac59da9" (UID: "a4a52a09-c672-45bd-b5ed-f5794ac59da9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:01:09 crc kubenswrapper[4918]: I0319 18:01:09.473959 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a52a09-c672-45bd-b5ed-f5794ac59da9-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 18:01:10 crc kubenswrapper[4918]: I0319 18:01:10.011321 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565721-vvmng" event={"ID":"a4a52a09-c672-45bd-b5ed-f5794ac59da9","Type":"ContainerDied","Data":"ba6e90611fac54dc20401e75eabc1667cff770e6e0cfea8c6d956d27d7c4132c"} Mar 19 18:01:10 crc kubenswrapper[4918]: I0319 18:01:10.011590 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba6e90611fac54dc20401e75eabc1667cff770e6e0cfea8c6d956d27d7c4132c" Mar 19 18:01:10 crc kubenswrapper[4918]: I0319 18:01:10.011388 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29565721-vvmng" Mar 19 18:01:15 crc kubenswrapper[4918]: I0319 18:01:15.880451 4918 scope.go:117] "RemoveContainer" containerID="40ca7fcfedcfc6fcc7542b39ec7973432000929fafe730fdcbf3684185baef40" Mar 19 18:02:00 crc kubenswrapper[4918]: I0319 18:02:00.143537 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565722-6jjkw"] Mar 19 18:02:00 crc kubenswrapper[4918]: E0319 18:02:00.144358 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a52a09-c672-45bd-b5ed-f5794ac59da9" containerName="keystone-cron" Mar 19 18:02:00 crc kubenswrapper[4918]: I0319 18:02:00.144372 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a52a09-c672-45bd-b5ed-f5794ac59da9" containerName="keystone-cron" Mar 19 18:02:00 crc kubenswrapper[4918]: I0319 18:02:00.144590 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a52a09-c672-45bd-b5ed-f5794ac59da9" containerName="keystone-cron" Mar 19 18:02:00 crc 
kubenswrapper[4918]: I0319 18:02:00.145288 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565722-6jjkw" Mar 19 18:02:00 crc kubenswrapper[4918]: I0319 18:02:00.152832 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 18:02:00 crc kubenswrapper[4918]: I0319 18:02:00.153499 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 18:02:00 crc kubenswrapper[4918]: I0319 18:02:00.153789 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565722-6jjkw"] Mar 19 18:02:00 crc kubenswrapper[4918]: I0319 18:02:00.155918 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 18:02:00 crc kubenswrapper[4918]: I0319 18:02:00.216030 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv6fd\" (UniqueName: \"kubernetes.io/projected/41d3562a-1119-409a-9c26-fd901a6c06ef-kube-api-access-wv6fd\") pod \"auto-csr-approver-29565722-6jjkw\" (UID: \"41d3562a-1119-409a-9c26-fd901a6c06ef\") " pod="openshift-infra/auto-csr-approver-29565722-6jjkw" Mar 19 18:02:00 crc kubenswrapper[4918]: I0319 18:02:00.317780 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv6fd\" (UniqueName: \"kubernetes.io/projected/41d3562a-1119-409a-9c26-fd901a6c06ef-kube-api-access-wv6fd\") pod \"auto-csr-approver-29565722-6jjkw\" (UID: \"41d3562a-1119-409a-9c26-fd901a6c06ef\") " pod="openshift-infra/auto-csr-approver-29565722-6jjkw" Mar 19 18:02:00 crc kubenswrapper[4918]: I0319 18:02:00.340276 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv6fd\" (UniqueName: \"kubernetes.io/projected/41d3562a-1119-409a-9c26-fd901a6c06ef-kube-api-access-wv6fd\") pod 
\"auto-csr-approver-29565722-6jjkw\" (UID: \"41d3562a-1119-409a-9c26-fd901a6c06ef\") " pod="openshift-infra/auto-csr-approver-29565722-6jjkw" Mar 19 18:02:00 crc kubenswrapper[4918]: I0319 18:02:00.466325 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565722-6jjkw" Mar 19 18:02:02 crc kubenswrapper[4918]: I0319 18:02:01.356436 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565722-6jjkw"] Mar 19 18:02:02 crc kubenswrapper[4918]: I0319 18:02:02.584819 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565722-6jjkw" event={"ID":"41d3562a-1119-409a-9c26-fd901a6c06ef","Type":"ContainerStarted","Data":"13aa660da2b95d8bf34d6d643ed0f8344c80f5dd9101f863cc73ae672dcd8b26"} Mar 19 18:02:04 crc kubenswrapper[4918]: I0319 18:02:04.603151 4918 generic.go:334] "Generic (PLEG): container finished" podID="41d3562a-1119-409a-9c26-fd901a6c06ef" containerID="58b6dcf9f2195622b45eac9e9772e5792fa15a5cbed9071ed4ef8f9d986aeb78" exitCode=0 Mar 19 18:02:04 crc kubenswrapper[4918]: I0319 18:02:04.603626 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565722-6jjkw" event={"ID":"41d3562a-1119-409a-9c26-fd901a6c06ef","Type":"ContainerDied","Data":"58b6dcf9f2195622b45eac9e9772e5792fa15a5cbed9071ed4ef8f9d986aeb78"} Mar 19 18:02:06 crc kubenswrapper[4918]: I0319 18:02:06.929201 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565722-6jjkw" Mar 19 18:02:07 crc kubenswrapper[4918]: I0319 18:02:07.065016 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wv6fd\" (UniqueName: \"kubernetes.io/projected/41d3562a-1119-409a-9c26-fd901a6c06ef-kube-api-access-wv6fd\") pod \"41d3562a-1119-409a-9c26-fd901a6c06ef\" (UID: \"41d3562a-1119-409a-9c26-fd901a6c06ef\") " Mar 19 18:02:07 crc kubenswrapper[4918]: I0319 18:02:07.090053 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41d3562a-1119-409a-9c26-fd901a6c06ef-kube-api-access-wv6fd" (OuterVolumeSpecName: "kube-api-access-wv6fd") pod "41d3562a-1119-409a-9c26-fd901a6c06ef" (UID: "41d3562a-1119-409a-9c26-fd901a6c06ef"). InnerVolumeSpecName "kube-api-access-wv6fd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:02:07 crc kubenswrapper[4918]: I0319 18:02:07.167656 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wv6fd\" (UniqueName: \"kubernetes.io/projected/41d3562a-1119-409a-9c26-fd901a6c06ef-kube-api-access-wv6fd\") on node \"crc\" DevicePath \"\"" Mar 19 18:02:07 crc kubenswrapper[4918]: I0319 18:02:07.632269 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565722-6jjkw" event={"ID":"41d3562a-1119-409a-9c26-fd901a6c06ef","Type":"ContainerDied","Data":"13aa660da2b95d8bf34d6d643ed0f8344c80f5dd9101f863cc73ae672dcd8b26"} Mar 19 18:02:07 crc kubenswrapper[4918]: I0319 18:02:07.632313 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13aa660da2b95d8bf34d6d643ed0f8344c80f5dd9101f863cc73ae672dcd8b26" Mar 19 18:02:07 crc kubenswrapper[4918]: I0319 18:02:07.632329 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565722-6jjkw" Mar 19 18:02:08 crc kubenswrapper[4918]: I0319 18:02:08.005211 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565716-mjv7f"] Mar 19 18:02:08 crc kubenswrapper[4918]: I0319 18:02:08.018376 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565716-mjv7f"] Mar 19 18:02:08 crc kubenswrapper[4918]: I0319 18:02:08.597954 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a048bf2d-db47-4755-9ad4-309a86be6f8c" path="/var/lib/kubelet/pods/a048bf2d-db47-4755-9ad4-309a86be6f8c/volumes" Mar 19 18:02:13 crc kubenswrapper[4918]: I0319 18:02:13.680797 4918 generic.go:334] "Generic (PLEG): container finished" podID="26f468b4-9955-436c-810a-cff9e17a1063" containerID="91f53fc26d2d00f2d4b0da8d89d3bc9378dffb313b6837b71b186f8c75dd5881" exitCode=0 Mar 19 18:02:13 crc kubenswrapper[4918]: I0319 18:02:13.681356 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"26f468b4-9955-436c-810a-cff9e17a1063","Type":"ContainerDied","Data":"91f53fc26d2d00f2d4b0da8d89d3bc9378dffb313b6837b71b186f8c75dd5881"} Mar 19 18:02:15 crc kubenswrapper[4918]: I0319 18:02:15.945361 4918 scope.go:117] "RemoveContainer" containerID="e2af64416893745b6ec19835b53582ed72518c6dbd8a18832bdc6451ce16c946" Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.032011 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.143611 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26f468b4-9955-436c-810a-cff9e17a1063-ssh-key\") pod \"26f468b4-9955-436c-810a-cff9e17a1063\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.143679 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"26f468b4-9955-436c-810a-cff9e17a1063\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.143851 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/26f468b4-9955-436c-810a-cff9e17a1063-test-operator-ephemeral-workdir\") pod \"26f468b4-9955-436c-810a-cff9e17a1063\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.143930 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/26f468b4-9955-436c-810a-cff9e17a1063-ca-certs\") pod \"26f468b4-9955-436c-810a-cff9e17a1063\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.143990 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/26f468b4-9955-436c-810a-cff9e17a1063-openstack-config\") pod \"26f468b4-9955-436c-810a-cff9e17a1063\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.144057 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" 
(UniqueName: \"kubernetes.io/empty-dir/26f468b4-9955-436c-810a-cff9e17a1063-test-operator-ephemeral-temporary\") pod \"26f468b4-9955-436c-810a-cff9e17a1063\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.144115 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26f468b4-9955-436c-810a-cff9e17a1063-config-data\") pod \"26f468b4-9955-436c-810a-cff9e17a1063\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.144177 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/26f468b4-9955-436c-810a-cff9e17a1063-openstack-config-secret\") pod \"26f468b4-9955-436c-810a-cff9e17a1063\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.144238 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwgtk\" (UniqueName: \"kubernetes.io/projected/26f468b4-9955-436c-810a-cff9e17a1063-kube-api-access-rwgtk\") pod \"26f468b4-9955-436c-810a-cff9e17a1063\" (UID: \"26f468b4-9955-436c-810a-cff9e17a1063\") " Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.144548 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26f468b4-9955-436c-810a-cff9e17a1063-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "26f468b4-9955-436c-810a-cff9e17a1063" (UID: "26f468b4-9955-436c-810a-cff9e17a1063"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.144938 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26f468b4-9955-436c-810a-cff9e17a1063-config-data" (OuterVolumeSpecName: "config-data") pod "26f468b4-9955-436c-810a-cff9e17a1063" (UID: "26f468b4-9955-436c-810a-cff9e17a1063"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.145096 4918 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/26f468b4-9955-436c-810a-cff9e17a1063-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.145110 4918 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26f468b4-9955-436c-810a-cff9e17a1063-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.162751 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "test-operator-logs") pod "26f468b4-9955-436c-810a-cff9e17a1063" (UID: "26f468b4-9955-436c-810a-cff9e17a1063"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.166707 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26f468b4-9955-436c-810a-cff9e17a1063-kube-api-access-rwgtk" (OuterVolumeSpecName: "kube-api-access-rwgtk") pod "26f468b4-9955-436c-810a-cff9e17a1063" (UID: "26f468b4-9955-436c-810a-cff9e17a1063"). InnerVolumeSpecName "kube-api-access-rwgtk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.238082 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26f468b4-9955-436c-810a-cff9e17a1063-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "26f468b4-9955-436c-810a-cff9e17a1063" (UID: "26f468b4-9955-436c-810a-cff9e17a1063"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.238193 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26f468b4-9955-436c-810a-cff9e17a1063-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "26f468b4-9955-436c-810a-cff9e17a1063" (UID: "26f468b4-9955-436c-810a-cff9e17a1063"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.241350 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26f468b4-9955-436c-810a-cff9e17a1063-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "26f468b4-9955-436c-810a-cff9e17a1063" (UID: "26f468b4-9955-436c-810a-cff9e17a1063"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.248163 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwgtk\" (UniqueName: \"kubernetes.io/projected/26f468b4-9955-436c-810a-cff9e17a1063-kube-api-access-rwgtk\") on node \"crc\" DevicePath \"\"" Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.248199 4918 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26f468b4-9955-436c-810a-cff9e17a1063-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.248230 4918 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.248248 4918 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/26f468b4-9955-436c-810a-cff9e17a1063-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.248262 4918 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/26f468b4-9955-436c-810a-cff9e17a1063-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.250710 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26f468b4-9955-436c-810a-cff9e17a1063-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "26f468b4-9955-436c-810a-cff9e17a1063" (UID: "26f468b4-9955-436c-810a-cff9e17a1063"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.274407 4918 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.349842 4918 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/26f468b4-9955-436c-810a-cff9e17a1063-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.349874 4918 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.566159 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26f468b4-9955-436c-810a-cff9e17a1063-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "26f468b4-9955-436c-810a-cff9e17a1063" (UID: "26f468b4-9955-436c-810a-cff9e17a1063"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.655878 4918 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/26f468b4-9955-436c-810a-cff9e17a1063-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.705004 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"26f468b4-9955-436c-810a-cff9e17a1063","Type":"ContainerDied","Data":"70363227ac59cd1e4a9347a98b92241cb8a471b3fb23e87285eddd2cf4528f17"} Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.705049 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70363227ac59cd1e4a9347a98b92241cb8a471b3fb23e87285eddd2cf4528f17" Mar 19 18:02:16 crc kubenswrapper[4918]: I0319 18:02:16.705104 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 19 18:02:23 crc kubenswrapper[4918]: I0319 18:02:23.593457 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 19 18:02:23 crc kubenswrapper[4918]: E0319 18:02:23.594265 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26f468b4-9955-436c-810a-cff9e17a1063" containerName="tempest-tests-tempest-tests-runner" Mar 19 18:02:23 crc kubenswrapper[4918]: I0319 18:02:23.594277 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="26f468b4-9955-436c-810a-cff9e17a1063" containerName="tempest-tests-tempest-tests-runner" Mar 19 18:02:23 crc kubenswrapper[4918]: E0319 18:02:23.594311 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41d3562a-1119-409a-9c26-fd901a6c06ef" containerName="oc" Mar 19 18:02:23 crc kubenswrapper[4918]: I0319 18:02:23.594316 4918 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="41d3562a-1119-409a-9c26-fd901a6c06ef" containerName="oc" Mar 19 18:02:23 crc kubenswrapper[4918]: I0319 18:02:23.594514 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="41d3562a-1119-409a-9c26-fd901a6c06ef" containerName="oc" Mar 19 18:02:23 crc kubenswrapper[4918]: I0319 18:02:23.594542 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="26f468b4-9955-436c-810a-cff9e17a1063" containerName="tempest-tests-tempest-tests-runner" Mar 19 18:02:23 crc kubenswrapper[4918]: I0319 18:02:23.595433 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 18:02:23 crc kubenswrapper[4918]: I0319 18:02:23.598611 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-cbnk9" Mar 19 18:02:23 crc kubenswrapper[4918]: I0319 18:02:23.602606 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 19 18:02:23 crc kubenswrapper[4918]: I0319 18:02:23.695715 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjztd\" (UniqueName: \"kubernetes.io/projected/0548bf8b-39fd-4ee4-95d7-d454a5269a39-kube-api-access-zjztd\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0548bf8b-39fd-4ee4-95d7-d454a5269a39\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 18:02:23 crc kubenswrapper[4918]: I0319 18:02:23.695810 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0548bf8b-39fd-4ee4-95d7-d454a5269a39\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 18:02:23 crc kubenswrapper[4918]: I0319 
18:02:23.797738 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0548bf8b-39fd-4ee4-95d7-d454a5269a39\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 18:02:23 crc kubenswrapper[4918]: I0319 18:02:23.798013 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjztd\" (UniqueName: \"kubernetes.io/projected/0548bf8b-39fd-4ee4-95d7-d454a5269a39-kube-api-access-zjztd\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0548bf8b-39fd-4ee4-95d7-d454a5269a39\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 18:02:23 crc kubenswrapper[4918]: I0319 18:02:23.798225 4918 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0548bf8b-39fd-4ee4-95d7-d454a5269a39\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 18:02:23 crc kubenswrapper[4918]: I0319 18:02:23.829548 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0548bf8b-39fd-4ee4-95d7-d454a5269a39\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 18:02:23 crc kubenswrapper[4918]: I0319 18:02:23.830134 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjztd\" (UniqueName: \"kubernetes.io/projected/0548bf8b-39fd-4ee4-95d7-d454a5269a39-kube-api-access-zjztd\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" 
(UID: \"0548bf8b-39fd-4ee4-95d7-d454a5269a39\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 18:02:23 crc kubenswrapper[4918]: I0319 18:02:23.926718 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 18:02:24 crc kubenswrapper[4918]: I0319 18:02:24.759842 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 19 18:02:24 crc kubenswrapper[4918]: I0319 18:02:24.782469 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"0548bf8b-39fd-4ee4-95d7-d454a5269a39","Type":"ContainerStarted","Data":"b55ce21416d32362c3cbb21bf2972fea0bd4175f330c9d67375526008734891c"} Mar 19 18:02:26 crc kubenswrapper[4918]: I0319 18:02:26.834494 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"0548bf8b-39fd-4ee4-95d7-d454a5269a39","Type":"ContainerStarted","Data":"14c6ff5e783407e0d636d38cb3d91e2ee87c12c2bee5d6729bd80fa43d286740"} Mar 19 18:02:26 crc kubenswrapper[4918]: I0319 18:02:26.866422 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.51113697 podStartE2EDuration="3.866400229s" podCreationTimestamp="2026-03-19 18:02:23 +0000 UTC" firstStartedPulling="2026-03-19 18:02:24.769064303 +0000 UTC m=+4956.891263551" lastFinishedPulling="2026-03-19 18:02:26.124327562 +0000 UTC m=+4958.246526810" observedRunningTime="2026-03-19 18:02:26.848935331 +0000 UTC m=+4958.971134579" watchObservedRunningTime="2026-03-19 18:02:26.866400229 +0000 UTC m=+4958.988599477" Mar 19 18:02:58 crc kubenswrapper[4918]: I0319 18:02:58.212392 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 18:02:58 crc kubenswrapper[4918]: I0319 18:02:58.213043 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 18:03:22 crc kubenswrapper[4918]: I0319 18:03:22.157622 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-584bk/must-gather-tdc4n"] Mar 19 18:03:22 crc kubenswrapper[4918]: I0319 18:03:22.191045 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-584bk/must-gather-tdc4n" Mar 19 18:03:22 crc kubenswrapper[4918]: I0319 18:03:22.195407 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-584bk"/"default-dockercfg-vl6sl" Mar 19 18:03:22 crc kubenswrapper[4918]: I0319 18:03:22.195761 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-584bk"/"openshift-service-ca.crt" Mar 19 18:03:22 crc kubenswrapper[4918]: I0319 18:03:22.195960 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-584bk"/"kube-root-ca.crt" Mar 19 18:03:22 crc kubenswrapper[4918]: I0319 18:03:22.225111 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-584bk/must-gather-tdc4n"] Mar 19 18:03:22 crc kubenswrapper[4918]: I0319 18:03:22.327685 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jv5h\" (UniqueName: \"kubernetes.io/projected/603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0-kube-api-access-9jv5h\") pod 
\"must-gather-tdc4n\" (UID: \"603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0\") " pod="openshift-must-gather-584bk/must-gather-tdc4n" Mar 19 18:03:22 crc kubenswrapper[4918]: I0319 18:03:22.327836 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0-must-gather-output\") pod \"must-gather-tdc4n\" (UID: \"603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0\") " pod="openshift-must-gather-584bk/must-gather-tdc4n" Mar 19 18:03:22 crc kubenswrapper[4918]: I0319 18:03:22.429550 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0-must-gather-output\") pod \"must-gather-tdc4n\" (UID: \"603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0\") " pod="openshift-must-gather-584bk/must-gather-tdc4n" Mar 19 18:03:22 crc kubenswrapper[4918]: I0319 18:03:22.429774 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jv5h\" (UniqueName: \"kubernetes.io/projected/603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0-kube-api-access-9jv5h\") pod \"must-gather-tdc4n\" (UID: \"603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0\") " pod="openshift-must-gather-584bk/must-gather-tdc4n" Mar 19 18:03:22 crc kubenswrapper[4918]: I0319 18:03:22.430114 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0-must-gather-output\") pod \"must-gather-tdc4n\" (UID: \"603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0\") " pod="openshift-must-gather-584bk/must-gather-tdc4n" Mar 19 18:03:22 crc kubenswrapper[4918]: I0319 18:03:22.507754 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jv5h\" (UniqueName: \"kubernetes.io/projected/603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0-kube-api-access-9jv5h\") pod 
\"must-gather-tdc4n\" (UID: \"603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0\") " pod="openshift-must-gather-584bk/must-gather-tdc4n" Mar 19 18:03:22 crc kubenswrapper[4918]: I0319 18:03:22.538590 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-584bk/must-gather-tdc4n" Mar 19 18:03:23 crc kubenswrapper[4918]: I0319 18:03:23.373604 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-584bk/must-gather-tdc4n"] Mar 19 18:03:24 crc kubenswrapper[4918]: I0319 18:03:24.391332 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-584bk/must-gather-tdc4n" event={"ID":"603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0","Type":"ContainerStarted","Data":"a341364326d95f84fa46836e40cf7bca84c56d8f058a5f4a33d1005ad19024f5"} Mar 19 18:03:28 crc kubenswrapper[4918]: I0319 18:03:28.212138 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 18:03:28 crc kubenswrapper[4918]: I0319 18:03:28.212579 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 18:03:39 crc kubenswrapper[4918]: I0319 18:03:39.543070 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-584bk/must-gather-tdc4n" event={"ID":"603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0","Type":"ContainerStarted","Data":"ba78511a239a8c6dac838187e939394f0e06b389cb5b4b1d5f86f7bffa2bfecf"} Mar 19 18:03:39 crc kubenswrapper[4918]: I0319 18:03:39.543726 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-584bk/must-gather-tdc4n" event={"ID":"603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0","Type":"ContainerStarted","Data":"e2ef8149401114cc0d7930e8c2dbb5d5333e74c3bb6a72de6c7cd31bcb5912f4"} Mar 19 18:03:39 crc kubenswrapper[4918]: I0319 18:03:39.561992 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-584bk/must-gather-tdc4n" podStartSLOduration=2.498683187 podStartE2EDuration="17.561977514s" podCreationTimestamp="2026-03-19 18:03:22 +0000 UTC" firstStartedPulling="2026-03-19 18:03:23.38284349 +0000 UTC m=+5015.505042738" lastFinishedPulling="2026-03-19 18:03:38.446137817 +0000 UTC m=+5030.568337065" observedRunningTime="2026-03-19 18:03:39.56144038 +0000 UTC m=+5031.683639628" watchObservedRunningTime="2026-03-19 18:03:39.561977514 +0000 UTC m=+5031.684176762" Mar 19 18:03:46 crc kubenswrapper[4918]: I0319 18:03:46.104604 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-584bk/crc-debug-wvr5n"] Mar 19 18:03:46 crc kubenswrapper[4918]: I0319 18:03:46.110406 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-584bk/crc-debug-wvr5n" Mar 19 18:03:46 crc kubenswrapper[4918]: I0319 18:03:46.228722 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw2zr\" (UniqueName: \"kubernetes.io/projected/9729f15b-b235-4670-9431-ec8a60b39e4d-kube-api-access-rw2zr\") pod \"crc-debug-wvr5n\" (UID: \"9729f15b-b235-4670-9431-ec8a60b39e4d\") " pod="openshift-must-gather-584bk/crc-debug-wvr5n" Mar 19 18:03:46 crc kubenswrapper[4918]: I0319 18:03:46.228763 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9729f15b-b235-4670-9431-ec8a60b39e4d-host\") pod \"crc-debug-wvr5n\" (UID: \"9729f15b-b235-4670-9431-ec8a60b39e4d\") " pod="openshift-must-gather-584bk/crc-debug-wvr5n" Mar 19 18:03:46 crc kubenswrapper[4918]: I0319 18:03:46.330678 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw2zr\" (UniqueName: \"kubernetes.io/projected/9729f15b-b235-4670-9431-ec8a60b39e4d-kube-api-access-rw2zr\") pod \"crc-debug-wvr5n\" (UID: \"9729f15b-b235-4670-9431-ec8a60b39e4d\") " pod="openshift-must-gather-584bk/crc-debug-wvr5n" Mar 19 18:03:46 crc kubenswrapper[4918]: I0319 18:03:46.330728 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9729f15b-b235-4670-9431-ec8a60b39e4d-host\") pod \"crc-debug-wvr5n\" (UID: \"9729f15b-b235-4670-9431-ec8a60b39e4d\") " pod="openshift-must-gather-584bk/crc-debug-wvr5n" Mar 19 18:03:46 crc kubenswrapper[4918]: I0319 18:03:46.330966 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9729f15b-b235-4670-9431-ec8a60b39e4d-host\") pod \"crc-debug-wvr5n\" (UID: \"9729f15b-b235-4670-9431-ec8a60b39e4d\") " pod="openshift-must-gather-584bk/crc-debug-wvr5n" Mar 19 18:03:46 crc 
kubenswrapper[4918]: I0319 18:03:46.356849 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw2zr\" (UniqueName: \"kubernetes.io/projected/9729f15b-b235-4670-9431-ec8a60b39e4d-kube-api-access-rw2zr\") pod \"crc-debug-wvr5n\" (UID: \"9729f15b-b235-4670-9431-ec8a60b39e4d\") " pod="openshift-must-gather-584bk/crc-debug-wvr5n" Mar 19 18:03:46 crc kubenswrapper[4918]: I0319 18:03:46.428858 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-584bk/crc-debug-wvr5n" Mar 19 18:03:46 crc kubenswrapper[4918]: W0319 18:03:46.480288 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9729f15b_b235_4670_9431_ec8a60b39e4d.slice/crio-11a5494fa711f0e001e4aee110e887ce33d2196f015454d12a636968f7a72043 WatchSource:0}: Error finding container 11a5494fa711f0e001e4aee110e887ce33d2196f015454d12a636968f7a72043: Status 404 returned error can't find the container with id 11a5494fa711f0e001e4aee110e887ce33d2196f015454d12a636968f7a72043 Mar 19 18:03:46 crc kubenswrapper[4918]: I0319 18:03:46.597666 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-584bk/crc-debug-wvr5n" event={"ID":"9729f15b-b235-4670-9431-ec8a60b39e4d","Type":"ContainerStarted","Data":"11a5494fa711f0e001e4aee110e887ce33d2196f015454d12a636968f7a72043"} Mar 19 18:03:58 crc kubenswrapper[4918]: I0319 18:03:58.211660 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 18:03:58 crc kubenswrapper[4918]: I0319 18:03:58.212155 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" 
podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 18:03:58 crc kubenswrapper[4918]: I0319 18:03:58.212196 4918 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 18:03:58 crc kubenswrapper[4918]: I0319 18:03:58.213035 4918 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8"} pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 18:03:58 crc kubenswrapper[4918]: I0319 18:03:58.213093 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" containerID="cri-o://9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8" gracePeriod=600 Mar 19 18:03:58 crc kubenswrapper[4918]: I0319 18:03:58.737335 4918 generic.go:334] "Generic (PLEG): container finished" podID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerID="9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8" exitCode=0 Mar 19 18:03:58 crc kubenswrapper[4918]: I0319 18:03:58.737396 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerDied","Data":"9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8"} Mar 19 18:03:58 crc kubenswrapper[4918]: I0319 18:03:58.737618 4918 scope.go:117] "RemoveContainer" 
containerID="683e2842523828a0b01acdee58435737f1109527ab15ad1352239626a83f2287" Mar 19 18:04:00 crc kubenswrapper[4918]: I0319 18:04:00.165019 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565724-mrmnb"] Mar 19 18:04:00 crc kubenswrapper[4918]: I0319 18:04:00.166767 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565724-mrmnb" Mar 19 18:04:00 crc kubenswrapper[4918]: I0319 18:04:00.174192 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 18:04:00 crc kubenswrapper[4918]: I0319 18:04:00.174387 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 18:04:00 crc kubenswrapper[4918]: I0319 18:04:00.174487 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 18:04:00 crc kubenswrapper[4918]: I0319 18:04:00.185802 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565724-mrmnb"] Mar 19 18:04:00 crc kubenswrapper[4918]: I0319 18:04:00.234675 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmmxk\" (UniqueName: \"kubernetes.io/projected/f306c4a3-fdd1-434c-b4cc-9847c22a3e89-kube-api-access-mmmxk\") pod \"auto-csr-approver-29565724-mrmnb\" (UID: \"f306c4a3-fdd1-434c-b4cc-9847c22a3e89\") " pod="openshift-infra/auto-csr-approver-29565724-mrmnb" Mar 19 18:04:00 crc kubenswrapper[4918]: I0319 18:04:00.338843 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmmxk\" (UniqueName: \"kubernetes.io/projected/f306c4a3-fdd1-434c-b4cc-9847c22a3e89-kube-api-access-mmmxk\") pod \"auto-csr-approver-29565724-mrmnb\" (UID: \"f306c4a3-fdd1-434c-b4cc-9847c22a3e89\") " 
pod="openshift-infra/auto-csr-approver-29565724-mrmnb" Mar 19 18:04:00 crc kubenswrapper[4918]: I0319 18:04:00.356749 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmmxk\" (UniqueName: \"kubernetes.io/projected/f306c4a3-fdd1-434c-b4cc-9847c22a3e89-kube-api-access-mmmxk\") pod \"auto-csr-approver-29565724-mrmnb\" (UID: \"f306c4a3-fdd1-434c-b4cc-9847c22a3e89\") " pod="openshift-infra/auto-csr-approver-29565724-mrmnb" Mar 19 18:04:00 crc kubenswrapper[4918]: I0319 18:04:00.511939 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565724-mrmnb" Mar 19 18:04:04 crc kubenswrapper[4918]: I0319 18:04:04.085071 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ppm5j"] Mar 19 18:04:04 crc kubenswrapper[4918]: I0319 18:04:04.088670 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ppm5j" Mar 19 18:04:04 crc kubenswrapper[4918]: I0319 18:04:04.109145 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ppm5j"] Mar 19 18:04:04 crc kubenswrapper[4918]: I0319 18:04:04.129220 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19c8ba21-1d3c-448f-85cf-414b4c1aca6e-utilities\") pod \"redhat-operators-ppm5j\" (UID: \"19c8ba21-1d3c-448f-85cf-414b4c1aca6e\") " pod="openshift-marketplace/redhat-operators-ppm5j" Mar 19 18:04:04 crc kubenswrapper[4918]: I0319 18:04:04.129294 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7tlx\" (UniqueName: \"kubernetes.io/projected/19c8ba21-1d3c-448f-85cf-414b4c1aca6e-kube-api-access-f7tlx\") pod \"redhat-operators-ppm5j\" (UID: \"19c8ba21-1d3c-448f-85cf-414b4c1aca6e\") " 
pod="openshift-marketplace/redhat-operators-ppm5j" Mar 19 18:04:04 crc kubenswrapper[4918]: I0319 18:04:04.129397 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19c8ba21-1d3c-448f-85cf-414b4c1aca6e-catalog-content\") pod \"redhat-operators-ppm5j\" (UID: \"19c8ba21-1d3c-448f-85cf-414b4c1aca6e\") " pod="openshift-marketplace/redhat-operators-ppm5j" Mar 19 18:04:04 crc kubenswrapper[4918]: I0319 18:04:04.231710 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19c8ba21-1d3c-448f-85cf-414b4c1aca6e-utilities\") pod \"redhat-operators-ppm5j\" (UID: \"19c8ba21-1d3c-448f-85cf-414b4c1aca6e\") " pod="openshift-marketplace/redhat-operators-ppm5j" Mar 19 18:04:04 crc kubenswrapper[4918]: I0319 18:04:04.231804 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7tlx\" (UniqueName: \"kubernetes.io/projected/19c8ba21-1d3c-448f-85cf-414b4c1aca6e-kube-api-access-f7tlx\") pod \"redhat-operators-ppm5j\" (UID: \"19c8ba21-1d3c-448f-85cf-414b4c1aca6e\") " pod="openshift-marketplace/redhat-operators-ppm5j" Mar 19 18:04:04 crc kubenswrapper[4918]: I0319 18:04:04.231904 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19c8ba21-1d3c-448f-85cf-414b4c1aca6e-catalog-content\") pod \"redhat-operators-ppm5j\" (UID: \"19c8ba21-1d3c-448f-85cf-414b4c1aca6e\") " pod="openshift-marketplace/redhat-operators-ppm5j" Mar 19 18:04:04 crc kubenswrapper[4918]: I0319 18:04:04.232192 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19c8ba21-1d3c-448f-85cf-414b4c1aca6e-utilities\") pod \"redhat-operators-ppm5j\" (UID: \"19c8ba21-1d3c-448f-85cf-414b4c1aca6e\") " 
pod="openshift-marketplace/redhat-operators-ppm5j" Mar 19 18:04:04 crc kubenswrapper[4918]: I0319 18:04:04.232308 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19c8ba21-1d3c-448f-85cf-414b4c1aca6e-catalog-content\") pod \"redhat-operators-ppm5j\" (UID: \"19c8ba21-1d3c-448f-85cf-414b4c1aca6e\") " pod="openshift-marketplace/redhat-operators-ppm5j" Mar 19 18:04:04 crc kubenswrapper[4918]: I0319 18:04:04.254748 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7tlx\" (UniqueName: \"kubernetes.io/projected/19c8ba21-1d3c-448f-85cf-414b4c1aca6e-kube-api-access-f7tlx\") pod \"redhat-operators-ppm5j\" (UID: \"19c8ba21-1d3c-448f-85cf-414b4c1aca6e\") " pod="openshift-marketplace/redhat-operators-ppm5j" Mar 19 18:04:04 crc kubenswrapper[4918]: I0319 18:04:04.442246 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ppm5j" Mar 19 18:04:06 crc kubenswrapper[4918]: E0319 18:04:06.578847 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:04:06 crc kubenswrapper[4918]: E0319 18:04:06.614637 4918 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296" Mar 19 18:04:06 crc kubenswrapper[4918]: E0319 18:04:06.614777 4918 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:container-00,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296,Command:[chroot /host bash -c echo 'TOOLBOX_NAME=toolbox-osp' > /root/.toolboxrc ; rm -rf \"/var/tmp/sos-osp\" && mkdir -p \"/var/tmp/sos-osp\" && sudo podman rm --force toolbox-osp; sudo --preserve-env podman pull --authfile /var/lib/kubelet/config.json registry.redhat.io/rhel9/support-tools && toolbox sos report --batch --all-logs --only-plugins block,cifs,crio,devicemapper,devices,firewall_tables,firewalld,iscsi,lvm2,memory,multipath,nfs,nis,nvme,podman,process,processor,selinux,scsi,udev,logs,crypto --tmp-dir=\"/var/tmp/sos-osp\" && if [[ \"$(ls /var/log/pods/*/{*.log.*,*/*.log.*} 2>/dev/null)\" != '' ]]; then tar --ignore-failed-read --warning=no-file-changed -cJf \"/var/tmp/sos-osp/podlogs.tar.xz\" --transform 's,^,podlogs/,' /var/log/pods/*/{*.log.*,*/*.log.*} || true; fi],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TMOUT,Value:900,ValueFrom:nil,},EnvVar{Name:HOST,Value:/host,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host,ReadOnly:false,MountPath:/host,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rw2zr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File
,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod crc-debug-wvr5n_openshift-must-gather-584bk(9729f15b-b235-4670-9431-ec8a60b39e4d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 18:04:06 crc kubenswrapper[4918]: E0319 18:04:06.616838 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-must-gather-584bk/crc-debug-wvr5n" podUID="9729f15b-b235-4670-9431-ec8a60b39e4d" Mar 19 18:04:06 crc kubenswrapper[4918]: I0319 18:04:06.858336 4918 scope.go:117] "RemoveContainer" containerID="9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8" Mar 19 18:04:06 crc kubenswrapper[4918]: E0319 18:04:06.858933 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:04:06 crc kubenswrapper[4918]: E0319 18:04:06.882281 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296\\\"\"" pod="openshift-must-gather-584bk/crc-debug-wvr5n" podUID="9729f15b-b235-4670-9431-ec8a60b39e4d" Mar 19 18:04:07 crc kubenswrapper[4918]: I0319 18:04:07.638440 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ppm5j"] Mar 19 18:04:07 crc 
kubenswrapper[4918]: I0319 18:04:07.879904 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppm5j" event={"ID":"19c8ba21-1d3c-448f-85cf-414b4c1aca6e","Type":"ContainerStarted","Data":"9fde5135381457fab55c739a83c62735f37dbcf178036b394d63edb53811da7c"} Mar 19 18:04:08 crc kubenswrapper[4918]: I0319 18:04:08.317211 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565724-mrmnb"] Mar 19 18:04:08 crc kubenswrapper[4918]: I0319 18:04:08.891361 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565724-mrmnb" event={"ID":"f306c4a3-fdd1-434c-b4cc-9847c22a3e89","Type":"ContainerStarted","Data":"2f5fefd92709efb1565973092181501cf90fcbdf1a00ce6cdd266dd4710ae43c"} Mar 19 18:04:08 crc kubenswrapper[4918]: I0319 18:04:08.896089 4918 generic.go:334] "Generic (PLEG): container finished" podID="19c8ba21-1d3c-448f-85cf-414b4c1aca6e" containerID="957a8068abccfb7ab4507341fa62d7b4bc5c0d6fc460f0f79d1627b618bcf372" exitCode=0 Mar 19 18:04:08 crc kubenswrapper[4918]: I0319 18:04:08.896125 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppm5j" event={"ID":"19c8ba21-1d3c-448f-85cf-414b4c1aca6e","Type":"ContainerDied","Data":"957a8068abccfb7ab4507341fa62d7b4bc5c0d6fc460f0f79d1627b618bcf372"} Mar 19 18:04:10 crc kubenswrapper[4918]: I0319 18:04:10.920276 4918 generic.go:334] "Generic (PLEG): container finished" podID="f306c4a3-fdd1-434c-b4cc-9847c22a3e89" containerID="3260cf79358d36263154a35eb700bde10ddf4bf8c1cc6a138b31d6bcf0165c93" exitCode=0 Mar 19 18:04:10 crc kubenswrapper[4918]: I0319 18:04:10.920335 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565724-mrmnb" event={"ID":"f306c4a3-fdd1-434c-b4cc-9847c22a3e89","Type":"ContainerDied","Data":"3260cf79358d36263154a35eb700bde10ddf4bf8c1cc6a138b31d6bcf0165c93"} Mar 19 18:04:10 crc kubenswrapper[4918]: I0319 
18:04:10.923810 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppm5j" event={"ID":"19c8ba21-1d3c-448f-85cf-414b4c1aca6e","Type":"ContainerStarted","Data":"487e62e5682d0cc8183be821157e19f1163e27e8b0b4a8d31c4c07497afeffb1"} Mar 19 18:04:12 crc kubenswrapper[4918]: I0319 18:04:12.951831 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565724-mrmnb" event={"ID":"f306c4a3-fdd1-434c-b4cc-9847c22a3e89","Type":"ContainerDied","Data":"2f5fefd92709efb1565973092181501cf90fcbdf1a00ce6cdd266dd4710ae43c"} Mar 19 18:04:12 crc kubenswrapper[4918]: I0319 18:04:12.952127 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f5fefd92709efb1565973092181501cf90fcbdf1a00ce6cdd266dd4710ae43c" Mar 19 18:04:13 crc kubenswrapper[4918]: I0319 18:04:13.033818 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565724-mrmnb" Mar 19 18:04:13 crc kubenswrapper[4918]: I0319 18:04:13.165776 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmmxk\" (UniqueName: \"kubernetes.io/projected/f306c4a3-fdd1-434c-b4cc-9847c22a3e89-kube-api-access-mmmxk\") pod \"f306c4a3-fdd1-434c-b4cc-9847c22a3e89\" (UID: \"f306c4a3-fdd1-434c-b4cc-9847c22a3e89\") " Mar 19 18:04:13 crc kubenswrapper[4918]: I0319 18:04:13.188397 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f306c4a3-fdd1-434c-b4cc-9847c22a3e89-kube-api-access-mmmxk" (OuterVolumeSpecName: "kube-api-access-mmmxk") pod "f306c4a3-fdd1-434c-b4cc-9847c22a3e89" (UID: "f306c4a3-fdd1-434c-b4cc-9847c22a3e89"). InnerVolumeSpecName "kube-api-access-mmmxk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:04:13 crc kubenswrapper[4918]: I0319 18:04:13.268600 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmmxk\" (UniqueName: \"kubernetes.io/projected/f306c4a3-fdd1-434c-b4cc-9847c22a3e89-kube-api-access-mmmxk\") on node \"crc\" DevicePath \"\"" Mar 19 18:04:13 crc kubenswrapper[4918]: I0319 18:04:13.973241 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565724-mrmnb" Mar 19 18:04:14 crc kubenswrapper[4918]: I0319 18:04:14.134085 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565718-cwrrf"] Mar 19 18:04:14 crc kubenswrapper[4918]: I0319 18:04:14.151483 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565718-cwrrf"] Mar 19 18:04:14 crc kubenswrapper[4918]: I0319 18:04:14.597470 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="315ac8a3-68b8-445f-9ed2-6c1e7c541456" path="/var/lib/kubelet/pods/315ac8a3-68b8-445f-9ed2-6c1e7c541456/volumes" Mar 19 18:04:16 crc kubenswrapper[4918]: I0319 18:04:16.080285 4918 scope.go:117] "RemoveContainer" containerID="514ab7679ede40f3ca2e4a469d4ca1f340e298ee7139e8907d13edf484ee5989" Mar 19 18:04:16 crc kubenswrapper[4918]: E0319 18:04:16.492318 4918 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19c8ba21_1d3c_448f_85cf_414b4c1aca6e.slice/crio-conmon-487e62e5682d0cc8183be821157e19f1163e27e8b0b4a8d31c4c07497afeffb1.scope\": RecentStats: unable to find data in memory cache]" Mar 19 18:04:17 crc kubenswrapper[4918]: I0319 18:04:17.003450 4918 generic.go:334] "Generic (PLEG): container finished" podID="19c8ba21-1d3c-448f-85cf-414b4c1aca6e" containerID="487e62e5682d0cc8183be821157e19f1163e27e8b0b4a8d31c4c07497afeffb1" exitCode=0 Mar 19 
18:04:17 crc kubenswrapper[4918]: I0319 18:04:17.003535 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppm5j" event={"ID":"19c8ba21-1d3c-448f-85cf-414b4c1aca6e","Type":"ContainerDied","Data":"487e62e5682d0cc8183be821157e19f1163e27e8b0b4a8d31c4c07497afeffb1"} Mar 19 18:04:18 crc kubenswrapper[4918]: I0319 18:04:18.014370 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppm5j" event={"ID":"19c8ba21-1d3c-448f-85cf-414b4c1aca6e","Type":"ContainerStarted","Data":"c7d28297d4da40b9cbbf911ae35aa8ae45dca13a8eb927490431bf96deadfd35"} Mar 19 18:04:18 crc kubenswrapper[4918]: I0319 18:04:18.036128 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ppm5j" podStartSLOduration=5.51341156 podStartE2EDuration="14.036107992s" podCreationTimestamp="2026-03-19 18:04:04 +0000 UTC" firstStartedPulling="2026-03-19 18:04:08.898764922 +0000 UTC m=+5061.020964170" lastFinishedPulling="2026-03-19 18:04:17.421461324 +0000 UTC m=+5069.543660602" observedRunningTime="2026-03-19 18:04:18.030974731 +0000 UTC m=+5070.153173989" watchObservedRunningTime="2026-03-19 18:04:18.036107992 +0000 UTC m=+5070.158307240" Mar 19 18:04:21 crc kubenswrapper[4918]: I0319 18:04:21.586755 4918 scope.go:117] "RemoveContainer" containerID="9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8" Mar 19 18:04:21 crc kubenswrapper[4918]: E0319 18:04:21.587369 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:04:23 crc kubenswrapper[4918]: I0319 18:04:23.074998 
4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-584bk/crc-debug-wvr5n" event={"ID":"9729f15b-b235-4670-9431-ec8a60b39e4d","Type":"ContainerStarted","Data":"c9ab0de4d2d6b00e46f466b40a4d93b6d221c0e9eed14b06450bc30735ed79a8"} Mar 19 18:04:23 crc kubenswrapper[4918]: I0319 18:04:23.104959 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-584bk/crc-debug-wvr5n" podStartSLOduration=1.465090434 podStartE2EDuration="37.104938919s" podCreationTimestamp="2026-03-19 18:03:46 +0000 UTC" firstStartedPulling="2026-03-19 18:03:46.482817906 +0000 UTC m=+5038.605017164" lastFinishedPulling="2026-03-19 18:04:22.122666401 +0000 UTC m=+5074.244865649" observedRunningTime="2026-03-19 18:04:23.097439983 +0000 UTC m=+5075.219639241" watchObservedRunningTime="2026-03-19 18:04:23.104938919 +0000 UTC m=+5075.227138177" Mar 19 18:04:24 crc kubenswrapper[4918]: I0319 18:04:24.442447 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ppm5j" Mar 19 18:04:24 crc kubenswrapper[4918]: I0319 18:04:24.442730 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ppm5j" Mar 19 18:04:25 crc kubenswrapper[4918]: I0319 18:04:25.500772 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ppm5j" podUID="19c8ba21-1d3c-448f-85cf-414b4c1aca6e" containerName="registry-server" probeResult="failure" output=< Mar 19 18:04:25 crc kubenswrapper[4918]: timeout: failed to connect service ":50051" within 1s Mar 19 18:04:25 crc kubenswrapper[4918]: > Mar 19 18:04:34 crc kubenswrapper[4918]: I0319 18:04:34.494380 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ppm5j" Mar 19 18:04:34 crc kubenswrapper[4918]: I0319 18:04:34.552124 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-ppm5j" Mar 19 18:04:35 crc kubenswrapper[4918]: I0319 18:04:35.285791 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ppm5j"] Mar 19 18:04:35 crc kubenswrapper[4918]: I0319 18:04:35.586318 4918 scope.go:117] "RemoveContainer" containerID="9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8" Mar 19 18:04:35 crc kubenswrapper[4918]: E0319 18:04:35.586615 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:04:36 crc kubenswrapper[4918]: I0319 18:04:36.193017 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ppm5j" podUID="19c8ba21-1d3c-448f-85cf-414b4c1aca6e" containerName="registry-server" containerID="cri-o://c7d28297d4da40b9cbbf911ae35aa8ae45dca13a8eb927490431bf96deadfd35" gracePeriod=2 Mar 19 18:04:37 crc kubenswrapper[4918]: I0319 18:04:37.202211 4918 generic.go:334] "Generic (PLEG): container finished" podID="19c8ba21-1d3c-448f-85cf-414b4c1aca6e" containerID="c7d28297d4da40b9cbbf911ae35aa8ae45dca13a8eb927490431bf96deadfd35" exitCode=0 Mar 19 18:04:37 crc kubenswrapper[4918]: I0319 18:04:37.202295 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppm5j" event={"ID":"19c8ba21-1d3c-448f-85cf-414b4c1aca6e","Type":"ContainerDied","Data":"c7d28297d4da40b9cbbf911ae35aa8ae45dca13a8eb927490431bf96deadfd35"} Mar 19 18:04:37 crc kubenswrapper[4918]: I0319 18:04:37.502338 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ppm5j" Mar 19 18:04:37 crc kubenswrapper[4918]: I0319 18:04:37.557783 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7tlx\" (UniqueName: \"kubernetes.io/projected/19c8ba21-1d3c-448f-85cf-414b4c1aca6e-kube-api-access-f7tlx\") pod \"19c8ba21-1d3c-448f-85cf-414b4c1aca6e\" (UID: \"19c8ba21-1d3c-448f-85cf-414b4c1aca6e\") " Mar 19 18:04:37 crc kubenswrapper[4918]: I0319 18:04:37.557999 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19c8ba21-1d3c-448f-85cf-414b4c1aca6e-catalog-content\") pod \"19c8ba21-1d3c-448f-85cf-414b4c1aca6e\" (UID: \"19c8ba21-1d3c-448f-85cf-414b4c1aca6e\") " Mar 19 18:04:37 crc kubenswrapper[4918]: I0319 18:04:37.558049 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19c8ba21-1d3c-448f-85cf-414b4c1aca6e-utilities\") pod \"19c8ba21-1d3c-448f-85cf-414b4c1aca6e\" (UID: \"19c8ba21-1d3c-448f-85cf-414b4c1aca6e\") " Mar 19 18:04:37 crc kubenswrapper[4918]: I0319 18:04:37.559137 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19c8ba21-1d3c-448f-85cf-414b4c1aca6e-utilities" (OuterVolumeSpecName: "utilities") pod "19c8ba21-1d3c-448f-85cf-414b4c1aca6e" (UID: "19c8ba21-1d3c-448f-85cf-414b4c1aca6e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:04:37 crc kubenswrapper[4918]: I0319 18:04:37.573215 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19c8ba21-1d3c-448f-85cf-414b4c1aca6e-kube-api-access-f7tlx" (OuterVolumeSpecName: "kube-api-access-f7tlx") pod "19c8ba21-1d3c-448f-85cf-414b4c1aca6e" (UID: "19c8ba21-1d3c-448f-85cf-414b4c1aca6e"). InnerVolumeSpecName "kube-api-access-f7tlx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:04:37 crc kubenswrapper[4918]: I0319 18:04:37.666266 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19c8ba21-1d3c-448f-85cf-414b4c1aca6e-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 18:04:37 crc kubenswrapper[4918]: I0319 18:04:37.666326 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7tlx\" (UniqueName: \"kubernetes.io/projected/19c8ba21-1d3c-448f-85cf-414b4c1aca6e-kube-api-access-f7tlx\") on node \"crc\" DevicePath \"\"" Mar 19 18:04:37 crc kubenswrapper[4918]: I0319 18:04:37.739451 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19c8ba21-1d3c-448f-85cf-414b4c1aca6e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "19c8ba21-1d3c-448f-85cf-414b4c1aca6e" (UID: "19c8ba21-1d3c-448f-85cf-414b4c1aca6e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:04:37 crc kubenswrapper[4918]: I0319 18:04:37.768257 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19c8ba21-1d3c-448f-85cf-414b4c1aca6e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 18:04:38 crc kubenswrapper[4918]: I0319 18:04:38.213575 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppm5j" event={"ID":"19c8ba21-1d3c-448f-85cf-414b4c1aca6e","Type":"ContainerDied","Data":"9fde5135381457fab55c739a83c62735f37dbcf178036b394d63edb53811da7c"} Mar 19 18:04:38 crc kubenswrapper[4918]: I0319 18:04:38.213663 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ppm5j" Mar 19 18:04:38 crc kubenswrapper[4918]: I0319 18:04:38.213942 4918 scope.go:117] "RemoveContainer" containerID="c7d28297d4da40b9cbbf911ae35aa8ae45dca13a8eb927490431bf96deadfd35" Mar 19 18:04:39 crc kubenswrapper[4918]: I0319 18:04:39.612928 4918 scope.go:117] "RemoveContainer" containerID="487e62e5682d0cc8183be821157e19f1163e27e8b0b4a8d31c4c07497afeffb1" Mar 19 18:04:39 crc kubenswrapper[4918]: I0319 18:04:39.633345 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ppm5j"] Mar 19 18:04:39 crc kubenswrapper[4918]: I0319 18:04:39.643064 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ppm5j"] Mar 19 18:04:40 crc kubenswrapper[4918]: I0319 18:04:40.135504 4918 scope.go:117] "RemoveContainer" containerID="957a8068abccfb7ab4507341fa62d7b4bc5c0d6fc460f0f79d1627b618bcf372" Mar 19 18:04:40 crc kubenswrapper[4918]: I0319 18:04:40.599115 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19c8ba21-1d3c-448f-85cf-414b4c1aca6e" path="/var/lib/kubelet/pods/19c8ba21-1d3c-448f-85cf-414b4c1aca6e/volumes" Mar 19 18:04:48 crc kubenswrapper[4918]: I0319 18:04:48.592760 4918 scope.go:117] "RemoveContainer" containerID="9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8" Mar 19 18:04:48 crc kubenswrapper[4918]: E0319 18:04:48.594165 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:05:01 crc kubenswrapper[4918]: I0319 18:05:01.587162 4918 scope.go:117] "RemoveContainer" 
containerID="9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8" Mar 19 18:05:01 crc kubenswrapper[4918]: E0319 18:05:01.588036 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:05:15 crc kubenswrapper[4918]: I0319 18:05:15.586880 4918 scope.go:117] "RemoveContainer" containerID="9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8" Mar 19 18:05:15 crc kubenswrapper[4918]: E0319 18:05:15.587678 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:05:23 crc kubenswrapper[4918]: I0319 18:05:23.644223 4918 generic.go:334] "Generic (PLEG): container finished" podID="9729f15b-b235-4670-9431-ec8a60b39e4d" containerID="c9ab0de4d2d6b00e46f466b40a4d93b6d221c0e9eed14b06450bc30735ed79a8" exitCode=0 Mar 19 18:05:23 crc kubenswrapper[4918]: I0319 18:05:23.644306 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-584bk/crc-debug-wvr5n" event={"ID":"9729f15b-b235-4670-9431-ec8a60b39e4d","Type":"ContainerDied","Data":"c9ab0de4d2d6b00e46f466b40a4d93b6d221c0e9eed14b06450bc30735ed79a8"} Mar 19 18:05:24 crc kubenswrapper[4918]: I0319 18:05:24.785173 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-584bk/crc-debug-wvr5n" Mar 19 18:05:24 crc kubenswrapper[4918]: I0319 18:05:24.832071 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-584bk/crc-debug-wvr5n"] Mar 19 18:05:24 crc kubenswrapper[4918]: I0319 18:05:24.843564 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-584bk/crc-debug-wvr5n"] Mar 19 18:05:24 crc kubenswrapper[4918]: I0319 18:05:24.874106 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9729f15b-b235-4670-9431-ec8a60b39e4d-host\") pod \"9729f15b-b235-4670-9431-ec8a60b39e4d\" (UID: \"9729f15b-b235-4670-9431-ec8a60b39e4d\") " Mar 19 18:05:24 crc kubenswrapper[4918]: I0319 18:05:24.874234 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw2zr\" (UniqueName: \"kubernetes.io/projected/9729f15b-b235-4670-9431-ec8a60b39e4d-kube-api-access-rw2zr\") pod \"9729f15b-b235-4670-9431-ec8a60b39e4d\" (UID: \"9729f15b-b235-4670-9431-ec8a60b39e4d\") " Mar 19 18:05:24 crc kubenswrapper[4918]: I0319 18:05:24.874188 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9729f15b-b235-4670-9431-ec8a60b39e4d-host" (OuterVolumeSpecName: "host") pod "9729f15b-b235-4670-9431-ec8a60b39e4d" (UID: "9729f15b-b235-4670-9431-ec8a60b39e4d"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 18:05:24 crc kubenswrapper[4918]: I0319 18:05:24.875876 4918 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9729f15b-b235-4670-9431-ec8a60b39e4d-host\") on node \"crc\" DevicePath \"\"" Mar 19 18:05:24 crc kubenswrapper[4918]: I0319 18:05:24.879116 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9729f15b-b235-4670-9431-ec8a60b39e4d-kube-api-access-rw2zr" (OuterVolumeSpecName: "kube-api-access-rw2zr") pod "9729f15b-b235-4670-9431-ec8a60b39e4d" (UID: "9729f15b-b235-4670-9431-ec8a60b39e4d"). InnerVolumeSpecName "kube-api-access-rw2zr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:05:24 crc kubenswrapper[4918]: I0319 18:05:24.977415 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw2zr\" (UniqueName: \"kubernetes.io/projected/9729f15b-b235-4670-9431-ec8a60b39e4d-kube-api-access-rw2zr\") on node \"crc\" DevicePath \"\"" Mar 19 18:05:25 crc kubenswrapper[4918]: I0319 18:05:25.665617 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11a5494fa711f0e001e4aee110e887ce33d2196f015454d12a636968f7a72043" Mar 19 18:05:25 crc kubenswrapper[4918]: I0319 18:05:25.665918 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-584bk/crc-debug-wvr5n" Mar 19 18:05:26 crc kubenswrapper[4918]: I0319 18:05:26.177271 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-584bk/crc-debug-fcsvv"] Mar 19 18:05:26 crc kubenswrapper[4918]: E0319 18:05:26.178061 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9729f15b-b235-4670-9431-ec8a60b39e4d" containerName="container-00" Mar 19 18:05:26 crc kubenswrapper[4918]: I0319 18:05:26.178076 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="9729f15b-b235-4670-9431-ec8a60b39e4d" containerName="container-00" Mar 19 18:05:26 crc kubenswrapper[4918]: E0319 18:05:26.178101 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19c8ba21-1d3c-448f-85cf-414b4c1aca6e" containerName="registry-server" Mar 19 18:05:26 crc kubenswrapper[4918]: I0319 18:05:26.178107 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="19c8ba21-1d3c-448f-85cf-414b4c1aca6e" containerName="registry-server" Mar 19 18:05:26 crc kubenswrapper[4918]: E0319 18:05:26.178122 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f306c4a3-fdd1-434c-b4cc-9847c22a3e89" containerName="oc" Mar 19 18:05:26 crc kubenswrapper[4918]: I0319 18:05:26.178129 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="f306c4a3-fdd1-434c-b4cc-9847c22a3e89" containerName="oc" Mar 19 18:05:26 crc kubenswrapper[4918]: E0319 18:05:26.178144 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19c8ba21-1d3c-448f-85cf-414b4c1aca6e" containerName="extract-utilities" Mar 19 18:05:26 crc kubenswrapper[4918]: I0319 18:05:26.178150 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="19c8ba21-1d3c-448f-85cf-414b4c1aca6e" containerName="extract-utilities" Mar 19 18:05:26 crc kubenswrapper[4918]: E0319 18:05:26.178161 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19c8ba21-1d3c-448f-85cf-414b4c1aca6e" 
containerName="extract-content" Mar 19 18:05:26 crc kubenswrapper[4918]: I0319 18:05:26.178167 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="19c8ba21-1d3c-448f-85cf-414b4c1aca6e" containerName="extract-content" Mar 19 18:05:26 crc kubenswrapper[4918]: I0319 18:05:26.178359 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="19c8ba21-1d3c-448f-85cf-414b4c1aca6e" containerName="registry-server" Mar 19 18:05:26 crc kubenswrapper[4918]: I0319 18:05:26.178378 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="9729f15b-b235-4670-9431-ec8a60b39e4d" containerName="container-00" Mar 19 18:05:26 crc kubenswrapper[4918]: I0319 18:05:26.178389 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="f306c4a3-fdd1-434c-b4cc-9847c22a3e89" containerName="oc" Mar 19 18:05:26 crc kubenswrapper[4918]: I0319 18:05:26.179136 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-584bk/crc-debug-fcsvv" Mar 19 18:05:26 crc kubenswrapper[4918]: I0319 18:05:26.303345 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2p27\" (UniqueName: \"kubernetes.io/projected/4c351898-36b3-45bb-8877-20950a9e5dd8-kube-api-access-z2p27\") pod \"crc-debug-fcsvv\" (UID: \"4c351898-36b3-45bb-8877-20950a9e5dd8\") " pod="openshift-must-gather-584bk/crc-debug-fcsvv" Mar 19 18:05:26 crc kubenswrapper[4918]: I0319 18:05:26.303492 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c351898-36b3-45bb-8877-20950a9e5dd8-host\") pod \"crc-debug-fcsvv\" (UID: \"4c351898-36b3-45bb-8877-20950a9e5dd8\") " pod="openshift-must-gather-584bk/crc-debug-fcsvv" Mar 19 18:05:26 crc kubenswrapper[4918]: I0319 18:05:26.405135 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/4c351898-36b3-45bb-8877-20950a9e5dd8-host\") pod \"crc-debug-fcsvv\" (UID: \"4c351898-36b3-45bb-8877-20950a9e5dd8\") " pod="openshift-must-gather-584bk/crc-debug-fcsvv" Mar 19 18:05:26 crc kubenswrapper[4918]: I0319 18:05:26.405304 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2p27\" (UniqueName: \"kubernetes.io/projected/4c351898-36b3-45bb-8877-20950a9e5dd8-kube-api-access-z2p27\") pod \"crc-debug-fcsvv\" (UID: \"4c351898-36b3-45bb-8877-20950a9e5dd8\") " pod="openshift-must-gather-584bk/crc-debug-fcsvv" Mar 19 18:05:26 crc kubenswrapper[4918]: I0319 18:05:26.405753 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c351898-36b3-45bb-8877-20950a9e5dd8-host\") pod \"crc-debug-fcsvv\" (UID: \"4c351898-36b3-45bb-8877-20950a9e5dd8\") " pod="openshift-must-gather-584bk/crc-debug-fcsvv" Mar 19 18:05:26 crc kubenswrapper[4918]: I0319 18:05:26.423223 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2p27\" (UniqueName: \"kubernetes.io/projected/4c351898-36b3-45bb-8877-20950a9e5dd8-kube-api-access-z2p27\") pod \"crc-debug-fcsvv\" (UID: \"4c351898-36b3-45bb-8877-20950a9e5dd8\") " pod="openshift-must-gather-584bk/crc-debug-fcsvv" Mar 19 18:05:26 crc kubenswrapper[4918]: I0319 18:05:26.497430 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-584bk/crc-debug-fcsvv" Mar 19 18:05:26 crc kubenswrapper[4918]: I0319 18:05:26.600882 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9729f15b-b235-4670-9431-ec8a60b39e4d" path="/var/lib/kubelet/pods/9729f15b-b235-4670-9431-ec8a60b39e4d/volumes" Mar 19 18:05:26 crc kubenswrapper[4918]: I0319 18:05:26.673916 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-584bk/crc-debug-fcsvv" event={"ID":"4c351898-36b3-45bb-8877-20950a9e5dd8","Type":"ContainerStarted","Data":"69cd78727d94fa8f273f8ccd0b12cc452af3bc0bd2a9d5ba5edc4ace85499326"} Mar 19 18:05:27 crc kubenswrapper[4918]: I0319 18:05:27.684909 4918 generic.go:334] "Generic (PLEG): container finished" podID="4c351898-36b3-45bb-8877-20950a9e5dd8" containerID="c25afef1fd7048615aa092eda2c99efd012cd097ca8e64396a8b0a0d98bae895" exitCode=0 Mar 19 18:05:27 crc kubenswrapper[4918]: I0319 18:05:27.686437 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-584bk/crc-debug-fcsvv" event={"ID":"4c351898-36b3-45bb-8877-20950a9e5dd8","Type":"ContainerDied","Data":"c25afef1fd7048615aa092eda2c99efd012cd097ca8e64396a8b0a0d98bae895"} Mar 19 18:05:28 crc kubenswrapper[4918]: I0319 18:05:28.838648 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-584bk/crc-debug-fcsvv" Mar 19 18:05:28 crc kubenswrapper[4918]: I0319 18:05:28.961574 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2p27\" (UniqueName: \"kubernetes.io/projected/4c351898-36b3-45bb-8877-20950a9e5dd8-kube-api-access-z2p27\") pod \"4c351898-36b3-45bb-8877-20950a9e5dd8\" (UID: \"4c351898-36b3-45bb-8877-20950a9e5dd8\") " Mar 19 18:05:28 crc kubenswrapper[4918]: I0319 18:05:28.961697 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c351898-36b3-45bb-8877-20950a9e5dd8-host\") pod \"4c351898-36b3-45bb-8877-20950a9e5dd8\" (UID: \"4c351898-36b3-45bb-8877-20950a9e5dd8\") " Mar 19 18:05:28 crc kubenswrapper[4918]: I0319 18:05:28.962063 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c351898-36b3-45bb-8877-20950a9e5dd8-host" (OuterVolumeSpecName: "host") pod "4c351898-36b3-45bb-8877-20950a9e5dd8" (UID: "4c351898-36b3-45bb-8877-20950a9e5dd8"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 18:05:28 crc kubenswrapper[4918]: I0319 18:05:28.962589 4918 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c351898-36b3-45bb-8877-20950a9e5dd8-host\") on node \"crc\" DevicePath \"\"" Mar 19 18:05:28 crc kubenswrapper[4918]: I0319 18:05:28.982501 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c351898-36b3-45bb-8877-20950a9e5dd8-kube-api-access-z2p27" (OuterVolumeSpecName: "kube-api-access-z2p27") pod "4c351898-36b3-45bb-8877-20950a9e5dd8" (UID: "4c351898-36b3-45bb-8877-20950a9e5dd8"). InnerVolumeSpecName "kube-api-access-z2p27". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:05:29 crc kubenswrapper[4918]: I0319 18:05:29.064030 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2p27\" (UniqueName: \"kubernetes.io/projected/4c351898-36b3-45bb-8877-20950a9e5dd8-kube-api-access-z2p27\") on node \"crc\" DevicePath \"\"" Mar 19 18:05:29 crc kubenswrapper[4918]: I0319 18:05:29.355278 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-584bk/crc-debug-fcsvv"] Mar 19 18:05:29 crc kubenswrapper[4918]: I0319 18:05:29.366337 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-584bk/crc-debug-fcsvv"] Mar 19 18:05:29 crc kubenswrapper[4918]: I0319 18:05:29.710221 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69cd78727d94fa8f273f8ccd0b12cc452af3bc0bd2a9d5ba5edc4ace85499326" Mar 19 18:05:29 crc kubenswrapper[4918]: I0319 18:05:29.710278 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-584bk/crc-debug-fcsvv" Mar 19 18:05:30 crc kubenswrapper[4918]: I0319 18:05:30.587606 4918 scope.go:117] "RemoveContainer" containerID="9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8" Mar 19 18:05:30 crc kubenswrapper[4918]: E0319 18:05:30.587855 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:05:30 crc kubenswrapper[4918]: I0319 18:05:30.603436 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c351898-36b3-45bb-8877-20950a9e5dd8" path="/var/lib/kubelet/pods/4c351898-36b3-45bb-8877-20950a9e5dd8/volumes" Mar 19 18:05:31 crc kubenswrapper[4918]: I0319 18:05:31.144454 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-584bk/crc-debug-q6m2s"] Mar 19 18:05:31 crc kubenswrapper[4918]: E0319 18:05:31.145223 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c351898-36b3-45bb-8877-20950a9e5dd8" containerName="container-00" Mar 19 18:05:31 crc kubenswrapper[4918]: I0319 18:05:31.145242 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c351898-36b3-45bb-8877-20950a9e5dd8" containerName="container-00" Mar 19 18:05:31 crc kubenswrapper[4918]: I0319 18:05:31.145480 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c351898-36b3-45bb-8877-20950a9e5dd8" containerName="container-00" Mar 19 18:05:31 crc kubenswrapper[4918]: I0319 18:05:31.146220 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-584bk/crc-debug-q6m2s" Mar 19 18:05:31 crc kubenswrapper[4918]: I0319 18:05:31.313987 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/720bd256-070a-424a-be39-5417fdb14ded-host\") pod \"crc-debug-q6m2s\" (UID: \"720bd256-070a-424a-be39-5417fdb14ded\") " pod="openshift-must-gather-584bk/crc-debug-q6m2s" Mar 19 18:05:31 crc kubenswrapper[4918]: I0319 18:05:31.314125 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbhxg\" (UniqueName: \"kubernetes.io/projected/720bd256-070a-424a-be39-5417fdb14ded-kube-api-access-vbhxg\") pod \"crc-debug-q6m2s\" (UID: \"720bd256-070a-424a-be39-5417fdb14ded\") " pod="openshift-must-gather-584bk/crc-debug-q6m2s" Mar 19 18:05:31 crc kubenswrapper[4918]: I0319 18:05:31.416106 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/720bd256-070a-424a-be39-5417fdb14ded-host\") pod \"crc-debug-q6m2s\" (UID: \"720bd256-070a-424a-be39-5417fdb14ded\") " pod="openshift-must-gather-584bk/crc-debug-q6m2s" Mar 19 18:05:31 crc kubenswrapper[4918]: I0319 18:05:31.416240 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbhxg\" (UniqueName: \"kubernetes.io/projected/720bd256-070a-424a-be39-5417fdb14ded-kube-api-access-vbhxg\") pod \"crc-debug-q6m2s\" (UID: \"720bd256-070a-424a-be39-5417fdb14ded\") " pod="openshift-must-gather-584bk/crc-debug-q6m2s" Mar 19 18:05:31 crc kubenswrapper[4918]: I0319 18:05:31.416265 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/720bd256-070a-424a-be39-5417fdb14ded-host\") pod \"crc-debug-q6m2s\" (UID: \"720bd256-070a-424a-be39-5417fdb14ded\") " pod="openshift-must-gather-584bk/crc-debug-q6m2s" Mar 19 18:05:31 crc 
kubenswrapper[4918]: I0319 18:05:31.442284 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbhxg\" (UniqueName: \"kubernetes.io/projected/720bd256-070a-424a-be39-5417fdb14ded-kube-api-access-vbhxg\") pod \"crc-debug-q6m2s\" (UID: \"720bd256-070a-424a-be39-5417fdb14ded\") " pod="openshift-must-gather-584bk/crc-debug-q6m2s" Mar 19 18:05:31 crc kubenswrapper[4918]: I0319 18:05:31.465947 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-584bk/crc-debug-q6m2s" Mar 19 18:05:31 crc kubenswrapper[4918]: W0319 18:05:31.498806 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod720bd256_070a_424a_be39_5417fdb14ded.slice/crio-9a35fc075a97152b99f3877b87c39d2f2ffc11d0f46e545c563112d9760a1ccb WatchSource:0}: Error finding container 9a35fc075a97152b99f3877b87c39d2f2ffc11d0f46e545c563112d9760a1ccb: Status 404 returned error can't find the container with id 9a35fc075a97152b99f3877b87c39d2f2ffc11d0f46e545c563112d9760a1ccb Mar 19 18:05:31 crc kubenswrapper[4918]: I0319 18:05:31.738603 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-584bk/crc-debug-q6m2s" event={"ID":"720bd256-070a-424a-be39-5417fdb14ded","Type":"ContainerStarted","Data":"9a35fc075a97152b99f3877b87c39d2f2ffc11d0f46e545c563112d9760a1ccb"} Mar 19 18:05:32 crc kubenswrapper[4918]: I0319 18:05:32.755034 4918 generic.go:334] "Generic (PLEG): container finished" podID="720bd256-070a-424a-be39-5417fdb14ded" containerID="4dd53605cb6989e134fc1a8e0a3e32938ee05d6831fa3f4a0144d55717d0de62" exitCode=0 Mar 19 18:05:32 crc kubenswrapper[4918]: I0319 18:05:32.755135 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-584bk/crc-debug-q6m2s" event={"ID":"720bd256-070a-424a-be39-5417fdb14ded","Type":"ContainerDied","Data":"4dd53605cb6989e134fc1a8e0a3e32938ee05d6831fa3f4a0144d55717d0de62"} Mar 19 
18:05:32 crc kubenswrapper[4918]: I0319 18:05:32.796785 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-584bk/crc-debug-q6m2s"] Mar 19 18:05:32 crc kubenswrapper[4918]: I0319 18:05:32.805568 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-584bk/crc-debug-q6m2s"] Mar 19 18:05:33 crc kubenswrapper[4918]: I0319 18:05:33.868427 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-584bk/crc-debug-q6m2s" Mar 19 18:05:33 crc kubenswrapper[4918]: I0319 18:05:33.976082 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/720bd256-070a-424a-be39-5417fdb14ded-host\") pod \"720bd256-070a-424a-be39-5417fdb14ded\" (UID: \"720bd256-070a-424a-be39-5417fdb14ded\") " Mar 19 18:05:33 crc kubenswrapper[4918]: I0319 18:05:33.976351 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbhxg\" (UniqueName: \"kubernetes.io/projected/720bd256-070a-424a-be39-5417fdb14ded-kube-api-access-vbhxg\") pod \"720bd256-070a-424a-be39-5417fdb14ded\" (UID: \"720bd256-070a-424a-be39-5417fdb14ded\") " Mar 19 18:05:33 crc kubenswrapper[4918]: I0319 18:05:33.978032 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/720bd256-070a-424a-be39-5417fdb14ded-host" (OuterVolumeSpecName: "host") pod "720bd256-070a-424a-be39-5417fdb14ded" (UID: "720bd256-070a-424a-be39-5417fdb14ded"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 18:05:33 crc kubenswrapper[4918]: I0319 18:05:33.982062 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/720bd256-070a-424a-be39-5417fdb14ded-kube-api-access-vbhxg" (OuterVolumeSpecName: "kube-api-access-vbhxg") pod "720bd256-070a-424a-be39-5417fdb14ded" (UID: "720bd256-070a-424a-be39-5417fdb14ded"). InnerVolumeSpecName "kube-api-access-vbhxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:05:34 crc kubenswrapper[4918]: I0319 18:05:34.079002 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbhxg\" (UniqueName: \"kubernetes.io/projected/720bd256-070a-424a-be39-5417fdb14ded-kube-api-access-vbhxg\") on node \"crc\" DevicePath \"\"" Mar 19 18:05:34 crc kubenswrapper[4918]: I0319 18:05:34.079231 4918 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/720bd256-070a-424a-be39-5417fdb14ded-host\") on node \"crc\" DevicePath \"\"" Mar 19 18:05:34 crc kubenswrapper[4918]: I0319 18:05:34.597689 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="720bd256-070a-424a-be39-5417fdb14ded" path="/var/lib/kubelet/pods/720bd256-070a-424a-be39-5417fdb14ded/volumes" Mar 19 18:05:34 crc kubenswrapper[4918]: I0319 18:05:34.772898 4918 scope.go:117] "RemoveContainer" containerID="4dd53605cb6989e134fc1a8e0a3e32938ee05d6831fa3f4a0144d55717d0de62" Mar 19 18:05:34 crc kubenswrapper[4918]: I0319 18:05:34.772950 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-584bk/crc-debug-q6m2s" Mar 19 18:05:42 crc kubenswrapper[4918]: I0319 18:05:42.593033 4918 scope.go:117] "RemoveContainer" containerID="9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8" Mar 19 18:05:42 crc kubenswrapper[4918]: E0319 18:05:42.593706 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:05:54 crc kubenswrapper[4918]: I0319 18:05:54.586256 4918 scope.go:117] "RemoveContainer" containerID="9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8" Mar 19 18:05:54 crc kubenswrapper[4918]: E0319 18:05:54.587863 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:06:00 crc kubenswrapper[4918]: I0319 18:06:00.152825 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565726-npqlb"] Mar 19 18:06:00 crc kubenswrapper[4918]: E0319 18:06:00.156809 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="720bd256-070a-424a-be39-5417fdb14ded" containerName="container-00" Mar 19 18:06:00 crc kubenswrapper[4918]: I0319 18:06:00.156834 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="720bd256-070a-424a-be39-5417fdb14ded" containerName="container-00" Mar 19 18:06:00 
crc kubenswrapper[4918]: I0319 18:06:00.157133 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="720bd256-070a-424a-be39-5417fdb14ded" containerName="container-00" Mar 19 18:06:00 crc kubenswrapper[4918]: I0319 18:06:00.157856 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565726-npqlb" Mar 19 18:06:00 crc kubenswrapper[4918]: I0319 18:06:00.165476 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 18:06:00 crc kubenswrapper[4918]: I0319 18:06:00.165762 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 18:06:00 crc kubenswrapper[4918]: I0319 18:06:00.165982 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 18:06:00 crc kubenswrapper[4918]: I0319 18:06:00.201240 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565726-npqlb"] Mar 19 18:06:00 crc kubenswrapper[4918]: I0319 18:06:00.290904 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbx9r\" (UniqueName: \"kubernetes.io/projected/a8da44d7-a874-4005-aaaa-4c8a19cc30fa-kube-api-access-dbx9r\") pod \"auto-csr-approver-29565726-npqlb\" (UID: \"a8da44d7-a874-4005-aaaa-4c8a19cc30fa\") " pod="openshift-infra/auto-csr-approver-29565726-npqlb" Mar 19 18:06:00 crc kubenswrapper[4918]: I0319 18:06:00.393173 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbx9r\" (UniqueName: \"kubernetes.io/projected/a8da44d7-a874-4005-aaaa-4c8a19cc30fa-kube-api-access-dbx9r\") pod \"auto-csr-approver-29565726-npqlb\" (UID: \"a8da44d7-a874-4005-aaaa-4c8a19cc30fa\") " pod="openshift-infra/auto-csr-approver-29565726-npqlb" Mar 19 18:06:00 crc kubenswrapper[4918]: I0319 18:06:00.796052 
4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbx9r\" (UniqueName: \"kubernetes.io/projected/a8da44d7-a874-4005-aaaa-4c8a19cc30fa-kube-api-access-dbx9r\") pod \"auto-csr-approver-29565726-npqlb\" (UID: \"a8da44d7-a874-4005-aaaa-4c8a19cc30fa\") " pod="openshift-infra/auto-csr-approver-29565726-npqlb" Mar 19 18:06:01 crc kubenswrapper[4918]: I0319 18:06:01.081972 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565726-npqlb" Mar 19 18:06:01 crc kubenswrapper[4918]: I0319 18:06:01.956814 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565726-npqlb"] Mar 19 18:06:01 crc kubenswrapper[4918]: I0319 18:06:01.975858 4918 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 18:06:02 crc kubenswrapper[4918]: I0319 18:06:02.027776 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565726-npqlb" event={"ID":"a8da44d7-a874-4005-aaaa-4c8a19cc30fa","Type":"ContainerStarted","Data":"63284e2e7d451dfcc3611eba3da5412d6161061ba7c46efc1c59d09f7f9e622c"} Mar 19 18:06:05 crc kubenswrapper[4918]: I0319 18:06:05.066123 4918 generic.go:334] "Generic (PLEG): container finished" podID="a8da44d7-a874-4005-aaaa-4c8a19cc30fa" containerID="be6fa658f12f1aa21c55c400959f1e9a43844c92b20936e94e2c0a8c3a133c38" exitCode=0 Mar 19 18:06:05 crc kubenswrapper[4918]: I0319 18:06:05.066310 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565726-npqlb" event={"ID":"a8da44d7-a874-4005-aaaa-4c8a19cc30fa","Type":"ContainerDied","Data":"be6fa658f12f1aa21c55c400959f1e9a43844c92b20936e94e2c0a8c3a133c38"} Mar 19 18:06:06 crc kubenswrapper[4918]: I0319 18:06:06.588510 4918 scope.go:117] "RemoveContainer" containerID="9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8" Mar 19 18:06:06 crc 
kubenswrapper[4918]: E0319 18:06:06.589068 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:06:07 crc kubenswrapper[4918]: I0319 18:06:07.236680 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565726-npqlb" Mar 19 18:06:07 crc kubenswrapper[4918]: I0319 18:06:07.362057 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbx9r\" (UniqueName: \"kubernetes.io/projected/a8da44d7-a874-4005-aaaa-4c8a19cc30fa-kube-api-access-dbx9r\") pod \"a8da44d7-a874-4005-aaaa-4c8a19cc30fa\" (UID: \"a8da44d7-a874-4005-aaaa-4c8a19cc30fa\") " Mar 19 18:06:07 crc kubenswrapper[4918]: I0319 18:06:07.379558 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8da44d7-a874-4005-aaaa-4c8a19cc30fa-kube-api-access-dbx9r" (OuterVolumeSpecName: "kube-api-access-dbx9r") pod "a8da44d7-a874-4005-aaaa-4c8a19cc30fa" (UID: "a8da44d7-a874-4005-aaaa-4c8a19cc30fa"). InnerVolumeSpecName "kube-api-access-dbx9r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:06:07 crc kubenswrapper[4918]: I0319 18:06:07.464990 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbx9r\" (UniqueName: \"kubernetes.io/projected/a8da44d7-a874-4005-aaaa-4c8a19cc30fa-kube-api-access-dbx9r\") on node \"crc\" DevicePath \"\"" Mar 19 18:06:08 crc kubenswrapper[4918]: I0319 18:06:08.093211 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565726-npqlb" event={"ID":"a8da44d7-a874-4005-aaaa-4c8a19cc30fa","Type":"ContainerDied","Data":"63284e2e7d451dfcc3611eba3da5412d6161061ba7c46efc1c59d09f7f9e622c"} Mar 19 18:06:08 crc kubenswrapper[4918]: I0319 18:06:08.093539 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63284e2e7d451dfcc3611eba3da5412d6161061ba7c46efc1c59d09f7f9e622c" Mar 19 18:06:08 crc kubenswrapper[4918]: I0319 18:06:08.093257 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565726-npqlb" Mar 19 18:06:08 crc kubenswrapper[4918]: I0319 18:06:08.318934 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565720-4pr9h"] Mar 19 18:06:08 crc kubenswrapper[4918]: I0319 18:06:08.330136 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565720-4pr9h"] Mar 19 18:06:08 crc kubenswrapper[4918]: I0319 18:06:08.605261 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d390c826-1356-447f-b1da-72f57e7accbc" path="/var/lib/kubelet/pods/d390c826-1356-447f-b1da-72f57e7accbc/volumes" Mar 19 18:06:16 crc kubenswrapper[4918]: I0319 18:06:16.240144 4918 scope.go:117] "RemoveContainer" containerID="e76f4b19dcc4df6683b8faa6aa2471c0b84b2c8416b224f2c04e6e26abfdba89" Mar 19 18:06:20 crc kubenswrapper[4918]: I0319 18:06:20.586833 4918 scope.go:117] "RemoveContainer" 
containerID="9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8" Mar 19 18:06:20 crc kubenswrapper[4918]: E0319 18:06:20.587726 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:06:35 crc kubenswrapper[4918]: I0319 18:06:35.587176 4918 scope.go:117] "RemoveContainer" containerID="9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8" Mar 19 18:06:35 crc kubenswrapper[4918]: E0319 18:06:35.587862 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:06:40 crc kubenswrapper[4918]: I0319 18:06:40.005084 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_fd357519-ae6b-45ec-a8e1-dfc0c060be13/init-config-reloader/0.log" Mar 19 18:06:40 crc kubenswrapper[4918]: I0319 18:06:40.400186 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_fd357519-ae6b-45ec-a8e1-dfc0c060be13/alertmanager/0.log" Mar 19 18:06:40 crc kubenswrapper[4918]: I0319 18:06:40.424871 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_fd357519-ae6b-45ec-a8e1-dfc0c060be13/init-config-reloader/0.log" Mar 19 18:06:40 crc kubenswrapper[4918]: I0319 18:06:40.510230 4918 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_fd357519-ae6b-45ec-a8e1-dfc0c060be13/config-reloader/0.log" Mar 19 18:06:40 crc kubenswrapper[4918]: I0319 18:06:40.731177 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-678fb97f86-hlhbk_af512bdc-58dc-481d-a454-821bcb84d090/barbican-api/0.log" Mar 19 18:06:40 crc kubenswrapper[4918]: I0319 18:06:40.801066 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-678fb97f86-hlhbk_af512bdc-58dc-481d-a454-821bcb84d090/barbican-api-log/0.log" Mar 19 18:06:41 crc kubenswrapper[4918]: I0319 18:06:41.296441 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5c845594d6-ndb6n_0e6ac727-0f36-47d5-a77d-e21f590089e1/barbican-keystone-listener/0.log" Mar 19 18:06:41 crc kubenswrapper[4918]: I0319 18:06:41.413448 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5c845594d6-ndb6n_0e6ac727-0f36-47d5-a77d-e21f590089e1/barbican-keystone-listener-log/0.log" Mar 19 18:06:41 crc kubenswrapper[4918]: I0319 18:06:41.437911 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6649d8ff59-m8z7j_1552f8f6-8143-42f2-882b-acead175ae14/barbican-worker/0.log" Mar 19 18:06:41 crc kubenswrapper[4918]: I0319 18:06:41.746412 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6649d8ff59-m8z7j_1552f8f6-8143-42f2-882b-acead175ae14/barbican-worker-log/0.log" Mar 19 18:06:41 crc kubenswrapper[4918]: I0319 18:06:41.792557 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-4g2mm_63b801eb-41a1-4d19-933b-e098bedd9e93/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 18:06:42 crc kubenswrapper[4918]: I0319 18:06:42.497743 4918 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_b0189fc1-60b5-4734-a4b2-aa1714795f50/ceilometer-notification-agent/0.log" Mar 19 18:06:42 crc kubenswrapper[4918]: I0319 18:06:42.615452 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b0189fc1-60b5-4734-a4b2-aa1714795f50/proxy-httpd/0.log" Mar 19 18:06:42 crc kubenswrapper[4918]: I0319 18:06:42.650574 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b0189fc1-60b5-4734-a4b2-aa1714795f50/ceilometer-central-agent/0.log" Mar 19 18:06:42 crc kubenswrapper[4918]: I0319 18:06:42.995225 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_de4932f1-1f41-4512-9dd6-408a095de14a/cinder-api/0.log" Mar 19 18:06:43 crc kubenswrapper[4918]: I0319 18:06:43.080182 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b0189fc1-60b5-4734-a4b2-aa1714795f50/sg-core/0.log" Mar 19 18:06:43 crc kubenswrapper[4918]: I0319 18:06:43.207684 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_de4932f1-1f41-4512-9dd6-408a095de14a/cinder-api-log/0.log" Mar 19 18:06:43 crc kubenswrapper[4918]: I0319 18:06:43.427712 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_c7303214-658d-4763-a3d6-cffd5025d9d4/cinder-scheduler/0.log" Mar 19 18:06:43 crc kubenswrapper[4918]: I0319 18:06:43.463016 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_c7303214-658d-4763-a3d6-cffd5025d9d4/probe/0.log" Mar 19 18:06:44 crc kubenswrapper[4918]: I0319 18:06:44.135136 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_52bf6e94-a8b5-406a-a69e-39b883fa847d/cloudkitty-api/0.log" Mar 19 18:06:44 crc kubenswrapper[4918]: I0319 18:06:44.145364 4918 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cloudkitty-api-0_52bf6e94-a8b5-406a-a69e-39b883fa847d/cloudkitty-api-log/0.log" Mar 19 18:06:44 crc kubenswrapper[4918]: I0319 18:06:44.609487 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-distributor-5d547bbd4d-mnzd4_6c3e0b77-c556-4efa-91ba-b27926b39aa8/loki-distributor/0.log" Mar 19 18:06:44 crc kubenswrapper[4918]: I0319 18:06:44.721179 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-compactor-0_70a17e2e-15ff-4992-882c-b626fc8b94b6/loki-compactor/0.log" Mar 19 18:06:45 crc kubenswrapper[4918]: I0319 18:06:45.115964 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-6b884dc4b5-55g7d_fdeccb80-0736-4fb2-b8e9-17a7317865cb/gateway/0.log" Mar 19 18:06:45 crc kubenswrapper[4918]: I0319 18:06:45.166282 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-6b884dc4b5-cmv48_6e4b521b-2c5e-466f-8c30-881de9b09a1b/gateway/0.log" Mar 19 18:06:45 crc kubenswrapper[4918]: I0319 18:06:45.601013 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-index-gateway-0_2d2b0346-1ed3-4754-9788-e4f469a558e9/loki-index-gateway/0.log" Mar 19 18:06:45 crc kubenswrapper[4918]: I0319 18:06:45.967568 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-ingester-0_142e9778-542e-491b-95f2-8a63e76c4271/loki-ingester/0.log" Mar 19 18:06:46 crc kubenswrapper[4918]: I0319 18:06:46.706411 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-query-frontend-6f54889599-ljlbj_35defcbc-2979-46e0-8f03-e1cc89f7fd86/loki-query-frontend/0.log" Mar 19 18:06:46 crc kubenswrapper[4918]: I0319 18:06:46.926809 4918 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cloudkitty-lokistack-querier-668f98fdd7-xzqfn_2c1cd5b2-9500-4b75-bfd4-c99a4f8c2089/loki-querier/0.log" Mar 19 18:06:47 crc kubenswrapper[4918]: I0319 18:06:47.336001 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-n8g5h_82bf14ac-828e-41e8-987c-bb83598d73a5/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 18:06:47 crc kubenswrapper[4918]: I0319 18:06:47.529760 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-mfw9l_8e658b37-8529-4e6f-adbc-0974b7957e57/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 18:06:47 crc kubenswrapper[4918]: I0319 18:06:47.587249 4918 scope.go:117] "RemoveContainer" containerID="9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8" Mar 19 18:06:47 crc kubenswrapper[4918]: E0319 18:06:47.587503 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:06:47 crc kubenswrapper[4918]: I0319 18:06:47.846647 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-ktkwd_d431095e-2467-47c8-8288-ef25e31bed1e/init/0.log" Mar 19 18:06:48 crc kubenswrapper[4918]: I0319 18:06:48.488290 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-ktkwd_d431095e-2467-47c8-8288-ef25e31bed1e/dnsmasq-dns/0.log" Mar 19 18:06:48 crc kubenswrapper[4918]: I0319 18:06:48.569180 4918 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-szqtt_04af1485-802e-4821-a499-683301ee97ff/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 18:06:48 crc kubenswrapper[4918]: I0319 18:06:48.613956 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-ktkwd_d431095e-2467-47c8-8288-ef25e31bed1e/init/0.log" Mar 19 18:06:48 crc kubenswrapper[4918]: I0319 18:06:48.948015 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fd884183-4c9b-4bc9-94f9-7fc63ddfd344/glance-log/0.log" Mar 19 18:06:49 crc kubenswrapper[4918]: I0319 18:06:49.232065 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fd884183-4c9b-4bc9-94f9-7fc63ddfd344/glance-httpd/0.log" Mar 19 18:06:49 crc kubenswrapper[4918]: I0319 18:06:49.453050 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-proc-0_bdb2b1c6-814d-466f-a51f-07b6440ac7ea/cloudkitty-proc/0.log" Mar 19 18:06:49 crc kubenswrapper[4918]: I0319 18:06:49.683046 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_69dd59cf-aece-4caa-a2d1-85dfcc8c3306/glance-log/0.log" Mar 19 18:06:49 crc kubenswrapper[4918]: I0319 18:06:49.877199 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_69dd59cf-aece-4caa-a2d1-85dfcc8c3306/glance-httpd/0.log" Mar 19 18:06:49 crc kubenswrapper[4918]: I0319 18:06:49.940194 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-gbbmx_10b0ee10-0449-4ca7-bece-7942e3bf9f86/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 18:06:50 crc kubenswrapper[4918]: I0319 18:06:50.151515 4918 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-nzw8c_6a5a8faa-ca9b-4aa7-aa51-8605063466d5/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 18:06:50 crc kubenswrapper[4918]: I0319 18:06:50.606840 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-568c4fd78c-t5k2q_156aeae6-d08f-48d3-a43a-63edfaad7860/keystone-api/0.log" Mar 19 18:06:50 crc kubenswrapper[4918]: I0319 18:06:50.781912 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29565661-7qtw4_6d4f01d1-2728-4858-a39d-6b44e675aca5/keystone-cron/0.log" Mar 19 18:06:50 crc kubenswrapper[4918]: I0319 18:06:50.969011 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29565721-vvmng_a4a52a09-c672-45bd-b5ed-f5794ac59da9/keystone-cron/0.log" Mar 19 18:06:51 crc kubenswrapper[4918]: I0319 18:06:51.166190 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_e77f7369-22cd-4f5b-afb8-132004eb811f/kube-state-metrics/0.log" Mar 19 18:06:51 crc kubenswrapper[4918]: I0319 18:06:51.538392 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-9lxcx_9cc96cf8-b975-4f49-8032-bb1d31580e7b/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 18:06:52 crc kubenswrapper[4918]: I0319 18:06:52.171152 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-54486b455c-k7jwz_f2211c01-104a-4847-8c59-bc11ff34169f/neutron-httpd/0.log" Mar 19 18:06:52 crc kubenswrapper[4918]: I0319 18:06:52.204219 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-54486b455c-k7jwz_f2211c01-104a-4847-8c59-bc11ff34169f/neutron-api/0.log" Mar 19 18:06:53 crc kubenswrapper[4918]: I0319 18:06:53.001485 4918 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-8zbvt_e2200bcd-af51-40ea-aef4-e601b73f6f78/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 18:06:53 crc kubenswrapper[4918]: I0319 18:06:53.556567 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_c14a1ce8-e827-4652-9a21-43d9cbcbac47/nova-api-log/0.log" Mar 19 18:06:53 crc kubenswrapper[4918]: I0319 18:06:53.757734 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_05a9c491-28ae-4d91-8366-7e449bbf8d8e/nova-cell0-conductor-conductor/0.log" Mar 19 18:06:53 crc kubenswrapper[4918]: I0319 18:06:53.985351 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_c14a1ce8-e827-4652-9a21-43d9cbcbac47/nova-api-api/0.log" Mar 19 18:06:54 crc kubenswrapper[4918]: I0319 18:06:54.115709 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_be298c7d-0f1b-44c9-ac1f-3e2accac7bdc/nova-cell1-conductor-conductor/0.log" Mar 19 18:06:54 crc kubenswrapper[4918]: I0319 18:06:54.437375 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_f8c29b03-6e93-446c-911f-af6e2e3ca36b/nova-cell1-novncproxy-novncproxy/0.log" Mar 19 18:06:54 crc kubenswrapper[4918]: I0319 18:06:54.875184 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2491f737-beec-4148-b143-1c83527b477a/nova-metadata-log/0.log" Mar 19 18:06:55 crc kubenswrapper[4918]: I0319 18:06:55.082036 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-cbbl5_e21d159f-fdc3-48bf-b40b-5bda64316b5e/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 18:06:55 crc kubenswrapper[4918]: I0319 18:06:55.327626 4918 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-scheduler-0_602c7fd6-d47b-4b05-880b-7a03afb02c49/nova-scheduler-scheduler/0.log" Mar 19 18:06:55 crc kubenswrapper[4918]: I0319 18:06:55.537614 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2491f737-beec-4148-b143-1c83527b477a/nova-metadata-metadata/0.log" Mar 19 18:06:56 crc kubenswrapper[4918]: I0319 18:06:56.092540 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_31a9da2c-83a7-408e-bae2-66a7097081ff/mysql-bootstrap/0.log" Mar 19 18:06:56 crc kubenswrapper[4918]: I0319 18:06:56.098279 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_31a9da2c-83a7-408e-bae2-66a7097081ff/galera/0.log" Mar 19 18:06:56 crc kubenswrapper[4918]: I0319 18:06:56.167980 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_31a9da2c-83a7-408e-bae2-66a7097081ff/mysql-bootstrap/0.log" Mar 19 18:06:56 crc kubenswrapper[4918]: I0319 18:06:56.480175 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_22f16181-1900-453e-a97a-d3da7960a1cf/mysql-bootstrap/0.log" Mar 19 18:06:56 crc kubenswrapper[4918]: I0319 18:06:56.957787 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_22f16181-1900-453e-a97a-d3da7960a1cf/mysql-bootstrap/0.log" Mar 19 18:06:57 crc kubenswrapper[4918]: I0319 18:06:57.019758 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_22f16181-1900-453e-a97a-d3da7960a1cf/galera/0.log" Mar 19 18:06:57 crc kubenswrapper[4918]: I0319 18:06:57.044878 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_789c957f-feb3-4f8c-83fa-3524740a2c8d/openstackclient/0.log" Mar 19 18:06:57 crc kubenswrapper[4918]: I0319 18:06:57.287970 4918 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-4g569_ddfeeb53-dd69-430f-9460-fa20627d4d26/ovn-controller/0.log" Mar 19 18:06:57 crc kubenswrapper[4918]: I0319 18:06:57.402102 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-q7kqm_9fa74868-5691-4d60-8d10-3e8dc1ddc776/openstack-network-exporter/0.log" Mar 19 18:06:57 crc kubenswrapper[4918]: I0319 18:06:57.733599 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kt2zs_13525212-7d91-453f-a80d-2e6a8febb21e/ovsdb-server-init/0.log" Mar 19 18:06:58 crc kubenswrapper[4918]: I0319 18:06:58.328373 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kt2zs_13525212-7d91-453f-a80d-2e6a8febb21e/ovs-vswitchd/0.log" Mar 19 18:06:58 crc kubenswrapper[4918]: I0319 18:06:58.354655 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kt2zs_13525212-7d91-453f-a80d-2e6a8febb21e/ovsdb-server-init/0.log" Mar 19 18:06:58 crc kubenswrapper[4918]: I0319 18:06:58.478078 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kt2zs_13525212-7d91-453f-a80d-2e6a8febb21e/ovsdb-server/0.log" Mar 19 18:06:58 crc kubenswrapper[4918]: I0319 18:06:58.613101 4918 scope.go:117] "RemoveContainer" containerID="9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8" Mar 19 18:06:58 crc kubenswrapper[4918]: E0319 18:06:58.613335 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:06:58 crc kubenswrapper[4918]: I0319 18:06:58.869921 4918 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-4smlj_28c0e0ec-e1b6-4937-89dc-f09d42d97bd3/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 18:06:58 crc kubenswrapper[4918]: I0319 18:06:58.873873 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e2b5f782-92ae-4f60-8d61-198e3008e01c/openstack-network-exporter/0.log" Mar 19 18:06:59 crc kubenswrapper[4918]: I0319 18:06:59.453997 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_5e1232f6-b41e-443e-b96e-e38929f077d4/openstack-network-exporter/0.log" Mar 19 18:06:59 crc kubenswrapper[4918]: I0319 18:06:59.483864 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e2b5f782-92ae-4f60-8d61-198e3008e01c/ovn-northd/0.log" Mar 19 18:06:59 crc kubenswrapper[4918]: I0319 18:06:59.884135 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_5e1232f6-b41e-443e-b96e-e38929f077d4/ovsdbserver-nb/0.log" Mar 19 18:07:00 crc kubenswrapper[4918]: I0319 18:07:00.131024 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_fc6b3398-7f5f-4485-9826-fbb92f8f26e2/openstack-network-exporter/0.log" Mar 19 18:07:00 crc kubenswrapper[4918]: I0319 18:07:00.295015 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_fc6b3398-7f5f-4485-9826-fbb92f8f26e2/ovsdbserver-sb/0.log" Mar 19 18:07:01 crc kubenswrapper[4918]: I0319 18:07:01.530028 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-76f5474f44-brjsr_79a5f53b-d94a-405f-9ead-9d519f30a3dc/placement-api/0.log" Mar 19 18:07:01 crc kubenswrapper[4918]: I0319 18:07:01.891803 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-76f5474f44-brjsr_79a5f53b-d94a-405f-9ead-9d519f30a3dc/placement-log/0.log" Mar 19 18:07:01 crc kubenswrapper[4918]: I0319 18:07:01.933183 4918 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ce548ee3-59a9-46f9-8b00-06d380b17566/init-config-reloader/0.log" Mar 19 18:07:02 crc kubenswrapper[4918]: I0319 18:07:02.263637 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ce548ee3-59a9-46f9-8b00-06d380b17566/init-config-reloader/0.log" Mar 19 18:07:02 crc kubenswrapper[4918]: I0319 18:07:02.389100 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ce548ee3-59a9-46f9-8b00-06d380b17566/thanos-sidecar/0.log" Mar 19 18:07:02 crc kubenswrapper[4918]: I0319 18:07:02.402750 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ce548ee3-59a9-46f9-8b00-06d380b17566/config-reloader/0.log" Mar 19 18:07:02 crc kubenswrapper[4918]: I0319 18:07:02.484160 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ce548ee3-59a9-46f9-8b00-06d380b17566/prometheus/0.log" Mar 19 18:07:02 crc kubenswrapper[4918]: I0319 18:07:02.881748 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_025d722c-5115-4aae-bebd-3942f7da690d/setup-container/0.log" Mar 19 18:07:03 crc kubenswrapper[4918]: I0319 18:07:03.191946 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_025d722c-5115-4aae-bebd-3942f7da690d/rabbitmq/0.log" Mar 19 18:07:03 crc kubenswrapper[4918]: I0319 18:07:03.260649 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mdfws"] Mar 19 18:07:03 crc kubenswrapper[4918]: E0319 18:07:03.261050 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8da44d7-a874-4005-aaaa-4c8a19cc30fa" containerName="oc" Mar 19 18:07:03 crc kubenswrapper[4918]: I0319 18:07:03.261066 4918 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a8da44d7-a874-4005-aaaa-4c8a19cc30fa" containerName="oc" Mar 19 18:07:03 crc kubenswrapper[4918]: I0319 18:07:03.261371 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8da44d7-a874-4005-aaaa-4c8a19cc30fa" containerName="oc" Mar 19 18:07:03 crc kubenswrapper[4918]: I0319 18:07:03.264013 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mdfws" Mar 19 18:07:03 crc kubenswrapper[4918]: I0319 18:07:03.314662 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mdfws"] Mar 19 18:07:03 crc kubenswrapper[4918]: I0319 18:07:03.403725 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89dcb70d-1e51-4367-9470-5b6701016baf-utilities\") pod \"community-operators-mdfws\" (UID: \"89dcb70d-1e51-4367-9470-5b6701016baf\") " pod="openshift-marketplace/community-operators-mdfws" Mar 19 18:07:03 crc kubenswrapper[4918]: I0319 18:07:03.403787 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cksq4\" (UniqueName: \"kubernetes.io/projected/89dcb70d-1e51-4367-9470-5b6701016baf-kube-api-access-cksq4\") pod \"community-operators-mdfws\" (UID: \"89dcb70d-1e51-4367-9470-5b6701016baf\") " pod="openshift-marketplace/community-operators-mdfws" Mar 19 18:07:03 crc kubenswrapper[4918]: I0319 18:07:03.403858 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89dcb70d-1e51-4367-9470-5b6701016baf-catalog-content\") pod \"community-operators-mdfws\" (UID: \"89dcb70d-1e51-4367-9470-5b6701016baf\") " pod="openshift-marketplace/community-operators-mdfws" Mar 19 18:07:03 crc kubenswrapper[4918]: I0319 18:07:03.404225 4918 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_5cf3eb1c-8f65-4460-8283-dcdbe5d51e50/setup-container/0.log" Mar 19 18:07:03 crc kubenswrapper[4918]: I0319 18:07:03.419791 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_025d722c-5115-4aae-bebd-3942f7da690d/setup-container/0.log" Mar 19 18:07:03 crc kubenswrapper[4918]: I0319 18:07:03.505427 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89dcb70d-1e51-4367-9470-5b6701016baf-utilities\") pod \"community-operators-mdfws\" (UID: \"89dcb70d-1e51-4367-9470-5b6701016baf\") " pod="openshift-marketplace/community-operators-mdfws" Mar 19 18:07:03 crc kubenswrapper[4918]: I0319 18:07:03.505672 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cksq4\" (UniqueName: \"kubernetes.io/projected/89dcb70d-1e51-4367-9470-5b6701016baf-kube-api-access-cksq4\") pod \"community-operators-mdfws\" (UID: \"89dcb70d-1e51-4367-9470-5b6701016baf\") " pod="openshift-marketplace/community-operators-mdfws" Mar 19 18:07:03 crc kubenswrapper[4918]: I0319 18:07:03.505819 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89dcb70d-1e51-4367-9470-5b6701016baf-catalog-content\") pod \"community-operators-mdfws\" (UID: \"89dcb70d-1e51-4367-9470-5b6701016baf\") " pod="openshift-marketplace/community-operators-mdfws" Mar 19 18:07:03 crc kubenswrapper[4918]: I0319 18:07:03.506120 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89dcb70d-1e51-4367-9470-5b6701016baf-utilities\") pod \"community-operators-mdfws\" (UID: \"89dcb70d-1e51-4367-9470-5b6701016baf\") " pod="openshift-marketplace/community-operators-mdfws" Mar 19 18:07:03 crc kubenswrapper[4918]: I0319 18:07:03.507942 4918 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89dcb70d-1e51-4367-9470-5b6701016baf-catalog-content\") pod \"community-operators-mdfws\" (UID: \"89dcb70d-1e51-4367-9470-5b6701016baf\") " pod="openshift-marketplace/community-operators-mdfws" Mar 19 18:07:03 crc kubenswrapper[4918]: I0319 18:07:03.527790 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cksq4\" (UniqueName: \"kubernetes.io/projected/89dcb70d-1e51-4367-9470-5b6701016baf-kube-api-access-cksq4\") pod \"community-operators-mdfws\" (UID: \"89dcb70d-1e51-4367-9470-5b6701016baf\") " pod="openshift-marketplace/community-operators-mdfws" Mar 19 18:07:03 crc kubenswrapper[4918]: I0319 18:07:03.597596 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mdfws" Mar 19 18:07:04 crc kubenswrapper[4918]: I0319 18:07:04.002905 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5cf3eb1c-8f65-4460-8283-dcdbe5d51e50/setup-container/0.log" Mar 19 18:07:04 crc kubenswrapper[4918]: I0319 18:07:04.128879 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5cf3eb1c-8f65-4460-8283-dcdbe5d51e50/rabbitmq/0.log" Mar 19 18:07:04 crc kubenswrapper[4918]: I0319 18:07:04.409635 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-fsrpw_81bba2ec-df16-4e0c-8afd-8e3e872f17fa/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 18:07:04 crc kubenswrapper[4918]: I0319 18:07:04.650158 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mdfws"] Mar 19 18:07:04 crc kubenswrapper[4918]: I0319 18:07:04.718983 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mdfws" 
event={"ID":"89dcb70d-1e51-4367-9470-5b6701016baf","Type":"ContainerStarted","Data":"50bf1f569c2315f9a38ad6fcda8d807ace0ecd14b689cb5eda54c2d1b1f2d0d7"} Mar 19 18:07:04 crc kubenswrapper[4918]: I0319 18:07:04.792966 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-bkjnx_d97e8bcc-4a1e-45e6-88ef-3013c02d37a7/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 18:07:04 crc kubenswrapper[4918]: I0319 18:07:04.957633 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-ppjsg_0a3d3e76-de48-44f2-a34e-021196a21f5b/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 18:07:05 crc kubenswrapper[4918]: I0319 18:07:05.254381 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-pcjkk_8ce6590d-02e8-4f83-aa5a-0f328daf1c1e/ssh-known-hosts-edpm-deployment/0.log" Mar 19 18:07:05 crc kubenswrapper[4918]: I0319 18:07:05.365978 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-gn2zh_9a03a82a-25da-4509-a94d-26b2c686f8f3/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 18:07:05 crc kubenswrapper[4918]: I0319 18:07:05.731875 4918 generic.go:334] "Generic (PLEG): container finished" podID="89dcb70d-1e51-4367-9470-5b6701016baf" containerID="0698cea653d581e8dc4db0f591a377f6ddbf3c6db5cf0b4fdd8de9586926d999" exitCode=0 Mar 19 18:07:05 crc kubenswrapper[4918]: I0319 18:07:05.731916 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mdfws" event={"ID":"89dcb70d-1e51-4367-9470-5b6701016baf","Type":"ContainerDied","Data":"0698cea653d581e8dc4db0f591a377f6ddbf3c6db5cf0b4fdd8de9586926d999"} Mar 19 18:07:05 crc kubenswrapper[4918]: I0319 18:07:05.810773 4918 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-proxy-559c86bbbc-4zd54_867ef803-7e55-4f6e-83ff-94f3534387a9/proxy-server/0.log" Mar 19 18:07:05 crc kubenswrapper[4918]: I0319 18:07:05.846412 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-559c86bbbc-4zd54_867ef803-7e55-4f6e-83ff-94f3534387a9/proxy-httpd/0.log" Mar 19 18:07:06 crc kubenswrapper[4918]: I0319 18:07:06.021661 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-nmm4p_7b21f92d-7895-46c0-a66d-9e0aedb15e72/swift-ring-rebalance/0.log" Mar 19 18:07:06 crc kubenswrapper[4918]: I0319 18:07:06.218972 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e4118384-38ad-465d-a81e-62bf39cc6cec/account-auditor/0.log" Mar 19 18:07:06 crc kubenswrapper[4918]: I0319 18:07:06.294843 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_d0eab88c-d33a-4032-b2f7-f2a355157d81/memcached/0.log" Mar 19 18:07:06 crc kubenswrapper[4918]: I0319 18:07:06.358699 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e4118384-38ad-465d-a81e-62bf39cc6cec/account-reaper/0.log" Mar 19 18:07:06 crc kubenswrapper[4918]: I0319 18:07:06.411392 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e4118384-38ad-465d-a81e-62bf39cc6cec/account-replicator/0.log" Mar 19 18:07:06 crc kubenswrapper[4918]: I0319 18:07:06.710839 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e4118384-38ad-465d-a81e-62bf39cc6cec/account-server/0.log" Mar 19 18:07:06 crc kubenswrapper[4918]: I0319 18:07:06.789844 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e4118384-38ad-465d-a81e-62bf39cc6cec/container-auditor/0.log" Mar 19 18:07:06 crc kubenswrapper[4918]: I0319 18:07:06.810084 4918 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_e4118384-38ad-465d-a81e-62bf39cc6cec/container-server/0.log" Mar 19 18:07:06 crc kubenswrapper[4918]: I0319 18:07:06.911637 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e4118384-38ad-465d-a81e-62bf39cc6cec/container-replicator/0.log" Mar 19 18:07:07 crc kubenswrapper[4918]: I0319 18:07:07.750311 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mdfws" event={"ID":"89dcb70d-1e51-4367-9470-5b6701016baf","Type":"ContainerStarted","Data":"0b1b3e76b7550a3ec795c968191075cfa53a4a9a4e5154c3391eb14494f47541"} Mar 19 18:07:07 crc kubenswrapper[4918]: I0319 18:07:07.872562 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e4118384-38ad-465d-a81e-62bf39cc6cec/object-replicator/0.log" Mar 19 18:07:07 crc kubenswrapper[4918]: I0319 18:07:07.884216 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e4118384-38ad-465d-a81e-62bf39cc6cec/object-auditor/0.log" Mar 19 18:07:07 crc kubenswrapper[4918]: I0319 18:07:07.906672 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e4118384-38ad-465d-a81e-62bf39cc6cec/object-expirer/0.log" Mar 19 18:07:08 crc kubenswrapper[4918]: I0319 18:07:08.010464 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e4118384-38ad-465d-a81e-62bf39cc6cec/container-updater/0.log" Mar 19 18:07:08 crc kubenswrapper[4918]: I0319 18:07:08.121197 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e4118384-38ad-465d-a81e-62bf39cc6cec/object-server/0.log" Mar 19 18:07:08 crc kubenswrapper[4918]: I0319 18:07:08.204028 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e4118384-38ad-465d-a81e-62bf39cc6cec/rsync/0.log" Mar 19 18:07:08 crc kubenswrapper[4918]: I0319 18:07:08.356512 4918 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e4118384-38ad-465d-a81e-62bf39cc6cec/object-updater/0.log" Mar 19 18:07:08 crc kubenswrapper[4918]: I0319 18:07:08.379153 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e4118384-38ad-465d-a81e-62bf39cc6cec/swift-recon-cron/0.log" Mar 19 18:07:08 crc kubenswrapper[4918]: I0319 18:07:08.799464 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-mpczc_7aa831f4-171a-406e-b49f-eb422fb34edc/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 18:07:09 crc kubenswrapper[4918]: I0319 18:07:09.296560 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_0548bf8b-39fd-4ee4-95d7-d454a5269a39/test-operator-logs-container/0.log" Mar 19 18:07:09 crc kubenswrapper[4918]: I0319 18:07:09.449462 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_26f468b4-9955-436c-810a-cff9e17a1063/tempest-tests-tempest-tests-runner/0.log" Mar 19 18:07:09 crc kubenswrapper[4918]: I0319 18:07:09.587331 4918 scope.go:117] "RemoveContainer" containerID="9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8" Mar 19 18:07:09 crc kubenswrapper[4918]: E0319 18:07:09.588017 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:07:09 crc kubenswrapper[4918]: I0319 18:07:09.610649 4918 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-dr9cw_e3c5768a-8c3c-4ba5-a6ee-3c4df5f5d0b6/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 18:07:09 crc kubenswrapper[4918]: I0319 18:07:09.769230 4918 generic.go:334] "Generic (PLEG): container finished" podID="89dcb70d-1e51-4367-9470-5b6701016baf" containerID="0b1b3e76b7550a3ec795c968191075cfa53a4a9a4e5154c3391eb14494f47541" exitCode=0 Mar 19 18:07:09 crc kubenswrapper[4918]: I0319 18:07:09.769268 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mdfws" event={"ID":"89dcb70d-1e51-4367-9470-5b6701016baf","Type":"ContainerDied","Data":"0b1b3e76b7550a3ec795c968191075cfa53a4a9a4e5154c3391eb14494f47541"} Mar 19 18:07:10 crc kubenswrapper[4918]: I0319 18:07:10.780706 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mdfws" event={"ID":"89dcb70d-1e51-4367-9470-5b6701016baf","Type":"ContainerStarted","Data":"b1406b2ee9ca26ddbb8d8a6b624ba5fe1f37fa0839e88439ee147c68b5f37693"} Mar 19 18:07:10 crc kubenswrapper[4918]: I0319 18:07:10.801281 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mdfws" podStartSLOduration=3.349397626 podStartE2EDuration="7.801265321s" podCreationTimestamp="2026-03-19 18:07:03 +0000 UTC" firstStartedPulling="2026-03-19 18:07:05.733947066 +0000 UTC m=+5237.856146314" lastFinishedPulling="2026-03-19 18:07:10.185814761 +0000 UTC m=+5242.308014009" observedRunningTime="2026-03-19 18:07:10.795914304 +0000 UTC m=+5242.918113552" watchObservedRunningTime="2026-03-19 18:07:10.801265321 +0000 UTC m=+5242.923464569" Mar 19 18:07:13 crc kubenswrapper[4918]: I0319 18:07:13.597890 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mdfws" Mar 19 18:07:13 crc kubenswrapper[4918]: I0319 18:07:13.598346 4918 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mdfws" Mar 19 18:07:13 crc kubenswrapper[4918]: I0319 18:07:13.652036 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mdfws" Mar 19 18:07:22 crc kubenswrapper[4918]: I0319 18:07:22.586776 4918 scope.go:117] "RemoveContainer" containerID="9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8" Mar 19 18:07:22 crc kubenswrapper[4918]: E0319 18:07:22.587563 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:07:23 crc kubenswrapper[4918]: I0319 18:07:23.149636 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5hmzf"] Mar 19 18:07:23 crc kubenswrapper[4918]: I0319 18:07:23.152089 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5hmzf" Mar 19 18:07:23 crc kubenswrapper[4918]: I0319 18:07:23.162931 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5hmzf"] Mar 19 18:07:23 crc kubenswrapper[4918]: I0319 18:07:23.310750 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dtx4\" (UniqueName: \"kubernetes.io/projected/febc21de-26ff-447f-b57c-e7b4e1f9c45c-kube-api-access-9dtx4\") pod \"redhat-marketplace-5hmzf\" (UID: \"febc21de-26ff-447f-b57c-e7b4e1f9c45c\") " pod="openshift-marketplace/redhat-marketplace-5hmzf" Mar 19 18:07:23 crc kubenswrapper[4918]: I0319 18:07:23.310848 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/febc21de-26ff-447f-b57c-e7b4e1f9c45c-utilities\") pod \"redhat-marketplace-5hmzf\" (UID: \"febc21de-26ff-447f-b57c-e7b4e1f9c45c\") " pod="openshift-marketplace/redhat-marketplace-5hmzf" Mar 19 18:07:23 crc kubenswrapper[4918]: I0319 18:07:23.310991 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/febc21de-26ff-447f-b57c-e7b4e1f9c45c-catalog-content\") pod \"redhat-marketplace-5hmzf\" (UID: \"febc21de-26ff-447f-b57c-e7b4e1f9c45c\") " pod="openshift-marketplace/redhat-marketplace-5hmzf" Mar 19 18:07:23 crc kubenswrapper[4918]: I0319 18:07:23.412921 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/febc21de-26ff-447f-b57c-e7b4e1f9c45c-catalog-content\") pod \"redhat-marketplace-5hmzf\" (UID: \"febc21de-26ff-447f-b57c-e7b4e1f9c45c\") " pod="openshift-marketplace/redhat-marketplace-5hmzf" Mar 19 18:07:23 crc kubenswrapper[4918]: I0319 18:07:23.413250 4918 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9dtx4\" (UniqueName: \"kubernetes.io/projected/febc21de-26ff-447f-b57c-e7b4e1f9c45c-kube-api-access-9dtx4\") pod \"redhat-marketplace-5hmzf\" (UID: \"febc21de-26ff-447f-b57c-e7b4e1f9c45c\") " pod="openshift-marketplace/redhat-marketplace-5hmzf" Mar 19 18:07:23 crc kubenswrapper[4918]: I0319 18:07:23.413429 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/febc21de-26ff-447f-b57c-e7b4e1f9c45c-utilities\") pod \"redhat-marketplace-5hmzf\" (UID: \"febc21de-26ff-447f-b57c-e7b4e1f9c45c\") " pod="openshift-marketplace/redhat-marketplace-5hmzf" Mar 19 18:07:23 crc kubenswrapper[4918]: I0319 18:07:23.413491 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/febc21de-26ff-447f-b57c-e7b4e1f9c45c-catalog-content\") pod \"redhat-marketplace-5hmzf\" (UID: \"febc21de-26ff-447f-b57c-e7b4e1f9c45c\") " pod="openshift-marketplace/redhat-marketplace-5hmzf" Mar 19 18:07:23 crc kubenswrapper[4918]: I0319 18:07:23.414023 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/febc21de-26ff-447f-b57c-e7b4e1f9c45c-utilities\") pod \"redhat-marketplace-5hmzf\" (UID: \"febc21de-26ff-447f-b57c-e7b4e1f9c45c\") " pod="openshift-marketplace/redhat-marketplace-5hmzf" Mar 19 18:07:23 crc kubenswrapper[4918]: I0319 18:07:23.443260 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dtx4\" (UniqueName: \"kubernetes.io/projected/febc21de-26ff-447f-b57c-e7b4e1f9c45c-kube-api-access-9dtx4\") pod \"redhat-marketplace-5hmzf\" (UID: \"febc21de-26ff-447f-b57c-e7b4e1f9c45c\") " pod="openshift-marketplace/redhat-marketplace-5hmzf" Mar 19 18:07:23 crc kubenswrapper[4918]: I0319 18:07:23.485053 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5hmzf" Mar 19 18:07:23 crc kubenswrapper[4918]: I0319 18:07:23.670007 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mdfws" Mar 19 18:07:24 crc kubenswrapper[4918]: I0319 18:07:24.396005 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5hmzf"] Mar 19 18:07:24 crc kubenswrapper[4918]: I0319 18:07:24.981982 4918 generic.go:334] "Generic (PLEG): container finished" podID="febc21de-26ff-447f-b57c-e7b4e1f9c45c" containerID="19ff65c4f6f8fdfe6c325b6bdf8cc12a7106a239c1a1a8ace923481864ce936a" exitCode=0 Mar 19 18:07:24 crc kubenswrapper[4918]: I0319 18:07:24.982203 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hmzf" event={"ID":"febc21de-26ff-447f-b57c-e7b4e1f9c45c","Type":"ContainerDied","Data":"19ff65c4f6f8fdfe6c325b6bdf8cc12a7106a239c1a1a8ace923481864ce936a"} Mar 19 18:07:24 crc kubenswrapper[4918]: I0319 18:07:24.982280 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hmzf" event={"ID":"febc21de-26ff-447f-b57c-e7b4e1f9c45c","Type":"ContainerStarted","Data":"befe5f551531f0308073d9fec595c727fe719ad5121b1df1d223c85edad16a0d"} Mar 19 18:07:25 crc kubenswrapper[4918]: I0319 18:07:25.927107 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mdfws"] Mar 19 18:07:25 crc kubenswrapper[4918]: I0319 18:07:25.927332 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mdfws" podUID="89dcb70d-1e51-4367-9470-5b6701016baf" containerName="registry-server" containerID="cri-o://b1406b2ee9ca26ddbb8d8a6b624ba5fe1f37fa0839e88439ee147c68b5f37693" gracePeriod=2 Mar 19 18:07:27 crc kubenswrapper[4918]: I0319 18:07:27.012559 4918 generic.go:334] "Generic (PLEG): container 
finished" podID="89dcb70d-1e51-4367-9470-5b6701016baf" containerID="b1406b2ee9ca26ddbb8d8a6b624ba5fe1f37fa0839e88439ee147c68b5f37693" exitCode=0 Mar 19 18:07:27 crc kubenswrapper[4918]: I0319 18:07:27.012663 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mdfws" event={"ID":"89dcb70d-1e51-4367-9470-5b6701016baf","Type":"ContainerDied","Data":"b1406b2ee9ca26ddbb8d8a6b624ba5fe1f37fa0839e88439ee147c68b5f37693"} Mar 19 18:07:27 crc kubenswrapper[4918]: I0319 18:07:27.015706 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hmzf" event={"ID":"febc21de-26ff-447f-b57c-e7b4e1f9c45c","Type":"ContainerStarted","Data":"1ba7208f69d3456c87b0d7724274d73bd8d79c36ac75213aafa580d590ba2350"} Mar 19 18:07:27 crc kubenswrapper[4918]: I0319 18:07:27.173011 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mdfws" Mar 19 18:07:27 crc kubenswrapper[4918]: I0319 18:07:27.317164 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89dcb70d-1e51-4367-9470-5b6701016baf-utilities\") pod \"89dcb70d-1e51-4367-9470-5b6701016baf\" (UID: \"89dcb70d-1e51-4367-9470-5b6701016baf\") " Mar 19 18:07:27 crc kubenswrapper[4918]: I0319 18:07:27.317290 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89dcb70d-1e51-4367-9470-5b6701016baf-catalog-content\") pod \"89dcb70d-1e51-4367-9470-5b6701016baf\" (UID: \"89dcb70d-1e51-4367-9470-5b6701016baf\") " Mar 19 18:07:27 crc kubenswrapper[4918]: I0319 18:07:27.317347 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cksq4\" (UniqueName: \"kubernetes.io/projected/89dcb70d-1e51-4367-9470-5b6701016baf-kube-api-access-cksq4\") pod 
\"89dcb70d-1e51-4367-9470-5b6701016baf\" (UID: \"89dcb70d-1e51-4367-9470-5b6701016baf\") " Mar 19 18:07:27 crc kubenswrapper[4918]: I0319 18:07:27.319458 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89dcb70d-1e51-4367-9470-5b6701016baf-utilities" (OuterVolumeSpecName: "utilities") pod "89dcb70d-1e51-4367-9470-5b6701016baf" (UID: "89dcb70d-1e51-4367-9470-5b6701016baf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:07:27 crc kubenswrapper[4918]: I0319 18:07:27.330591 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89dcb70d-1e51-4367-9470-5b6701016baf-kube-api-access-cksq4" (OuterVolumeSpecName: "kube-api-access-cksq4") pod "89dcb70d-1e51-4367-9470-5b6701016baf" (UID: "89dcb70d-1e51-4367-9470-5b6701016baf"). InnerVolumeSpecName "kube-api-access-cksq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:07:27 crc kubenswrapper[4918]: I0319 18:07:27.373121 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89dcb70d-1e51-4367-9470-5b6701016baf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89dcb70d-1e51-4367-9470-5b6701016baf" (UID: "89dcb70d-1e51-4367-9470-5b6701016baf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:07:27 crc kubenswrapper[4918]: I0319 18:07:27.419727 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89dcb70d-1e51-4367-9470-5b6701016baf-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 18:07:27 crc kubenswrapper[4918]: I0319 18:07:27.419761 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89dcb70d-1e51-4367-9470-5b6701016baf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 18:07:27 crc kubenswrapper[4918]: I0319 18:07:27.419773 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cksq4\" (UniqueName: \"kubernetes.io/projected/89dcb70d-1e51-4367-9470-5b6701016baf-kube-api-access-cksq4\") on node \"crc\" DevicePath \"\"" Mar 19 18:07:28 crc kubenswrapper[4918]: I0319 18:07:28.024331 4918 generic.go:334] "Generic (PLEG): container finished" podID="febc21de-26ff-447f-b57c-e7b4e1f9c45c" containerID="1ba7208f69d3456c87b0d7724274d73bd8d79c36ac75213aafa580d590ba2350" exitCode=0 Mar 19 18:07:28 crc kubenswrapper[4918]: I0319 18:07:28.024648 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hmzf" event={"ID":"febc21de-26ff-447f-b57c-e7b4e1f9c45c","Type":"ContainerDied","Data":"1ba7208f69d3456c87b0d7724274d73bd8d79c36ac75213aafa580d590ba2350"} Mar 19 18:07:28 crc kubenswrapper[4918]: I0319 18:07:28.027497 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mdfws" event={"ID":"89dcb70d-1e51-4367-9470-5b6701016baf","Type":"ContainerDied","Data":"50bf1f569c2315f9a38ad6fcda8d807ace0ecd14b689cb5eda54c2d1b1f2d0d7"} Mar 19 18:07:28 crc kubenswrapper[4918]: I0319 18:07:28.027557 4918 scope.go:117] "RemoveContainer" containerID="b1406b2ee9ca26ddbb8d8a6b624ba5fe1f37fa0839e88439ee147c68b5f37693" Mar 19 18:07:28 crc kubenswrapper[4918]: I0319 
18:07:28.027578 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mdfws" Mar 19 18:07:28 crc kubenswrapper[4918]: I0319 18:07:28.050865 4918 scope.go:117] "RemoveContainer" containerID="0b1b3e76b7550a3ec795c968191075cfa53a4a9a4e5154c3391eb14494f47541" Mar 19 18:07:28 crc kubenswrapper[4918]: I0319 18:07:28.078291 4918 scope.go:117] "RemoveContainer" containerID="0698cea653d581e8dc4db0f591a377f6ddbf3c6db5cf0b4fdd8de9586926d999" Mar 19 18:07:28 crc kubenswrapper[4918]: I0319 18:07:28.093581 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mdfws"] Mar 19 18:07:28 crc kubenswrapper[4918]: I0319 18:07:28.108666 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mdfws"] Mar 19 18:07:28 crc kubenswrapper[4918]: I0319 18:07:28.597096 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89dcb70d-1e51-4367-9470-5b6701016baf" path="/var/lib/kubelet/pods/89dcb70d-1e51-4367-9470-5b6701016baf/volumes" Mar 19 18:07:29 crc kubenswrapper[4918]: I0319 18:07:29.040940 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hmzf" event={"ID":"febc21de-26ff-447f-b57c-e7b4e1f9c45c","Type":"ContainerStarted","Data":"42eb32fc37b2aedc55100482d5efdb542b9225df35654a797c8989582cd9201a"} Mar 19 18:07:29 crc kubenswrapper[4918]: I0319 18:07:29.074892 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5hmzf" podStartSLOduration=2.545760246 podStartE2EDuration="6.074872502s" podCreationTimestamp="2026-03-19 18:07:23 +0000 UTC" firstStartedPulling="2026-03-19 18:07:24.984835099 +0000 UTC m=+5257.107034347" lastFinishedPulling="2026-03-19 18:07:28.513947365 +0000 UTC m=+5260.636146603" observedRunningTime="2026-03-19 18:07:29.060626032 +0000 UTC m=+5261.182825280" 
watchObservedRunningTime="2026-03-19 18:07:29.074872502 +0000 UTC m=+5261.197071750" Mar 19 18:07:33 crc kubenswrapper[4918]: I0319 18:07:33.485317 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5hmzf" Mar 19 18:07:33 crc kubenswrapper[4918]: I0319 18:07:33.487293 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5hmzf" Mar 19 18:07:33 crc kubenswrapper[4918]: I0319 18:07:33.538639 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5hmzf" Mar 19 18:07:33 crc kubenswrapper[4918]: I0319 18:07:33.587271 4918 scope.go:117] "RemoveContainer" containerID="9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8" Mar 19 18:07:33 crc kubenswrapper[4918]: E0319 18:07:33.587726 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:07:34 crc kubenswrapper[4918]: I0319 18:07:34.172301 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5hmzf" Mar 19 18:07:34 crc kubenswrapper[4918]: I0319 18:07:34.726243 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5hmzf"] Mar 19 18:07:36 crc kubenswrapper[4918]: I0319 18:07:36.111360 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5hmzf" podUID="febc21de-26ff-447f-b57c-e7b4e1f9c45c" containerName="registry-server" 
containerID="cri-o://42eb32fc37b2aedc55100482d5efdb542b9225df35654a797c8989582cd9201a" gracePeriod=2 Mar 19 18:07:37 crc kubenswrapper[4918]: I0319 18:07:37.123650 4918 generic.go:334] "Generic (PLEG): container finished" podID="febc21de-26ff-447f-b57c-e7b4e1f9c45c" containerID="42eb32fc37b2aedc55100482d5efdb542b9225df35654a797c8989582cd9201a" exitCode=0 Mar 19 18:07:37 crc kubenswrapper[4918]: I0319 18:07:37.123807 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hmzf" event={"ID":"febc21de-26ff-447f-b57c-e7b4e1f9c45c","Type":"ContainerDied","Data":"42eb32fc37b2aedc55100482d5efdb542b9225df35654a797c8989582cd9201a"} Mar 19 18:07:38 crc kubenswrapper[4918]: I0319 18:07:38.005359 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5hmzf" Mar 19 18:07:38 crc kubenswrapper[4918]: I0319 18:07:38.133688 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hmzf" event={"ID":"febc21de-26ff-447f-b57c-e7b4e1f9c45c","Type":"ContainerDied","Data":"befe5f551531f0308073d9fec595c727fe719ad5121b1df1d223c85edad16a0d"} Mar 19 18:07:38 crc kubenswrapper[4918]: I0319 18:07:38.133747 4918 scope.go:117] "RemoveContainer" containerID="42eb32fc37b2aedc55100482d5efdb542b9225df35654a797c8989582cd9201a" Mar 19 18:07:38 crc kubenswrapper[4918]: I0319 18:07:38.133920 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5hmzf" Mar 19 18:07:38 crc kubenswrapper[4918]: I0319 18:07:38.142305 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/febc21de-26ff-447f-b57c-e7b4e1f9c45c-catalog-content\") pod \"febc21de-26ff-447f-b57c-e7b4e1f9c45c\" (UID: \"febc21de-26ff-447f-b57c-e7b4e1f9c45c\") " Mar 19 18:07:38 crc kubenswrapper[4918]: I0319 18:07:38.143896 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/febc21de-26ff-447f-b57c-e7b4e1f9c45c-utilities\") pod \"febc21de-26ff-447f-b57c-e7b4e1f9c45c\" (UID: \"febc21de-26ff-447f-b57c-e7b4e1f9c45c\") " Mar 19 18:07:38 crc kubenswrapper[4918]: I0319 18:07:38.143994 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dtx4\" (UniqueName: \"kubernetes.io/projected/febc21de-26ff-447f-b57c-e7b4e1f9c45c-kube-api-access-9dtx4\") pod \"febc21de-26ff-447f-b57c-e7b4e1f9c45c\" (UID: \"febc21de-26ff-447f-b57c-e7b4e1f9c45c\") " Mar 19 18:07:38 crc kubenswrapper[4918]: I0319 18:07:38.145410 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/febc21de-26ff-447f-b57c-e7b4e1f9c45c-utilities" (OuterVolumeSpecName: "utilities") pod "febc21de-26ff-447f-b57c-e7b4e1f9c45c" (UID: "febc21de-26ff-447f-b57c-e7b4e1f9c45c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:07:38 crc kubenswrapper[4918]: I0319 18:07:38.173976 4918 scope.go:117] "RemoveContainer" containerID="1ba7208f69d3456c87b0d7724274d73bd8d79c36ac75213aafa580d590ba2350" Mar 19 18:07:38 crc kubenswrapper[4918]: I0319 18:07:38.195240 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/febc21de-26ff-447f-b57c-e7b4e1f9c45c-kube-api-access-9dtx4" (OuterVolumeSpecName: "kube-api-access-9dtx4") pod "febc21de-26ff-447f-b57c-e7b4e1f9c45c" (UID: "febc21de-26ff-447f-b57c-e7b4e1f9c45c"). InnerVolumeSpecName "kube-api-access-9dtx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:07:38 crc kubenswrapper[4918]: I0319 18:07:38.212810 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/febc21de-26ff-447f-b57c-e7b4e1f9c45c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "febc21de-26ff-447f-b57c-e7b4e1f9c45c" (UID: "febc21de-26ff-447f-b57c-e7b4e1f9c45c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:07:38 crc kubenswrapper[4918]: I0319 18:07:38.237050 4918 scope.go:117] "RemoveContainer" containerID="19ff65c4f6f8fdfe6c325b6bdf8cc12a7106a239c1a1a8ace923481864ce936a" Mar 19 18:07:38 crc kubenswrapper[4918]: I0319 18:07:38.251034 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/febc21de-26ff-447f-b57c-e7b4e1f9c45c-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 18:07:38 crc kubenswrapper[4918]: I0319 18:07:38.251278 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dtx4\" (UniqueName: \"kubernetes.io/projected/febc21de-26ff-447f-b57c-e7b4e1f9c45c-kube-api-access-9dtx4\") on node \"crc\" DevicePath \"\"" Mar 19 18:07:38 crc kubenswrapper[4918]: I0319 18:07:38.251290 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/febc21de-26ff-447f-b57c-e7b4e1f9c45c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 18:07:38 crc kubenswrapper[4918]: I0319 18:07:38.471089 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5hmzf"] Mar 19 18:07:38 crc kubenswrapper[4918]: I0319 18:07:38.481902 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5hmzf"] Mar 19 18:07:38 crc kubenswrapper[4918]: I0319 18:07:38.598548 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="febc21de-26ff-447f-b57c-e7b4e1f9c45c" path="/var/lib/kubelet/pods/febc21de-26ff-447f-b57c-e7b4e1f9c45c/volumes" Mar 19 18:07:47 crc kubenswrapper[4918]: I0319 18:07:47.586758 4918 scope.go:117] "RemoveContainer" containerID="9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8" Mar 19 18:07:47 crc kubenswrapper[4918]: E0319 18:07:47.587506 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:08:00 crc kubenswrapper[4918]: I0319 18:08:00.147437 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565728-4mb6s"] Mar 19 18:08:00 crc kubenswrapper[4918]: E0319 18:08:00.148226 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89dcb70d-1e51-4367-9470-5b6701016baf" containerName="registry-server" Mar 19 18:08:00 crc kubenswrapper[4918]: I0319 18:08:00.148237 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="89dcb70d-1e51-4367-9470-5b6701016baf" containerName="registry-server" Mar 19 18:08:00 crc kubenswrapper[4918]: E0319 18:08:00.148246 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89dcb70d-1e51-4367-9470-5b6701016baf" containerName="extract-content" Mar 19 18:08:00 crc kubenswrapper[4918]: I0319 18:08:00.148252 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="89dcb70d-1e51-4367-9470-5b6701016baf" containerName="extract-content" Mar 19 18:08:00 crc kubenswrapper[4918]: E0319 18:08:00.148266 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="febc21de-26ff-447f-b57c-e7b4e1f9c45c" containerName="extract-utilities" Mar 19 18:08:00 crc kubenswrapper[4918]: I0319 18:08:00.148272 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="febc21de-26ff-447f-b57c-e7b4e1f9c45c" containerName="extract-utilities" Mar 19 18:08:00 crc kubenswrapper[4918]: E0319 18:08:00.148287 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="febc21de-26ff-447f-b57c-e7b4e1f9c45c" containerName="extract-content" Mar 19 18:08:00 crc kubenswrapper[4918]: I0319 18:08:00.148292 4918 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="febc21de-26ff-447f-b57c-e7b4e1f9c45c" containerName="extract-content" Mar 19 18:08:00 crc kubenswrapper[4918]: E0319 18:08:00.148303 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="febc21de-26ff-447f-b57c-e7b4e1f9c45c" containerName="registry-server" Mar 19 18:08:00 crc kubenswrapper[4918]: I0319 18:08:00.148309 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="febc21de-26ff-447f-b57c-e7b4e1f9c45c" containerName="registry-server" Mar 19 18:08:00 crc kubenswrapper[4918]: E0319 18:08:00.148333 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89dcb70d-1e51-4367-9470-5b6701016baf" containerName="extract-utilities" Mar 19 18:08:00 crc kubenswrapper[4918]: I0319 18:08:00.148339 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="89dcb70d-1e51-4367-9470-5b6701016baf" containerName="extract-utilities" Mar 19 18:08:00 crc kubenswrapper[4918]: I0319 18:08:00.148505 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="89dcb70d-1e51-4367-9470-5b6701016baf" containerName="registry-server" Mar 19 18:08:00 crc kubenswrapper[4918]: I0319 18:08:00.148538 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="febc21de-26ff-447f-b57c-e7b4e1f9c45c" containerName="registry-server" Mar 19 18:08:00 crc kubenswrapper[4918]: I0319 18:08:00.149219 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565728-4mb6s" Mar 19 18:08:00 crc kubenswrapper[4918]: I0319 18:08:00.157257 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 18:08:00 crc kubenswrapper[4918]: I0319 18:08:00.157680 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 18:08:00 crc kubenswrapper[4918]: I0319 18:08:00.159605 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 18:08:00 crc kubenswrapper[4918]: I0319 18:08:00.194157 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565728-4mb6s"] Mar 19 18:08:00 crc kubenswrapper[4918]: I0319 18:08:00.305379 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxmsb\" (UniqueName: \"kubernetes.io/projected/7f160e5d-a89e-484d-b11c-7095a366b452-kube-api-access-rxmsb\") pod \"auto-csr-approver-29565728-4mb6s\" (UID: \"7f160e5d-a89e-484d-b11c-7095a366b452\") " pod="openshift-infra/auto-csr-approver-29565728-4mb6s" Mar 19 18:08:00 crc kubenswrapper[4918]: I0319 18:08:00.406970 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxmsb\" (UniqueName: \"kubernetes.io/projected/7f160e5d-a89e-484d-b11c-7095a366b452-kube-api-access-rxmsb\") pod \"auto-csr-approver-29565728-4mb6s\" (UID: \"7f160e5d-a89e-484d-b11c-7095a366b452\") " pod="openshift-infra/auto-csr-approver-29565728-4mb6s" Mar 19 18:08:00 crc kubenswrapper[4918]: I0319 18:08:00.435359 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxmsb\" (UniqueName: \"kubernetes.io/projected/7f160e5d-a89e-484d-b11c-7095a366b452-kube-api-access-rxmsb\") pod \"auto-csr-approver-29565728-4mb6s\" (UID: \"7f160e5d-a89e-484d-b11c-7095a366b452\") " 
pod="openshift-infra/auto-csr-approver-29565728-4mb6s" Mar 19 18:08:00 crc kubenswrapper[4918]: I0319 18:08:00.470107 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565728-4mb6s" Mar 19 18:08:01 crc kubenswrapper[4918]: I0319 18:08:01.521105 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565728-4mb6s"] Mar 19 18:08:01 crc kubenswrapper[4918]: W0319 18:08:01.604822 4918 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f160e5d_a89e_484d_b11c_7095a366b452.slice/crio-aed9c3444477798ba2bbc3545fcbea1cd3641c3f585febc8c978e5be7dd9dbd5 WatchSource:0}: Error finding container aed9c3444477798ba2bbc3545fcbea1cd3641c3f585febc8c978e5be7dd9dbd5: Status 404 returned error can't find the container with id aed9c3444477798ba2bbc3545fcbea1cd3641c3f585febc8c978e5be7dd9dbd5 Mar 19 18:08:02 crc kubenswrapper[4918]: I0319 18:08:02.586982 4918 scope.go:117] "RemoveContainer" containerID="9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8" Mar 19 18:08:02 crc kubenswrapper[4918]: E0319 18:08:02.609258 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:08:02 crc kubenswrapper[4918]: I0319 18:08:02.652619 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565728-4mb6s" event={"ID":"7f160e5d-a89e-484d-b11c-7095a366b452","Type":"ContainerStarted","Data":"aed9c3444477798ba2bbc3545fcbea1cd3641c3f585febc8c978e5be7dd9dbd5"} Mar 19 18:08:05 crc 
kubenswrapper[4918]: I0319 18:08:05.718651 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w_e13a0a8b-c7bc-4065-9a26-86034a00f0ac/util/0.log" Mar 19 18:08:06 crc kubenswrapper[4918]: I0319 18:08:06.218647 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w_e13a0a8b-c7bc-4065-9a26-86034a00f0ac/util/0.log" Mar 19 18:08:06 crc kubenswrapper[4918]: I0319 18:08:06.244192 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w_e13a0a8b-c7bc-4065-9a26-86034a00f0ac/pull/0.log" Mar 19 18:08:06 crc kubenswrapper[4918]: I0319 18:08:06.424429 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w_e13a0a8b-c7bc-4065-9a26-86034a00f0ac/pull/0.log" Mar 19 18:08:06 crc kubenswrapper[4918]: I0319 18:08:06.744413 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w_e13a0a8b-c7bc-4065-9a26-86034a00f0ac/pull/0.log" Mar 19 18:08:06 crc kubenswrapper[4918]: I0319 18:08:06.794486 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w_e13a0a8b-c7bc-4065-9a26-86034a00f0ac/util/0.log" Mar 19 18:08:06 crc kubenswrapper[4918]: I0319 18:08:06.854359 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1f91b0eb7ee7a9999c149511f0710d33dfe6ba4b4ba3bfc4ac72f12c14dp59w_e13a0a8b-c7bc-4065-9a26-86034a00f0ac/extract/0.log" Mar 19 18:08:07 crc kubenswrapper[4918]: I0319 18:08:07.353253 4918 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-wg98k_4ed700e4-0a35-4c6c-b57a-cde49d5f816c/manager/0.log" Mar 19 18:08:07 crc kubenswrapper[4918]: I0319 18:08:07.674070 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-w78t5_e9f74e44-e78b-4b23-b409-89af31c2dc82/manager/0.log" Mar 19 18:08:08 crc kubenswrapper[4918]: I0319 18:08:08.107485 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-ql64p_cf3083b2-86ce-4d01-97c5-9005f683ff62/manager/0.log" Mar 19 18:08:08 crc kubenswrapper[4918]: I0319 18:08:08.480495 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-k2f4b_e8c75c9e-0913-485c-a5fe-9c9bf6e4bc53/manager/0.log" Mar 19 18:08:08 crc kubenswrapper[4918]: I0319 18:08:08.556089 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-xl6zs_2644d9c5-c386-4d63-9cf9-7517f4fd6cb0/manager/0.log" Mar 19 18:08:08 crc kubenswrapper[4918]: I0319 18:08:08.966826 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-nmkqj_b059bb35-4870-4982-b62f-e70ffd0270d2/manager/0.log" Mar 19 18:08:09 crc kubenswrapper[4918]: I0319 18:08:09.420666 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-8tqxl_64edd3c9-61ea-4fc0-9a74-95a27c4bffc9/manager/0.log" Mar 19 18:08:09 crc kubenswrapper[4918]: I0319 18:08:09.523188 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-msx8p_3d624150-7673-43db-b503-ec532c7c00ca/manager/0.log" Mar 19 18:08:09 crc kubenswrapper[4918]: I0319 18:08:09.703847 4918 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-8b6nq_b8633109-a56d-4535-b603-f75c257cb093/manager/0.log" Mar 19 18:08:09 crc kubenswrapper[4918]: I0319 18:08:09.849404 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-sgwx5_34d2b03c-e63e-425a-a48a-6c9c97508add/manager/0.log" Mar 19 18:08:10 crc kubenswrapper[4918]: I0319 18:08:10.205420 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-5dbck_c2eafc06-6df4-440d-820f-aad17b6061d7/manager/0.log" Mar 19 18:08:10 crc kubenswrapper[4918]: I0319 18:08:10.258769 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-rkhtw_230477fa-ce49-4e4d-a0a0-5bf2538c5192/manager/0.log" Mar 19 18:08:10 crc kubenswrapper[4918]: I0319 18:08:10.574851 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-26r58_7be38652-3021-4349-be08-4759ee13141b/manager/0.log" Mar 19 18:08:10 crc kubenswrapper[4918]: I0319 18:08:10.595776 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-w7rjd_d9f56510-e25f-4de5-85b6-3030e989d13d/manager/0.log" Mar 19 18:08:10 crc kubenswrapper[4918]: I0319 18:08:10.906424 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-c98pr_d06a3c13-6323-4fef-9aec-101be98e242b/manager/0.log" Mar 19 18:08:11 crc kubenswrapper[4918]: I0319 18:08:11.084329 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-7658474f4d-mb9mz_13ca5024-984f-45b0-8235-f41f91664ef9/operator/0.log" Mar 19 18:08:11 crc kubenswrapper[4918]: I0319 18:08:11.387698 4918 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-qg8xl_673a75db-e104-419e-921d-99ba515af652/registry-server/0.log" Mar 19 18:08:11 crc kubenswrapper[4918]: I0319 18:08:11.496825 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-svspw_85790727-be18-4730-81e1-84022d4cead2/manager/0.log" Mar 19 18:08:11 crc kubenswrapper[4918]: I0319 18:08:11.788967 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-8sxgk_f0824526-4903-49f0-bfcb-17298cc84eb6/manager/0.log" Mar 19 18:08:12 crc kubenswrapper[4918]: I0319 18:08:12.049493 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6c7d9f85c5-vtcc8_2f1da636-ea7f-4828-b896-ec1c81c92623/manager/0.log" Mar 19 18:08:12 crc kubenswrapper[4918]: I0319 18:08:12.105593 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-smt77_fea29376-0fd1-419c-ae47-68b1c7a355e3/operator/0.log" Mar 19 18:08:12 crc kubenswrapper[4918]: I0319 18:08:12.284711 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-6jtnt_6d031265-e265-412e-931e-52ca3bb940b6/manager/0.log" Mar 19 18:08:13 crc kubenswrapper[4918]: I0319 18:08:13.103253 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-fv2wb_e54e9cfd-b6fe-4b00-a12f-20b153f05710/manager/0.log" Mar 19 18:08:13 crc kubenswrapper[4918]: I0319 18:08:13.112976 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-9w6lf_35067b62-32eb-4cb2-8fbd-91b82c4a38cb/manager/0.log" Mar 19 18:08:13 crc kubenswrapper[4918]: I0319 18:08:13.421274 4918 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-78877dc965-4vmdw_c77d9bca-3548-4c60-aa31-ad1a70dac2f1/manager/0.log" Mar 19 18:08:13 crc kubenswrapper[4918]: I0319 18:08:13.587760 4918 scope.go:117] "RemoveContainer" containerID="9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8" Mar 19 18:08:13 crc kubenswrapper[4918]: E0319 18:08:13.588185 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:08:25 crc kubenswrapper[4918]: I0319 18:08:25.816622 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jdxtx"] Mar 19 18:08:25 crc kubenswrapper[4918]: I0319 18:08:25.819559 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jdxtx" Mar 19 18:08:25 crc kubenswrapper[4918]: I0319 18:08:25.832138 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jdxtx"] Mar 19 18:08:25 crc kubenswrapper[4918]: I0319 18:08:25.893550 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78lbn\" (UniqueName: \"kubernetes.io/projected/2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4-kube-api-access-78lbn\") pod \"certified-operators-jdxtx\" (UID: \"2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4\") " pod="openshift-marketplace/certified-operators-jdxtx" Mar 19 18:08:25 crc kubenswrapper[4918]: I0319 18:08:25.893598 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4-utilities\") pod \"certified-operators-jdxtx\" (UID: \"2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4\") " pod="openshift-marketplace/certified-operators-jdxtx" Mar 19 18:08:25 crc kubenswrapper[4918]: I0319 18:08:25.893675 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4-catalog-content\") pod \"certified-operators-jdxtx\" (UID: \"2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4\") " pod="openshift-marketplace/certified-operators-jdxtx" Mar 19 18:08:25 crc kubenswrapper[4918]: I0319 18:08:25.995429 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78lbn\" (UniqueName: \"kubernetes.io/projected/2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4-kube-api-access-78lbn\") pod \"certified-operators-jdxtx\" (UID: \"2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4\") " pod="openshift-marketplace/certified-operators-jdxtx" Mar 19 18:08:25 crc kubenswrapper[4918]: I0319 18:08:25.995473 4918 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4-utilities\") pod \"certified-operators-jdxtx\" (UID: \"2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4\") " pod="openshift-marketplace/certified-operators-jdxtx" Mar 19 18:08:25 crc kubenswrapper[4918]: I0319 18:08:25.995576 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4-catalog-content\") pod \"certified-operators-jdxtx\" (UID: \"2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4\") " pod="openshift-marketplace/certified-operators-jdxtx" Mar 19 18:08:25 crc kubenswrapper[4918]: I0319 18:08:25.996012 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4-utilities\") pod \"certified-operators-jdxtx\" (UID: \"2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4\") " pod="openshift-marketplace/certified-operators-jdxtx" Mar 19 18:08:25 crc kubenswrapper[4918]: I0319 18:08:25.996012 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4-catalog-content\") pod \"certified-operators-jdxtx\" (UID: \"2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4\") " pod="openshift-marketplace/certified-operators-jdxtx" Mar 19 18:08:26 crc kubenswrapper[4918]: I0319 18:08:26.015298 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78lbn\" (UniqueName: \"kubernetes.io/projected/2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4-kube-api-access-78lbn\") pod \"certified-operators-jdxtx\" (UID: \"2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4\") " pod="openshift-marketplace/certified-operators-jdxtx" Mar 19 18:08:26 crc kubenswrapper[4918]: I0319 18:08:26.141598 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jdxtx" Mar 19 18:08:26 crc kubenswrapper[4918]: I0319 18:08:26.588868 4918 scope.go:117] "RemoveContainer" containerID="9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8" Mar 19 18:08:26 crc kubenswrapper[4918]: E0319 18:08:26.589670 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:08:26 crc kubenswrapper[4918]: I0319 18:08:26.865372 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jdxtx"] Mar 19 18:08:27 crc kubenswrapper[4918]: I0319 18:08:27.880022 4918 generic.go:334] "Generic (PLEG): container finished" podID="2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4" containerID="e54c6f0fc297e85563463326663d578fca5101afdfb8a6ca59ccf03273d657b8" exitCode=0 Mar 19 18:08:27 crc kubenswrapper[4918]: I0319 18:08:27.880087 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdxtx" event={"ID":"2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4","Type":"ContainerDied","Data":"e54c6f0fc297e85563463326663d578fca5101afdfb8a6ca59ccf03273d657b8"} Mar 19 18:08:27 crc kubenswrapper[4918]: I0319 18:08:27.880451 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdxtx" event={"ID":"2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4","Type":"ContainerStarted","Data":"78376246dd76cb7b9cd7bafd13c1a32c363f0e2f53b3fec130b8016d18d56db9"} Mar 19 18:08:28 crc kubenswrapper[4918]: I0319 18:08:28.890560 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdxtx" 
event={"ID":"2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4","Type":"ContainerStarted","Data":"6ab15e053f1f0d81b480b39a7776e24110679f31ed03f5f47aa43d976c3251cc"} Mar 19 18:08:30 crc kubenswrapper[4918]: I0319 18:08:30.908247 4918 generic.go:334] "Generic (PLEG): container finished" podID="2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4" containerID="6ab15e053f1f0d81b480b39a7776e24110679f31ed03f5f47aa43d976c3251cc" exitCode=0 Mar 19 18:08:30 crc kubenswrapper[4918]: I0319 18:08:30.908366 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdxtx" event={"ID":"2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4","Type":"ContainerDied","Data":"6ab15e053f1f0d81b480b39a7776e24110679f31ed03f5f47aa43d976c3251cc"} Mar 19 18:08:31 crc kubenswrapper[4918]: I0319 18:08:31.944657 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdxtx" event={"ID":"2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4","Type":"ContainerStarted","Data":"56608fed6e559a6c9283df8311ad61421d35430871463d942b8bac300d4f80d7"} Mar 19 18:08:31 crc kubenswrapper[4918]: I0319 18:08:31.977547 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jdxtx" podStartSLOduration=3.351443365 podStartE2EDuration="6.977529799s" podCreationTimestamp="2026-03-19 18:08:25 +0000 UTC" firstStartedPulling="2026-03-19 18:08:27.882157129 +0000 UTC m=+5320.004356377" lastFinishedPulling="2026-03-19 18:08:31.508243563 +0000 UTC m=+5323.630442811" observedRunningTime="2026-03-19 18:08:31.973491828 +0000 UTC m=+5324.095691086" watchObservedRunningTime="2026-03-19 18:08:31.977529799 +0000 UTC m=+5324.099729047" Mar 19 18:08:34 crc kubenswrapper[4918]: I0319 18:08:34.001722 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565728-4mb6s" 
event={"ID":"7f160e5d-a89e-484d-b11c-7095a366b452","Type":"ContainerStarted","Data":"c4bf4f429cb4a5ef6b12bc572841418261932194d696b0272b165b33c915539f"} Mar 19 18:08:34 crc kubenswrapper[4918]: I0319 18:08:34.018868 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565728-4mb6s" podStartSLOduration=2.025907269 podStartE2EDuration="34.018854389s" podCreationTimestamp="2026-03-19 18:08:00 +0000 UTC" firstStartedPulling="2026-03-19 18:08:01.610395526 +0000 UTC m=+5293.732594774" lastFinishedPulling="2026-03-19 18:08:33.603342646 +0000 UTC m=+5325.725541894" observedRunningTime="2026-03-19 18:08:34.015911899 +0000 UTC m=+5326.138111147" watchObservedRunningTime="2026-03-19 18:08:34.018854389 +0000 UTC m=+5326.141053637" Mar 19 18:08:35 crc kubenswrapper[4918]: I0319 18:08:35.013536 4918 generic.go:334] "Generic (PLEG): container finished" podID="7f160e5d-a89e-484d-b11c-7095a366b452" containerID="c4bf4f429cb4a5ef6b12bc572841418261932194d696b0272b165b33c915539f" exitCode=0 Mar 19 18:08:35 crc kubenswrapper[4918]: I0319 18:08:35.013745 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565728-4mb6s" event={"ID":"7f160e5d-a89e-484d-b11c-7095a366b452","Type":"ContainerDied","Data":"c4bf4f429cb4a5ef6b12bc572841418261932194d696b0272b165b33c915539f"} Mar 19 18:08:36 crc kubenswrapper[4918]: I0319 18:08:36.141735 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jdxtx" Mar 19 18:08:36 crc kubenswrapper[4918]: I0319 18:08:36.142076 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jdxtx" Mar 19 18:08:36 crc kubenswrapper[4918]: I0319 18:08:36.204720 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jdxtx" Mar 19 18:08:37 crc kubenswrapper[4918]: I0319 18:08:37.189887 
4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jdxtx" Mar 19 18:08:37 crc kubenswrapper[4918]: I0319 18:08:37.201040 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565728-4mb6s" Mar 19 18:08:37 crc kubenswrapper[4918]: I0319 18:08:37.259492 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jdxtx"] Mar 19 18:08:37 crc kubenswrapper[4918]: I0319 18:08:37.328181 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxmsb\" (UniqueName: \"kubernetes.io/projected/7f160e5d-a89e-484d-b11c-7095a366b452-kube-api-access-rxmsb\") pod \"7f160e5d-a89e-484d-b11c-7095a366b452\" (UID: \"7f160e5d-a89e-484d-b11c-7095a366b452\") " Mar 19 18:08:37 crc kubenswrapper[4918]: I0319 18:08:37.333795 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f160e5d-a89e-484d-b11c-7095a366b452-kube-api-access-rxmsb" (OuterVolumeSpecName: "kube-api-access-rxmsb") pod "7f160e5d-a89e-484d-b11c-7095a366b452" (UID: "7f160e5d-a89e-484d-b11c-7095a366b452"). InnerVolumeSpecName "kube-api-access-rxmsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:08:37 crc kubenswrapper[4918]: I0319 18:08:37.430680 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxmsb\" (UniqueName: \"kubernetes.io/projected/7f160e5d-a89e-484d-b11c-7095a366b452-kube-api-access-rxmsb\") on node \"crc\" DevicePath \"\"" Mar 19 18:08:38 crc kubenswrapper[4918]: I0319 18:08:38.087531 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565728-4mb6s" Mar 19 18:08:38 crc kubenswrapper[4918]: I0319 18:08:38.087955 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565728-4mb6s" event={"ID":"7f160e5d-a89e-484d-b11c-7095a366b452","Type":"ContainerDied","Data":"aed9c3444477798ba2bbc3545fcbea1cd3641c3f585febc8c978e5be7dd9dbd5"} Mar 19 18:08:38 crc kubenswrapper[4918]: I0319 18:08:38.087994 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aed9c3444477798ba2bbc3545fcbea1cd3641c3f585febc8c978e5be7dd9dbd5" Mar 19 18:08:38 crc kubenswrapper[4918]: I0319 18:08:38.314711 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565722-6jjkw"] Mar 19 18:08:38 crc kubenswrapper[4918]: I0319 18:08:38.328992 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565722-6jjkw"] Mar 19 18:08:38 crc kubenswrapper[4918]: I0319 18:08:38.597632 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41d3562a-1119-409a-9c26-fd901a6c06ef" path="/var/lib/kubelet/pods/41d3562a-1119-409a-9c26-fd901a6c06ef/volumes" Mar 19 18:08:39 crc kubenswrapper[4918]: I0319 18:08:39.094553 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jdxtx" podUID="2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4" containerName="registry-server" containerID="cri-o://56608fed6e559a6c9283df8311ad61421d35430871463d942b8bac300d4f80d7" gracePeriod=2 Mar 19 18:08:40 crc kubenswrapper[4918]: I0319 18:08:40.104899 4918 generic.go:334] "Generic (PLEG): container finished" podID="2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4" containerID="56608fed6e559a6c9283df8311ad61421d35430871463d942b8bac300d4f80d7" exitCode=0 Mar 19 18:08:40 crc kubenswrapper[4918]: I0319 18:08:40.104980 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-jdxtx" event={"ID":"2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4","Type":"ContainerDied","Data":"56608fed6e559a6c9283df8311ad61421d35430871463d942b8bac300d4f80d7"} Mar 19 18:08:40 crc kubenswrapper[4918]: I0319 18:08:40.300656 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jdxtx" Mar 19 18:08:40 crc kubenswrapper[4918]: I0319 18:08:40.490318 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4-utilities\") pod \"2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4\" (UID: \"2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4\") " Mar 19 18:08:40 crc kubenswrapper[4918]: I0319 18:08:40.490487 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78lbn\" (UniqueName: \"kubernetes.io/projected/2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4-kube-api-access-78lbn\") pod \"2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4\" (UID: \"2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4\") " Mar 19 18:08:40 crc kubenswrapper[4918]: I0319 18:08:40.490575 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4-catalog-content\") pod \"2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4\" (UID: \"2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4\") " Mar 19 18:08:40 crc kubenswrapper[4918]: I0319 18:08:40.490897 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4-utilities" (OuterVolumeSpecName: "utilities") pod "2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4" (UID: "2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:08:40 crc kubenswrapper[4918]: I0319 18:08:40.491294 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 18:08:40 crc kubenswrapper[4918]: I0319 18:08:40.496502 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4-kube-api-access-78lbn" (OuterVolumeSpecName: "kube-api-access-78lbn") pod "2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4" (UID: "2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4"). InnerVolumeSpecName "kube-api-access-78lbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:08:40 crc kubenswrapper[4918]: I0319 18:08:40.543199 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4" (UID: "2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:08:40 crc kubenswrapper[4918]: I0319 18:08:40.593582 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78lbn\" (UniqueName: \"kubernetes.io/projected/2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4-kube-api-access-78lbn\") on node \"crc\" DevicePath \"\"" Mar 19 18:08:40 crc kubenswrapper[4918]: I0319 18:08:40.593619 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 18:08:41 crc kubenswrapper[4918]: I0319 18:08:41.123341 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdxtx" event={"ID":"2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4","Type":"ContainerDied","Data":"78376246dd76cb7b9cd7bafd13c1a32c363f0e2f53b3fec130b8016d18d56db9"} Mar 19 18:08:41 crc kubenswrapper[4918]: I0319 18:08:41.123626 4918 scope.go:117] "RemoveContainer" containerID="56608fed6e559a6c9283df8311ad61421d35430871463d942b8bac300d4f80d7" Mar 19 18:08:41 crc kubenswrapper[4918]: I0319 18:08:41.123795 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jdxtx" Mar 19 18:08:41 crc kubenswrapper[4918]: I0319 18:08:41.148900 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jdxtx"] Mar 19 18:08:41 crc kubenswrapper[4918]: I0319 18:08:41.150980 4918 scope.go:117] "RemoveContainer" containerID="6ab15e053f1f0d81b480b39a7776e24110679f31ed03f5f47aa43d976c3251cc" Mar 19 18:08:41 crc kubenswrapper[4918]: I0319 18:08:41.161240 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jdxtx"] Mar 19 18:08:41 crc kubenswrapper[4918]: I0319 18:08:41.173132 4918 scope.go:117] "RemoveContainer" containerID="e54c6f0fc297e85563463326663d578fca5101afdfb8a6ca59ccf03273d657b8" Mar 19 18:08:41 crc kubenswrapper[4918]: I0319 18:08:41.587061 4918 scope.go:117] "RemoveContainer" containerID="9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8" Mar 19 18:08:41 crc kubenswrapper[4918]: E0319 18:08:41.587397 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:08:42 crc kubenswrapper[4918]: I0319 18:08:42.608974 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4" path="/var/lib/kubelet/pods/2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4/volumes" Mar 19 18:08:55 crc kubenswrapper[4918]: I0319 18:08:55.586782 4918 scope.go:117] "RemoveContainer" containerID="9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8" Mar 19 18:08:55 crc kubenswrapper[4918]: E0319 18:08:55.587443 4918 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:08:57 crc kubenswrapper[4918]: I0319 18:08:57.359787 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-pz8dd_d9b7f6a4-5987-4b92-b063-2ddf9ad42074/kube-rbac-proxy/0.log" Mar 19 18:08:57 crc kubenswrapper[4918]: I0319 18:08:57.426634 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-7lqkl_9ff332c4-5e3a-4d2d-a694-870559724211/control-plane-machine-set-operator/0.log" Mar 19 18:08:57 crc kubenswrapper[4918]: I0319 18:08:57.762427 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-pz8dd_d9b7f6a4-5987-4b92-b063-2ddf9ad42074/machine-api-operator/0.log" Mar 19 18:09:06 crc kubenswrapper[4918]: I0319 18:09:06.587442 4918 scope.go:117] "RemoveContainer" containerID="9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8" Mar 19 18:09:07 crc kubenswrapper[4918]: I0319 18:09:07.380626 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerStarted","Data":"3538084f1003e436351b0534c29b6fa462ec81baa17af0111e7937ea959e446c"} Mar 19 18:09:16 crc kubenswrapper[4918]: I0319 18:09:16.440540 4918 scope.go:117] "RemoveContainer" containerID="58b6dcf9f2195622b45eac9e9772e5792fa15a5cbed9071ed4ef8f9d986aeb78" Mar 19 18:09:23 crc kubenswrapper[4918]: I0319 18:09:23.548306 4918 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-858654f9db-nrk5x_acfafa9a-b5e2-4385-8820-dcc726d38058/cert-manager-controller/0.log" Mar 19 18:09:23 crc kubenswrapper[4918]: I0319 18:09:23.889197 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-4fns8_308372f5-f777-42d6-a57d-76ea6928f45e/cert-manager-cainjector/0.log" Mar 19 18:09:24 crc kubenswrapper[4918]: I0319 18:09:24.144024 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-6frc2_e7570f30-edef-4862-bc00-e72ce89ff640/cert-manager-webhook/0.log" Mar 19 18:09:49 crc kubenswrapper[4918]: I0319 18:09:49.731050 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-jd2lp_6982f526-6a77-4d0c-91e5-cf2714c78706/nmstate-console-plugin/0.log" Mar 19 18:09:49 crc kubenswrapper[4918]: I0319 18:09:49.934529 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-c8b48_f68d263f-a5cd-4f12-ba53-0179e79cff40/nmstate-handler/0.log" Mar 19 18:09:50 crc kubenswrapper[4918]: I0319 18:09:50.019409 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-6rwfd_8394065e-bb9c-483d-b1bb-5f10bd07d0c4/kube-rbac-proxy/0.log" Mar 19 18:09:50 crc kubenswrapper[4918]: I0319 18:09:50.107324 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-6rwfd_8394065e-bb9c-483d-b1bb-5f10bd07d0c4/nmstate-metrics/0.log" Mar 19 18:09:50 crc kubenswrapper[4918]: I0319 18:09:50.420537 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-ngkkz_12300214-878a-4066-9155-654f4a1c6e88/nmstate-operator/0.log" Mar 19 18:09:50 crc kubenswrapper[4918]: I0319 18:09:50.483762 4918 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-g7jgg_3eaea7a7-ad66-4cd2-9678-df63c825a501/nmstate-webhook/0.log" Mar 19 18:10:00 crc kubenswrapper[4918]: I0319 18:10:00.141264 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565730-7ds7j"] Mar 19 18:10:00 crc kubenswrapper[4918]: E0319 18:10:00.142127 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4" containerName="registry-server" Mar 19 18:10:00 crc kubenswrapper[4918]: I0319 18:10:00.142139 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4" containerName="registry-server" Mar 19 18:10:00 crc kubenswrapper[4918]: E0319 18:10:00.142162 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f160e5d-a89e-484d-b11c-7095a366b452" containerName="oc" Mar 19 18:10:00 crc kubenswrapper[4918]: I0319 18:10:00.142168 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f160e5d-a89e-484d-b11c-7095a366b452" containerName="oc" Mar 19 18:10:00 crc kubenswrapper[4918]: E0319 18:10:00.142193 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4" containerName="extract-content" Mar 19 18:10:00 crc kubenswrapper[4918]: I0319 18:10:00.142199 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4" containerName="extract-content" Mar 19 18:10:00 crc kubenswrapper[4918]: E0319 18:10:00.142211 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4" containerName="extract-utilities" Mar 19 18:10:00 crc kubenswrapper[4918]: I0319 18:10:00.142218 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4" containerName="extract-utilities" Mar 19 18:10:00 crc kubenswrapper[4918]: I0319 18:10:00.142405 4918 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2f9ea5bd-3e5b-4089-80cf-0f5d45cf04e4" containerName="registry-server" Mar 19 18:10:00 crc kubenswrapper[4918]: I0319 18:10:00.142416 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f160e5d-a89e-484d-b11c-7095a366b452" containerName="oc" Mar 19 18:10:00 crc kubenswrapper[4918]: I0319 18:10:00.143158 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565730-7ds7j" Mar 19 18:10:00 crc kubenswrapper[4918]: I0319 18:10:00.145287 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 18:10:00 crc kubenswrapper[4918]: I0319 18:10:00.145743 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 18:10:00 crc kubenswrapper[4918]: I0319 18:10:00.146156 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 18:10:00 crc kubenswrapper[4918]: I0319 18:10:00.157824 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565730-7ds7j"] Mar 19 18:10:00 crc kubenswrapper[4918]: I0319 18:10:00.234634 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkgt8\" (UniqueName: \"kubernetes.io/projected/8c006477-ed74-4a21-b0c3-8ebebdb298bf-kube-api-access-qkgt8\") pod \"auto-csr-approver-29565730-7ds7j\" (UID: \"8c006477-ed74-4a21-b0c3-8ebebdb298bf\") " pod="openshift-infra/auto-csr-approver-29565730-7ds7j" Mar 19 18:10:00 crc kubenswrapper[4918]: I0319 18:10:00.336430 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkgt8\" (UniqueName: \"kubernetes.io/projected/8c006477-ed74-4a21-b0c3-8ebebdb298bf-kube-api-access-qkgt8\") pod \"auto-csr-approver-29565730-7ds7j\" (UID: \"8c006477-ed74-4a21-b0c3-8ebebdb298bf\") " 
pod="openshift-infra/auto-csr-approver-29565730-7ds7j" Mar 19 18:10:00 crc kubenswrapper[4918]: I0319 18:10:00.355657 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkgt8\" (UniqueName: \"kubernetes.io/projected/8c006477-ed74-4a21-b0c3-8ebebdb298bf-kube-api-access-qkgt8\") pod \"auto-csr-approver-29565730-7ds7j\" (UID: \"8c006477-ed74-4a21-b0c3-8ebebdb298bf\") " pod="openshift-infra/auto-csr-approver-29565730-7ds7j" Mar 19 18:10:00 crc kubenswrapper[4918]: I0319 18:10:00.459666 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565730-7ds7j" Mar 19 18:10:01 crc kubenswrapper[4918]: I0319 18:10:01.174283 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565730-7ds7j"] Mar 19 18:10:01 crc kubenswrapper[4918]: I0319 18:10:01.852910 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565730-7ds7j" event={"ID":"8c006477-ed74-4a21-b0c3-8ebebdb298bf","Type":"ContainerStarted","Data":"e93936e951723d31627475c62b429e418a0ee0a230777e5fa5c2fb34e86b23d4"} Mar 19 18:10:03 crc kubenswrapper[4918]: I0319 18:10:03.869677 4918 generic.go:334] "Generic (PLEG): container finished" podID="8c006477-ed74-4a21-b0c3-8ebebdb298bf" containerID="9aec64b73fdfe8189abe6c8aff009369bfe2a637c7112e669d7307490f43ad1a" exitCode=0 Mar 19 18:10:03 crc kubenswrapper[4918]: I0319 18:10:03.870089 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565730-7ds7j" event={"ID":"8c006477-ed74-4a21-b0c3-8ebebdb298bf","Type":"ContainerDied","Data":"9aec64b73fdfe8189abe6c8aff009369bfe2a637c7112e669d7307490f43ad1a"} Mar 19 18:10:06 crc kubenswrapper[4918]: I0319 18:10:06.149345 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565730-7ds7j" Mar 19 18:10:06 crc kubenswrapper[4918]: I0319 18:10:06.255037 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkgt8\" (UniqueName: \"kubernetes.io/projected/8c006477-ed74-4a21-b0c3-8ebebdb298bf-kube-api-access-qkgt8\") pod \"8c006477-ed74-4a21-b0c3-8ebebdb298bf\" (UID: \"8c006477-ed74-4a21-b0c3-8ebebdb298bf\") " Mar 19 18:10:06 crc kubenswrapper[4918]: I0319 18:10:06.261990 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c006477-ed74-4a21-b0c3-8ebebdb298bf-kube-api-access-qkgt8" (OuterVolumeSpecName: "kube-api-access-qkgt8") pod "8c006477-ed74-4a21-b0c3-8ebebdb298bf" (UID: "8c006477-ed74-4a21-b0c3-8ebebdb298bf"). InnerVolumeSpecName "kube-api-access-qkgt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:10:06 crc kubenswrapper[4918]: I0319 18:10:06.358001 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkgt8\" (UniqueName: \"kubernetes.io/projected/8c006477-ed74-4a21-b0c3-8ebebdb298bf-kube-api-access-qkgt8\") on node \"crc\" DevicePath \"\"" Mar 19 18:10:06 crc kubenswrapper[4918]: E0319 18:10:06.804065 4918 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c006477_ed74_4a21_b0c3_8ebebdb298bf.slice\": RecentStats: unable to find data in memory cache]" Mar 19 18:10:06 crc kubenswrapper[4918]: I0319 18:10:06.904403 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565730-7ds7j" event={"ID":"8c006477-ed74-4a21-b0c3-8ebebdb298bf","Type":"ContainerDied","Data":"e93936e951723d31627475c62b429e418a0ee0a230777e5fa5c2fb34e86b23d4"} Mar 19 18:10:06 crc kubenswrapper[4918]: I0319 18:10:06.904644 4918 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="e93936e951723d31627475c62b429e418a0ee0a230777e5fa5c2fb34e86b23d4" Mar 19 18:10:06 crc kubenswrapper[4918]: I0319 18:10:06.904480 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565730-7ds7j" Mar 19 18:10:07 crc kubenswrapper[4918]: I0319 18:10:07.228827 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565724-mrmnb"] Mar 19 18:10:07 crc kubenswrapper[4918]: I0319 18:10:07.239228 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565724-mrmnb"] Mar 19 18:10:08 crc kubenswrapper[4918]: I0319 18:10:08.597863 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f306c4a3-fdd1-434c-b4cc-9847c22a3e89" path="/var/lib/kubelet/pods/f306c4a3-fdd1-434c-b4cc-9847c22a3e89/volumes" Mar 19 18:10:16 crc kubenswrapper[4918]: I0319 18:10:16.143942 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5d575bbbdc-jmzdk_4c7a4c16-5343-41a6-a08a-e939ac44aa1a/kube-rbac-proxy/0.log" Mar 19 18:10:16 crc kubenswrapper[4918]: I0319 18:10:16.179949 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5d575bbbdc-jmzdk_4c7a4c16-5343-41a6-a08a-e939ac44aa1a/manager/0.log" Mar 19 18:10:16 crc kubenswrapper[4918]: I0319 18:10:16.979851 4918 scope.go:117] "RemoveContainer" containerID="3260cf79358d36263154a35eb700bde10ddf4bf8c1cc6a138b31d6bcf0165c93" Mar 19 18:10:41 crc kubenswrapper[4918]: I0319 18:10:41.569142 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-4j7mv_7474b087-0f69-4aaa-a238-d3a0c1cee280/prometheus-operator/0.log" Mar 19 18:10:41 crc kubenswrapper[4918]: I0319 18:10:41.796669 4918 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-8489576f5f-8w9fg_ca575a9e-34c3-4601-9d7c-a0033202f67c/prometheus-operator-admission-webhook/0.log" Mar 19 18:10:42 crc kubenswrapper[4918]: I0319 18:10:42.105476 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-lt8g5_c04d1ac3-0ecd-4455-a83d-8b2468d2b7c2/operator/0.log" Mar 19 18:10:42 crc kubenswrapper[4918]: I0319 18:10:42.170884 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-8489576f5f-zlq67_e20191af-404b-4afc-ba99-628844e9cf89/prometheus-operator-admission-webhook/0.log" Mar 19 18:10:42 crc kubenswrapper[4918]: I0319 18:10:42.397250 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-6879549846-2tnz2_5ba838ce-ff3e-4419-b143-b42b064b28fd/perses-operator/0.log" Mar 19 18:11:12 crc kubenswrapper[4918]: I0319 18:11:12.584778 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-pdrw9_991062e5-113f-40ff-9980-02cc5d5f70e0/frr-k8s-webhook-server/0.log" Mar 19 18:11:12 crc kubenswrapper[4918]: I0319 18:11:12.921173 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zffq8_b5336426-61ac-4019-ac70-31c129c9a939/cp-frr-files/0.log" Mar 19 18:11:13 crc kubenswrapper[4918]: I0319 18:11:13.051554 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-xvdsx_9786a14d-5680-49c8-9e93-764b32a73202/kube-rbac-proxy/0.log" Mar 19 18:11:13 crc kubenswrapper[4918]: I0319 18:11:13.148917 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-xvdsx_9786a14d-5680-49c8-9e93-764b32a73202/controller/0.log" Mar 19 18:11:13 crc kubenswrapper[4918]: I0319 18:11:13.391525 4918 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-zffq8_b5336426-61ac-4019-ac70-31c129c9a939/cp-frr-files/0.log" Mar 19 18:11:13 crc kubenswrapper[4918]: I0319 18:11:13.435226 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zffq8_b5336426-61ac-4019-ac70-31c129c9a939/cp-reloader/0.log" Mar 19 18:11:13 crc kubenswrapper[4918]: I0319 18:11:13.488980 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zffq8_b5336426-61ac-4019-ac70-31c129c9a939/cp-metrics/0.log" Mar 19 18:11:13 crc kubenswrapper[4918]: I0319 18:11:13.775755 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zffq8_b5336426-61ac-4019-ac70-31c129c9a939/cp-reloader/0.log" Mar 19 18:11:14 crc kubenswrapper[4918]: I0319 18:11:14.237603 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zffq8_b5336426-61ac-4019-ac70-31c129c9a939/cp-reloader/0.log" Mar 19 18:11:14 crc kubenswrapper[4918]: I0319 18:11:14.287382 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zffq8_b5336426-61ac-4019-ac70-31c129c9a939/cp-frr-files/0.log" Mar 19 18:11:14 crc kubenswrapper[4918]: I0319 18:11:14.338626 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zffq8_b5336426-61ac-4019-ac70-31c129c9a939/cp-metrics/0.log" Mar 19 18:11:14 crc kubenswrapper[4918]: I0319 18:11:14.427022 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zffq8_b5336426-61ac-4019-ac70-31c129c9a939/cp-metrics/0.log" Mar 19 18:11:14 crc kubenswrapper[4918]: I0319 18:11:14.649559 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zffq8_b5336426-61ac-4019-ac70-31c129c9a939/cp-reloader/0.log" Mar 19 18:11:14 crc kubenswrapper[4918]: I0319 18:11:14.711585 4918 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-zffq8_b5336426-61ac-4019-ac70-31c129c9a939/controller/0.log" Mar 19 18:11:14 crc kubenswrapper[4918]: I0319 18:11:14.859012 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zffq8_b5336426-61ac-4019-ac70-31c129c9a939/cp-metrics/0.log" Mar 19 18:11:14 crc kubenswrapper[4918]: I0319 18:11:14.903039 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zffq8_b5336426-61ac-4019-ac70-31c129c9a939/cp-frr-files/0.log" Mar 19 18:11:15 crc kubenswrapper[4918]: I0319 18:11:15.350878 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zffq8_b5336426-61ac-4019-ac70-31c129c9a939/frr-metrics/0.log" Mar 19 18:11:15 crc kubenswrapper[4918]: I0319 18:11:15.530588 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zffq8_b5336426-61ac-4019-ac70-31c129c9a939/kube-rbac-proxy/0.log" Mar 19 18:11:15 crc kubenswrapper[4918]: I0319 18:11:15.646592 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zffq8_b5336426-61ac-4019-ac70-31c129c9a939/reloader/0.log" Mar 19 18:11:15 crc kubenswrapper[4918]: I0319 18:11:15.786268 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zffq8_b5336426-61ac-4019-ac70-31c129c9a939/kube-rbac-proxy-frr/0.log" Mar 19 18:11:16 crc kubenswrapper[4918]: I0319 18:11:16.206345 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-67cf8697d8-q2jfk_9fce7235-ee0c-4b6f-a51d-418382810cb2/manager/0.log" Mar 19 18:11:16 crc kubenswrapper[4918]: I0319 18:11:16.313163 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7b4479595-2gx99_6aa9115f-c65a-4e97-a868-4972a2a730e4/webhook-server/0.log" Mar 19 18:11:16 crc kubenswrapper[4918]: I0319 18:11:16.393723 4918 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-zffq8_b5336426-61ac-4019-ac70-31c129c9a939/frr/0.log" Mar 19 18:11:16 crc kubenswrapper[4918]: I0319 18:11:16.665820 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-6jjlw_9d3715c6-f9c8-4863-9929-d804880ae4f7/kube-rbac-proxy/0.log" Mar 19 18:11:17 crc kubenswrapper[4918]: I0319 18:11:17.082502 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-6jjlw_9d3715c6-f9c8-4863-9929-d804880ae4f7/speaker/0.log" Mar 19 18:11:17 crc kubenswrapper[4918]: I0319 18:11:17.116589 4918 scope.go:117] "RemoveContainer" containerID="c9ab0de4d2d6b00e46f466b40a4d93b6d221c0e9eed14b06450bc30735ed79a8" Mar 19 18:11:28 crc kubenswrapper[4918]: I0319 18:11:28.211728 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 18:11:28 crc kubenswrapper[4918]: I0319 18:11:28.212172 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 18:11:44 crc kubenswrapper[4918]: I0319 18:11:44.530152 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn_a33b8fae-e1c3-428a-b08e-1afb2e142412/util/0.log" Mar 19 18:11:44 crc kubenswrapper[4918]: I0319 18:11:44.844743 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn_a33b8fae-e1c3-428a-b08e-1afb2e142412/pull/0.log" Mar 19 18:11:44 crc 
kubenswrapper[4918]: I0319 18:11:44.926077 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn_a33b8fae-e1c3-428a-b08e-1afb2e142412/util/0.log" Mar 19 18:11:45 crc kubenswrapper[4918]: I0319 18:11:45.019373 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn_a33b8fae-e1c3-428a-b08e-1afb2e142412/pull/0.log" Mar 19 18:11:45 crc kubenswrapper[4918]: I0319 18:11:45.275326 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn_a33b8fae-e1c3-428a-b08e-1afb2e142412/extract/0.log" Mar 19 18:11:45 crc kubenswrapper[4918]: I0319 18:11:45.328931 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn_a33b8fae-e1c3-428a-b08e-1afb2e142412/pull/0.log" Mar 19 18:11:45 crc kubenswrapper[4918]: I0319 18:11:45.351741 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhnqn_a33b8fae-e1c3-428a-b08e-1afb2e142412/util/0.log" Mar 19 18:11:45 crc kubenswrapper[4918]: I0319 18:11:45.695748 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n_155d539d-8319-43c6-9652-1af4e68bfe13/util/0.log" Mar 19 18:11:46 crc kubenswrapper[4918]: I0319 18:11:46.186783 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n_155d539d-8319-43c6-9652-1af4e68bfe13/util/0.log" Mar 19 18:11:46 crc kubenswrapper[4918]: I0319 18:11:46.218085 4918 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n_155d539d-8319-43c6-9652-1af4e68bfe13/pull/0.log" Mar 19 18:11:46 crc kubenswrapper[4918]: I0319 18:11:46.352589 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n_155d539d-8319-43c6-9652-1af4e68bfe13/pull/0.log" Mar 19 18:11:46 crc kubenswrapper[4918]: I0319 18:11:46.509971 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n_155d539d-8319-43c6-9652-1af4e68bfe13/pull/0.log" Mar 19 18:11:46 crc kubenswrapper[4918]: I0319 18:11:46.744288 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n_155d539d-8319-43c6-9652-1af4e68bfe13/util/0.log" Mar 19 18:11:46 crc kubenswrapper[4918]: I0319 18:11:46.757416 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1d8r7n_155d539d-8319-43c6-9652-1af4e68bfe13/extract/0.log" Mar 19 18:11:46 crc kubenswrapper[4918]: I0319 18:11:46.984417 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9_584f93a6-d456-4bc0-89f1-71eef948d233/util/0.log" Mar 19 18:11:47 crc kubenswrapper[4918]: I0319 18:11:47.451471 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9_584f93a6-d456-4bc0-89f1-71eef948d233/util/0.log" Mar 19 18:11:47 crc kubenswrapper[4918]: I0319 18:11:47.564956 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9_584f93a6-d456-4bc0-89f1-71eef948d233/pull/0.log" Mar 19 
18:11:47 crc kubenswrapper[4918]: I0319 18:11:47.575060 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9_584f93a6-d456-4bc0-89f1-71eef948d233/pull/0.log" Mar 19 18:11:47 crc kubenswrapper[4918]: I0319 18:11:47.794676 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9_584f93a6-d456-4bc0-89f1-71eef948d233/util/0.log" Mar 19 18:11:47 crc kubenswrapper[4918]: I0319 18:11:47.896313 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9_584f93a6-d456-4bc0-89f1-71eef948d233/extract/0.log" Mar 19 18:11:47 crc kubenswrapper[4918]: I0319 18:11:47.942568 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726295t9_584f93a6-d456-4bc0-89f1-71eef948d233/pull/0.log" Mar 19 18:11:48 crc kubenswrapper[4918]: I0319 18:11:48.390506 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7_a4b4c7cb-18de-4527-a4b5-859f45243567/util/0.log" Mar 19 18:11:48 crc kubenswrapper[4918]: I0319 18:11:48.788635 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7_a4b4c7cb-18de-4527-a4b5-859f45243567/util/0.log" Mar 19 18:11:48 crc kubenswrapper[4918]: I0319 18:11:48.826354 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7_a4b4c7cb-18de-4527-a4b5-859f45243567/pull/0.log" Mar 19 18:11:48 crc kubenswrapper[4918]: I0319 18:11:48.912304 4918 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7_a4b4c7cb-18de-4527-a4b5-859f45243567/pull/0.log" Mar 19 18:11:49 crc kubenswrapper[4918]: I0319 18:11:49.215500 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7_a4b4c7cb-18de-4527-a4b5-859f45243567/extract/0.log" Mar 19 18:11:49 crc kubenswrapper[4918]: I0319 18:11:49.227130 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7_a4b4c7cb-18de-4527-a4b5-859f45243567/util/0.log" Mar 19 18:11:49 crc kubenswrapper[4918]: I0319 18:11:49.266140 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_b42be0e84d359797ca2a6e73d6fbaa3b214e20650a24a735d2f27f15fcvmlj7_a4b4c7cb-18de-4527-a4b5-859f45243567/pull/0.log" Mar 19 18:11:49 crc kubenswrapper[4918]: I0319 18:11:49.589087 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pvrk8_310a0fb9-d1a0-42e4-bf28-242a240c788b/extract-utilities/0.log" Mar 19 18:11:50 crc kubenswrapper[4918]: I0319 18:11:50.178625 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pvrk8_310a0fb9-d1a0-42e4-bf28-242a240c788b/extract-content/0.log" Mar 19 18:11:50 crc kubenswrapper[4918]: I0319 18:11:50.228736 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pvrk8_310a0fb9-d1a0-42e4-bf28-242a240c788b/extract-content/0.log" Mar 19 18:11:50 crc kubenswrapper[4918]: I0319 18:11:50.246784 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pvrk8_310a0fb9-d1a0-42e4-bf28-242a240c788b/extract-utilities/0.log" Mar 19 18:11:50 crc kubenswrapper[4918]: I0319 18:11:50.681400 4918 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-pvrk8_310a0fb9-d1a0-42e4-bf28-242a240c788b/extract-utilities/0.log" Mar 19 18:11:50 crc kubenswrapper[4918]: I0319 18:11:50.701812 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pvrk8_310a0fb9-d1a0-42e4-bf28-242a240c788b/extract-content/0.log" Mar 19 18:11:50 crc kubenswrapper[4918]: I0319 18:11:50.861150 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-z2gzl_37f55ac3-175f-483c-83f7-125e90cee899/extract-utilities/0.log" Mar 19 18:11:51 crc kubenswrapper[4918]: I0319 18:11:51.128854 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-z2gzl_37f55ac3-175f-483c-83f7-125e90cee899/extract-utilities/0.log" Mar 19 18:11:51 crc kubenswrapper[4918]: I0319 18:11:51.272761 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-z2gzl_37f55ac3-175f-483c-83f7-125e90cee899/extract-content/0.log" Mar 19 18:11:51 crc kubenswrapper[4918]: I0319 18:11:51.423199 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-z2gzl_37f55ac3-175f-483c-83f7-125e90cee899/extract-content/0.log" Mar 19 18:11:51 crc kubenswrapper[4918]: I0319 18:11:51.439786 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pvrk8_310a0fb9-d1a0-42e4-bf28-242a240c788b/registry-server/0.log" Mar 19 18:11:51 crc kubenswrapper[4918]: I0319 18:11:51.492039 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-z2gzl_37f55ac3-175f-483c-83f7-125e90cee899/extract-utilities/0.log" Mar 19 18:11:51 crc kubenswrapper[4918]: I0319 18:11:51.502583 4918 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-z2gzl_37f55ac3-175f-483c-83f7-125e90cee899/extract-content/0.log" Mar 19 18:11:52 crc kubenswrapper[4918]: I0319 18:11:52.238701 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zzhsz_8522aab6-de35-499c-bfe1-55ff5c72fbc6/extract-utilities/0.log" Mar 19 18:11:52 crc kubenswrapper[4918]: I0319 18:11:52.254210 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-fv5bd_028abe48-67cc-4c05-b8ae-cd5979b55787/marketplace-operator/0.log" Mar 19 18:11:52 crc kubenswrapper[4918]: I0319 18:11:52.468178 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zzhsz_8522aab6-de35-499c-bfe1-55ff5c72fbc6/extract-utilities/0.log" Mar 19 18:11:52 crc kubenswrapper[4918]: I0319 18:11:52.655756 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-z2gzl_37f55ac3-175f-483c-83f7-125e90cee899/registry-server/0.log" Mar 19 18:11:52 crc kubenswrapper[4918]: I0319 18:11:52.721612 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zzhsz_8522aab6-de35-499c-bfe1-55ff5c72fbc6/extract-content/0.log" Mar 19 18:11:52 crc kubenswrapper[4918]: I0319 18:11:52.855234 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zzhsz_8522aab6-de35-499c-bfe1-55ff5c72fbc6/extract-content/0.log" Mar 19 18:11:53 crc kubenswrapper[4918]: I0319 18:11:53.262394 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zzhsz_8522aab6-de35-499c-bfe1-55ff5c72fbc6/extract-content/0.log" Mar 19 18:11:53 crc kubenswrapper[4918]: I0319 18:11:53.281702 4918 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-zzhsz_8522aab6-de35-499c-bfe1-55ff5c72fbc6/extract-utilities/0.log" Mar 19 18:11:53 crc kubenswrapper[4918]: I0319 18:11:53.424948 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lmkmp_30a1a963-759d-404c-8e73-0df8c5a73a59/extract-utilities/0.log" Mar 19 18:11:53 crc kubenswrapper[4918]: I0319 18:11:53.439699 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zzhsz_8522aab6-de35-499c-bfe1-55ff5c72fbc6/registry-server/0.log" Mar 19 18:11:53 crc kubenswrapper[4918]: I0319 18:11:53.627238 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lmkmp_30a1a963-759d-404c-8e73-0df8c5a73a59/extract-utilities/0.log" Mar 19 18:11:53 crc kubenswrapper[4918]: I0319 18:11:53.670201 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lmkmp_30a1a963-759d-404c-8e73-0df8c5a73a59/extract-content/0.log" Mar 19 18:11:53 crc kubenswrapper[4918]: I0319 18:11:53.740981 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lmkmp_30a1a963-759d-404c-8e73-0df8c5a73a59/extract-content/0.log" Mar 19 18:11:54 crc kubenswrapper[4918]: I0319 18:11:54.231155 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lmkmp_30a1a963-759d-404c-8e73-0df8c5a73a59/extract-utilities/0.log" Mar 19 18:11:54 crc kubenswrapper[4918]: I0319 18:11:54.256197 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lmkmp_30a1a963-759d-404c-8e73-0df8c5a73a59/extract-content/0.log" Mar 19 18:11:54 crc kubenswrapper[4918]: I0319 18:11:54.649196 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lmkmp_30a1a963-759d-404c-8e73-0df8c5a73a59/registry-server/0.log" Mar 19 
18:11:58 crc kubenswrapper[4918]: I0319 18:11:58.212344 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 18:11:58 crc kubenswrapper[4918]: I0319 18:11:58.214244 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 18:12:00 crc kubenswrapper[4918]: I0319 18:12:00.167597 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565732-np728"] Mar 19 18:12:00 crc kubenswrapper[4918]: E0319 18:12:00.168479 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c006477-ed74-4a21-b0c3-8ebebdb298bf" containerName="oc" Mar 19 18:12:00 crc kubenswrapper[4918]: I0319 18:12:00.168499 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c006477-ed74-4a21-b0c3-8ebebdb298bf" containerName="oc" Mar 19 18:12:00 crc kubenswrapper[4918]: I0319 18:12:00.168765 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c006477-ed74-4a21-b0c3-8ebebdb298bf" containerName="oc" Mar 19 18:12:00 crc kubenswrapper[4918]: I0319 18:12:00.169725 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565732-np728" Mar 19 18:12:00 crc kubenswrapper[4918]: I0319 18:12:00.184564 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 18:12:00 crc kubenswrapper[4918]: I0319 18:12:00.184944 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 18:12:00 crc kubenswrapper[4918]: I0319 18:12:00.185116 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 18:12:00 crc kubenswrapper[4918]: I0319 18:12:00.191976 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565732-np728"] Mar 19 18:12:00 crc kubenswrapper[4918]: I0319 18:12:00.314629 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr8h2\" (UniqueName: \"kubernetes.io/projected/d0b50232-b554-4dd1-b6a5-6fd9362afeb8-kube-api-access-mr8h2\") pod \"auto-csr-approver-29565732-np728\" (UID: \"d0b50232-b554-4dd1-b6a5-6fd9362afeb8\") " pod="openshift-infra/auto-csr-approver-29565732-np728" Mar 19 18:12:00 crc kubenswrapper[4918]: I0319 18:12:00.422233 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr8h2\" (UniqueName: \"kubernetes.io/projected/d0b50232-b554-4dd1-b6a5-6fd9362afeb8-kube-api-access-mr8h2\") pod \"auto-csr-approver-29565732-np728\" (UID: \"d0b50232-b554-4dd1-b6a5-6fd9362afeb8\") " pod="openshift-infra/auto-csr-approver-29565732-np728" Mar 19 18:12:00 crc kubenswrapper[4918]: I0319 18:12:00.467199 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr8h2\" (UniqueName: \"kubernetes.io/projected/d0b50232-b554-4dd1-b6a5-6fd9362afeb8-kube-api-access-mr8h2\") pod \"auto-csr-approver-29565732-np728\" (UID: \"d0b50232-b554-4dd1-b6a5-6fd9362afeb8\") " 
pod="openshift-infra/auto-csr-approver-29565732-np728" Mar 19 18:12:00 crc kubenswrapper[4918]: I0319 18:12:00.603323 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565732-np728" Mar 19 18:12:01 crc kubenswrapper[4918]: I0319 18:12:01.436940 4918 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 18:12:01 crc kubenswrapper[4918]: I0319 18:12:01.440618 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565732-np728"] Mar 19 18:12:02 crc kubenswrapper[4918]: I0319 18:12:02.234740 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565732-np728" event={"ID":"d0b50232-b554-4dd1-b6a5-6fd9362afeb8","Type":"ContainerStarted","Data":"12841a46637be7dafa990009137aed5c8228d5f4988a226a4875b5ca5164f0a9"} Mar 19 18:12:03 crc kubenswrapper[4918]: I0319 18:12:03.245667 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565732-np728" event={"ID":"d0b50232-b554-4dd1-b6a5-6fd9362afeb8","Type":"ContainerStarted","Data":"ca5c3004c3bf24e2b95fabe7ff0ac085df33e943254d23b52a29c311ac87380c"} Mar 19 18:12:03 crc kubenswrapper[4918]: I0319 18:12:03.264372 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565732-np728" podStartSLOduration=2.296404855 podStartE2EDuration="3.264352671s" podCreationTimestamp="2026-03-19 18:12:00 +0000 UTC" firstStartedPulling="2026-03-19 18:12:01.436730555 +0000 UTC m=+5533.558929803" lastFinishedPulling="2026-03-19 18:12:02.404678371 +0000 UTC m=+5534.526877619" observedRunningTime="2026-03-19 18:12:03.259262092 +0000 UTC m=+5535.381461340" watchObservedRunningTime="2026-03-19 18:12:03.264352671 +0000 UTC m=+5535.386551919" Mar 19 18:12:04 crc kubenswrapper[4918]: I0319 18:12:04.255132 4918 generic.go:334] "Generic (PLEG): container finished" 
podID="d0b50232-b554-4dd1-b6a5-6fd9362afeb8" containerID="ca5c3004c3bf24e2b95fabe7ff0ac085df33e943254d23b52a29c311ac87380c" exitCode=0 Mar 19 18:12:04 crc kubenswrapper[4918]: I0319 18:12:04.255236 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565732-np728" event={"ID":"d0b50232-b554-4dd1-b6a5-6fd9362afeb8","Type":"ContainerDied","Data":"ca5c3004c3bf24e2b95fabe7ff0ac085df33e943254d23b52a29c311ac87380c"} Mar 19 18:12:06 crc kubenswrapper[4918]: I0319 18:12:06.484042 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565732-np728" Mar 19 18:12:06 crc kubenswrapper[4918]: I0319 18:12:06.587393 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr8h2\" (UniqueName: \"kubernetes.io/projected/d0b50232-b554-4dd1-b6a5-6fd9362afeb8-kube-api-access-mr8h2\") pod \"d0b50232-b554-4dd1-b6a5-6fd9362afeb8\" (UID: \"d0b50232-b554-4dd1-b6a5-6fd9362afeb8\") " Mar 19 18:12:06 crc kubenswrapper[4918]: I0319 18:12:06.593864 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0b50232-b554-4dd1-b6a5-6fd9362afeb8-kube-api-access-mr8h2" (OuterVolumeSpecName: "kube-api-access-mr8h2") pod "d0b50232-b554-4dd1-b6a5-6fd9362afeb8" (UID: "d0b50232-b554-4dd1-b6a5-6fd9362afeb8"). InnerVolumeSpecName "kube-api-access-mr8h2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:12:06 crc kubenswrapper[4918]: I0319 18:12:06.690283 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mr8h2\" (UniqueName: \"kubernetes.io/projected/d0b50232-b554-4dd1-b6a5-6fd9362afeb8-kube-api-access-mr8h2\") on node \"crc\" DevicePath \"\"" Mar 19 18:12:07 crc kubenswrapper[4918]: I0319 18:12:07.284339 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565732-np728" event={"ID":"d0b50232-b554-4dd1-b6a5-6fd9362afeb8","Type":"ContainerDied","Data":"12841a46637be7dafa990009137aed5c8228d5f4988a226a4875b5ca5164f0a9"} Mar 19 18:12:07 crc kubenswrapper[4918]: I0319 18:12:07.284627 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12841a46637be7dafa990009137aed5c8228d5f4988a226a4875b5ca5164f0a9" Mar 19 18:12:07 crc kubenswrapper[4918]: I0319 18:12:07.284409 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565732-np728" Mar 19 18:12:07 crc kubenswrapper[4918]: I0319 18:12:07.559571 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565726-npqlb"] Mar 19 18:12:07 crc kubenswrapper[4918]: I0319 18:12:07.568515 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565726-npqlb"] Mar 19 18:12:08 crc kubenswrapper[4918]: I0319 18:12:08.597463 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8da44d7-a874-4005-aaaa-4c8a19cc30fa" path="/var/lib/kubelet/pods/a8da44d7-a874-4005-aaaa-4c8a19cc30fa/volumes" Mar 19 18:12:17 crc kubenswrapper[4918]: I0319 18:12:17.200399 4918 scope.go:117] "RemoveContainer" containerID="c25afef1fd7048615aa092eda2c99efd012cd097ca8e64396a8b0a0d98bae895" Mar 19 18:12:17 crc kubenswrapper[4918]: I0319 18:12:17.223490 4918 scope.go:117] "RemoveContainer" 
containerID="be6fa658f12f1aa21c55c400959f1e9a43844c92b20936e94e2c0a8c3a133c38" Mar 19 18:12:21 crc kubenswrapper[4918]: I0319 18:12:21.558278 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-8489576f5f-zlq67_e20191af-404b-4afc-ba99-628844e9cf89/prometheus-operator-admission-webhook/0.log" Mar 19 18:12:21 crc kubenswrapper[4918]: I0319 18:12:21.579457 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-4j7mv_7474b087-0f69-4aaa-a238-d3a0c1cee280/prometheus-operator/0.log" Mar 19 18:12:21 crc kubenswrapper[4918]: I0319 18:12:21.612007 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-8489576f5f-8w9fg_ca575a9e-34c3-4601-9d7c-a0033202f67c/prometheus-operator-admission-webhook/0.log" Mar 19 18:12:21 crc kubenswrapper[4918]: I0319 18:12:21.862016 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-lt8g5_c04d1ac3-0ecd-4455-a83d-8b2468d2b7c2/operator/0.log" Mar 19 18:12:22 crc kubenswrapper[4918]: I0319 18:12:22.030197 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-6879549846-2tnz2_5ba838ce-ff3e-4419-b143-b42b064b28fd/perses-operator/0.log" Mar 19 18:12:28 crc kubenswrapper[4918]: I0319 18:12:28.211414 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 18:12:28 crc kubenswrapper[4918]: I0319 18:12:28.211992 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 18:12:28 crc kubenswrapper[4918]: I0319 18:12:28.212036 4918 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 18:12:28 crc kubenswrapper[4918]: I0319 18:12:28.212817 4918 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3538084f1003e436351b0534c29b6fa462ec81baa17af0111e7937ea959e446c"} pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 18:12:28 crc kubenswrapper[4918]: I0319 18:12:28.212863 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" containerID="cri-o://3538084f1003e436351b0534c29b6fa462ec81baa17af0111e7937ea959e446c" gracePeriod=600 Mar 19 18:12:28 crc kubenswrapper[4918]: I0319 18:12:28.479870 4918 generic.go:334] "Generic (PLEG): container finished" podID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerID="3538084f1003e436351b0534c29b6fa462ec81baa17af0111e7937ea959e446c" exitCode=0 Mar 19 18:12:28 crc kubenswrapper[4918]: I0319 18:12:28.479929 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerDied","Data":"3538084f1003e436351b0534c29b6fa462ec81baa17af0111e7937ea959e446c"} Mar 19 18:12:28 crc kubenswrapper[4918]: I0319 18:12:28.480182 4918 scope.go:117] "RemoveContainer" containerID="9b757869c31a9631c82458bc289b4415023dbc453fb358cbe2d4ca58439ca0a8" Mar 19 18:12:29 crc 
kubenswrapper[4918]: I0319 18:12:29.500620 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerStarted","Data":"0419f168b1b1c21841f70ce6ec71f90116f8855c4603a3f311bfc5e81455bf33"} Mar 19 18:12:46 crc kubenswrapper[4918]: I0319 18:12:46.892709 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5d575bbbdc-jmzdk_4c7a4c16-5343-41a6-a08a-e939ac44aa1a/kube-rbac-proxy/0.log" Mar 19 18:12:47 crc kubenswrapper[4918]: I0319 18:12:47.376710 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5d575bbbdc-jmzdk_4c7a4c16-5343-41a6-a08a-e939ac44aa1a/manager/0.log" Mar 19 18:13:04 crc kubenswrapper[4918]: E0319 18:13:04.584923 4918 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.142:56824->38.102.83.142:42841: write tcp 38.102.83.142:56824->38.102.83.142:42841: write: broken pipe Mar 19 18:14:00 crc kubenswrapper[4918]: I0319 18:14:00.163124 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565734-mvbxr"] Mar 19 18:14:00 crc kubenswrapper[4918]: E0319 18:14:00.163981 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b50232-b554-4dd1-b6a5-6fd9362afeb8" containerName="oc" Mar 19 18:14:00 crc kubenswrapper[4918]: I0319 18:14:00.163992 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b50232-b554-4dd1-b6a5-6fd9362afeb8" containerName="oc" Mar 19 18:14:00 crc kubenswrapper[4918]: I0319 18:14:00.164228 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0b50232-b554-4dd1-b6a5-6fd9362afeb8" containerName="oc" Mar 19 18:14:00 crc kubenswrapper[4918]: I0319 18:14:00.165100 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565734-mvbxr" Mar 19 18:14:00 crc kubenswrapper[4918]: I0319 18:14:00.167654 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 18:14:00 crc kubenswrapper[4918]: I0319 18:14:00.168089 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 18:14:00 crc kubenswrapper[4918]: I0319 18:14:00.168267 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 18:14:00 crc kubenswrapper[4918]: I0319 18:14:00.232355 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565734-mvbxr"] Mar 19 18:14:00 crc kubenswrapper[4918]: I0319 18:14:00.316029 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksdcg\" (UniqueName: \"kubernetes.io/projected/693f90b7-68ff-47f9-800d-8f94e415107e-kube-api-access-ksdcg\") pod \"auto-csr-approver-29565734-mvbxr\" (UID: \"693f90b7-68ff-47f9-800d-8f94e415107e\") " pod="openshift-infra/auto-csr-approver-29565734-mvbxr" Mar 19 18:14:00 crc kubenswrapper[4918]: I0319 18:14:00.417770 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksdcg\" (UniqueName: \"kubernetes.io/projected/693f90b7-68ff-47f9-800d-8f94e415107e-kube-api-access-ksdcg\") pod \"auto-csr-approver-29565734-mvbxr\" (UID: \"693f90b7-68ff-47f9-800d-8f94e415107e\") " pod="openshift-infra/auto-csr-approver-29565734-mvbxr" Mar 19 18:14:00 crc kubenswrapper[4918]: I0319 18:14:00.436826 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksdcg\" (UniqueName: \"kubernetes.io/projected/693f90b7-68ff-47f9-800d-8f94e415107e-kube-api-access-ksdcg\") pod \"auto-csr-approver-29565734-mvbxr\" (UID: \"693f90b7-68ff-47f9-800d-8f94e415107e\") " 
pod="openshift-infra/auto-csr-approver-29565734-mvbxr" Mar 19 18:14:00 crc kubenswrapper[4918]: I0319 18:14:00.486917 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565734-mvbxr" Mar 19 18:14:01 crc kubenswrapper[4918]: I0319 18:14:01.237459 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565734-mvbxr"] Mar 19 18:14:01 crc kubenswrapper[4918]: I0319 18:14:01.393227 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565734-mvbxr" event={"ID":"693f90b7-68ff-47f9-800d-8f94e415107e","Type":"ContainerStarted","Data":"c868394205886b3198e9fdfe873697809fa7113611ae8a69f9e63faf9e338442"} Mar 19 18:14:03 crc kubenswrapper[4918]: I0319 18:14:03.411999 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565734-mvbxr" event={"ID":"693f90b7-68ff-47f9-800d-8f94e415107e","Type":"ContainerStarted","Data":"dd48d519f299d1843abff11c42ae898a30fe648eb0405bd1a19d3d9e6054607b"} Mar 19 18:14:03 crc kubenswrapper[4918]: I0319 18:14:03.429037 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565734-mvbxr" podStartSLOduration=2.251692013 podStartE2EDuration="3.429014345s" podCreationTimestamp="2026-03-19 18:14:00 +0000 UTC" firstStartedPulling="2026-03-19 18:14:01.245363076 +0000 UTC m=+5653.367562324" lastFinishedPulling="2026-03-19 18:14:02.422685408 +0000 UTC m=+5654.544884656" observedRunningTime="2026-03-19 18:14:03.428032079 +0000 UTC m=+5655.550231327" watchObservedRunningTime="2026-03-19 18:14:03.429014345 +0000 UTC m=+5655.551213603" Mar 19 18:14:04 crc kubenswrapper[4918]: I0319 18:14:04.422208 4918 generic.go:334] "Generic (PLEG): container finished" podID="693f90b7-68ff-47f9-800d-8f94e415107e" containerID="dd48d519f299d1843abff11c42ae898a30fe648eb0405bd1a19d3d9e6054607b" exitCode=0 Mar 19 18:14:04 crc 
kubenswrapper[4918]: I0319 18:14:04.422533 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565734-mvbxr" event={"ID":"693f90b7-68ff-47f9-800d-8f94e415107e","Type":"ContainerDied","Data":"dd48d519f299d1843abff11c42ae898a30fe648eb0405bd1a19d3d9e6054607b"} Mar 19 18:14:06 crc kubenswrapper[4918]: I0319 18:14:06.439301 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565734-mvbxr" event={"ID":"693f90b7-68ff-47f9-800d-8f94e415107e","Type":"ContainerDied","Data":"c868394205886b3198e9fdfe873697809fa7113611ae8a69f9e63faf9e338442"} Mar 19 18:14:06 crc kubenswrapper[4918]: I0319 18:14:06.439757 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c868394205886b3198e9fdfe873697809fa7113611ae8a69f9e63faf9e338442" Mar 19 18:14:06 crc kubenswrapper[4918]: I0319 18:14:06.479919 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565734-mvbxr" Mar 19 18:14:06 crc kubenswrapper[4918]: I0319 18:14:06.571339 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksdcg\" (UniqueName: \"kubernetes.io/projected/693f90b7-68ff-47f9-800d-8f94e415107e-kube-api-access-ksdcg\") pod \"693f90b7-68ff-47f9-800d-8f94e415107e\" (UID: \"693f90b7-68ff-47f9-800d-8f94e415107e\") " Mar 19 18:14:06 crc kubenswrapper[4918]: I0319 18:14:06.593227 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/693f90b7-68ff-47f9-800d-8f94e415107e-kube-api-access-ksdcg" (OuterVolumeSpecName: "kube-api-access-ksdcg") pod "693f90b7-68ff-47f9-800d-8f94e415107e" (UID: "693f90b7-68ff-47f9-800d-8f94e415107e"). InnerVolumeSpecName "kube-api-access-ksdcg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:14:06 crc kubenswrapper[4918]: I0319 18:14:06.674096 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksdcg\" (UniqueName: \"kubernetes.io/projected/693f90b7-68ff-47f9-800d-8f94e415107e-kube-api-access-ksdcg\") on node \"crc\" DevicePath \"\"" Mar 19 18:14:07 crc kubenswrapper[4918]: I0319 18:14:07.446591 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565734-mvbxr" Mar 19 18:14:07 crc kubenswrapper[4918]: I0319 18:14:07.567996 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565728-4mb6s"] Mar 19 18:14:07 crc kubenswrapper[4918]: I0319 18:14:07.579735 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565728-4mb6s"] Mar 19 18:14:08 crc kubenswrapper[4918]: I0319 18:14:08.606426 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f160e5d-a89e-484d-b11c-7095a366b452" path="/var/lib/kubelet/pods/7f160e5d-a89e-484d-b11c-7095a366b452/volumes" Mar 19 18:14:28 crc kubenswrapper[4918]: I0319 18:14:28.212271 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 18:14:28 crc kubenswrapper[4918]: I0319 18:14:28.212804 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 18:14:58 crc kubenswrapper[4918]: I0319 18:14:58.211683 4918 patch_prober.go:28] interesting 
pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 18:14:58 crc kubenswrapper[4918]: I0319 18:14:58.213156 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 18:15:00 crc kubenswrapper[4918]: I0319 18:15:00.146135 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565735-99xrv"] Mar 19 18:15:00 crc kubenswrapper[4918]: E0319 18:15:00.147265 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="693f90b7-68ff-47f9-800d-8f94e415107e" containerName="oc" Mar 19 18:15:00 crc kubenswrapper[4918]: I0319 18:15:00.147336 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="693f90b7-68ff-47f9-800d-8f94e415107e" containerName="oc" Mar 19 18:15:00 crc kubenswrapper[4918]: I0319 18:15:00.147618 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="693f90b7-68ff-47f9-800d-8f94e415107e" containerName="oc" Mar 19 18:15:00 crc kubenswrapper[4918]: I0319 18:15:00.148372 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565735-99xrv" Mar 19 18:15:00 crc kubenswrapper[4918]: I0319 18:15:00.150949 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 18:15:00 crc kubenswrapper[4918]: I0319 18:15:00.151185 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 18:15:00 crc kubenswrapper[4918]: I0319 18:15:00.156547 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565735-99xrv"] Mar 19 18:15:00 crc kubenswrapper[4918]: I0319 18:15:00.165749 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s45p4\" (UniqueName: \"kubernetes.io/projected/ed0131de-36b9-4fb3-a409-437a2d263fb4-kube-api-access-s45p4\") pod \"collect-profiles-29565735-99xrv\" (UID: \"ed0131de-36b9-4fb3-a409-437a2d263fb4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565735-99xrv" Mar 19 18:15:00 crc kubenswrapper[4918]: I0319 18:15:00.165984 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed0131de-36b9-4fb3-a409-437a2d263fb4-config-volume\") pod \"collect-profiles-29565735-99xrv\" (UID: \"ed0131de-36b9-4fb3-a409-437a2d263fb4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565735-99xrv" Mar 19 18:15:00 crc kubenswrapper[4918]: I0319 18:15:00.166064 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed0131de-36b9-4fb3-a409-437a2d263fb4-secret-volume\") pod \"collect-profiles-29565735-99xrv\" (UID: \"ed0131de-36b9-4fb3-a409-437a2d263fb4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29565735-99xrv" Mar 19 18:15:00 crc kubenswrapper[4918]: I0319 18:15:00.268604 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed0131de-36b9-4fb3-a409-437a2d263fb4-config-volume\") pod \"collect-profiles-29565735-99xrv\" (UID: \"ed0131de-36b9-4fb3-a409-437a2d263fb4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565735-99xrv" Mar 19 18:15:00 crc kubenswrapper[4918]: I0319 18:15:00.268662 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed0131de-36b9-4fb3-a409-437a2d263fb4-secret-volume\") pod \"collect-profiles-29565735-99xrv\" (UID: \"ed0131de-36b9-4fb3-a409-437a2d263fb4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565735-99xrv" Mar 19 18:15:00 crc kubenswrapper[4918]: I0319 18:15:00.268717 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s45p4\" (UniqueName: \"kubernetes.io/projected/ed0131de-36b9-4fb3-a409-437a2d263fb4-kube-api-access-s45p4\") pod \"collect-profiles-29565735-99xrv\" (UID: \"ed0131de-36b9-4fb3-a409-437a2d263fb4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565735-99xrv" Mar 19 18:15:00 crc kubenswrapper[4918]: I0319 18:15:00.269988 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed0131de-36b9-4fb3-a409-437a2d263fb4-config-volume\") pod \"collect-profiles-29565735-99xrv\" (UID: \"ed0131de-36b9-4fb3-a409-437a2d263fb4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565735-99xrv" Mar 19 18:15:00 crc kubenswrapper[4918]: I0319 18:15:00.276136 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/ed0131de-36b9-4fb3-a409-437a2d263fb4-secret-volume\") pod \"collect-profiles-29565735-99xrv\" (UID: \"ed0131de-36b9-4fb3-a409-437a2d263fb4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565735-99xrv" Mar 19 18:15:00 crc kubenswrapper[4918]: I0319 18:15:00.285032 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s45p4\" (UniqueName: \"kubernetes.io/projected/ed0131de-36b9-4fb3-a409-437a2d263fb4-kube-api-access-s45p4\") pod \"collect-profiles-29565735-99xrv\" (UID: \"ed0131de-36b9-4fb3-a409-437a2d263fb4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565735-99xrv" Mar 19 18:15:00 crc kubenswrapper[4918]: I0319 18:15:00.466342 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565735-99xrv" Mar 19 18:15:01 crc kubenswrapper[4918]: I0319 18:15:01.296482 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565735-99xrv"] Mar 19 18:15:01 crc kubenswrapper[4918]: I0319 18:15:01.979036 4918 generic.go:334] "Generic (PLEG): container finished" podID="ed0131de-36b9-4fb3-a409-437a2d263fb4" containerID="519cd262f5851f2a0f74e8efc5fecda2d416d39843cdc4923500295ebc5ea407" exitCode=0 Mar 19 18:15:01 crc kubenswrapper[4918]: I0319 18:15:01.979499 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565735-99xrv" event={"ID":"ed0131de-36b9-4fb3-a409-437a2d263fb4","Type":"ContainerDied","Data":"519cd262f5851f2a0f74e8efc5fecda2d416d39843cdc4923500295ebc5ea407"} Mar 19 18:15:01 crc kubenswrapper[4918]: I0319 18:15:01.979551 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565735-99xrv" 
event={"ID":"ed0131de-36b9-4fb3-a409-437a2d263fb4","Type":"ContainerStarted","Data":"a9fb9e1b670d7806bdb455cd78b4b254c1e8da57993e9dec5d33c5ee877be505"} Mar 19 18:15:03 crc kubenswrapper[4918]: I0319 18:15:03.998914 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565735-99xrv" event={"ID":"ed0131de-36b9-4fb3-a409-437a2d263fb4","Type":"ContainerDied","Data":"a9fb9e1b670d7806bdb455cd78b4b254c1e8da57993e9dec5d33c5ee877be505"} Mar 19 18:15:03 crc kubenswrapper[4918]: I0319 18:15:03.999152 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9fb9e1b670d7806bdb455cd78b4b254c1e8da57993e9dec5d33c5ee877be505" Mar 19 18:15:04 crc kubenswrapper[4918]: I0319 18:15:04.070642 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565735-99xrv" Mar 19 18:15:04 crc kubenswrapper[4918]: I0319 18:15:04.261986 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed0131de-36b9-4fb3-a409-437a2d263fb4-config-volume\") pod \"ed0131de-36b9-4fb3-a409-437a2d263fb4\" (UID: \"ed0131de-36b9-4fb3-a409-437a2d263fb4\") " Mar 19 18:15:04 crc kubenswrapper[4918]: I0319 18:15:04.262048 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed0131de-36b9-4fb3-a409-437a2d263fb4-secret-volume\") pod \"ed0131de-36b9-4fb3-a409-437a2d263fb4\" (UID: \"ed0131de-36b9-4fb3-a409-437a2d263fb4\") " Mar 19 18:15:04 crc kubenswrapper[4918]: I0319 18:15:04.262140 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s45p4\" (UniqueName: \"kubernetes.io/projected/ed0131de-36b9-4fb3-a409-437a2d263fb4-kube-api-access-s45p4\") pod \"ed0131de-36b9-4fb3-a409-437a2d263fb4\" (UID: \"ed0131de-36b9-4fb3-a409-437a2d263fb4\") 
" Mar 19 18:15:04 crc kubenswrapper[4918]: I0319 18:15:04.263857 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed0131de-36b9-4fb3-a409-437a2d263fb4-config-volume" (OuterVolumeSpecName: "config-volume") pod "ed0131de-36b9-4fb3-a409-437a2d263fb4" (UID: "ed0131de-36b9-4fb3-a409-437a2d263fb4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:15:04 crc kubenswrapper[4918]: I0319 18:15:04.271768 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed0131de-36b9-4fb3-a409-437a2d263fb4-kube-api-access-s45p4" (OuterVolumeSpecName: "kube-api-access-s45p4") pod "ed0131de-36b9-4fb3-a409-437a2d263fb4" (UID: "ed0131de-36b9-4fb3-a409-437a2d263fb4"). InnerVolumeSpecName "kube-api-access-s45p4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:15:04 crc kubenswrapper[4918]: I0319 18:15:04.278619 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed0131de-36b9-4fb3-a409-437a2d263fb4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ed0131de-36b9-4fb3-a409-437a2d263fb4" (UID: "ed0131de-36b9-4fb3-a409-437a2d263fb4"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:15:04 crc kubenswrapper[4918]: I0319 18:15:04.365191 4918 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed0131de-36b9-4fb3-a409-437a2d263fb4-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 18:15:04 crc kubenswrapper[4918]: I0319 18:15:04.365451 4918 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed0131de-36b9-4fb3-a409-437a2d263fb4-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 18:15:04 crc kubenswrapper[4918]: I0319 18:15:04.365462 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s45p4\" (UniqueName: \"kubernetes.io/projected/ed0131de-36b9-4fb3-a409-437a2d263fb4-kube-api-access-s45p4\") on node \"crc\" DevicePath \"\"" Mar 19 18:15:05 crc kubenswrapper[4918]: I0319 18:15:05.006437 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565735-99xrv" Mar 19 18:15:05 crc kubenswrapper[4918]: I0319 18:15:05.151421 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565690-z6ght"] Mar 19 18:15:05 crc kubenswrapper[4918]: I0319 18:15:05.161546 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565690-z6ght"] Mar 19 18:15:06 crc kubenswrapper[4918]: I0319 18:15:06.596215 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3ea0181-35f4-4f23-a3cd-0e06013541b8" path="/var/lib/kubelet/pods/d3ea0181-35f4-4f23-a3cd-0e06013541b8/volumes" Mar 19 18:15:15 crc kubenswrapper[4918]: I0319 18:15:15.598531 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l2l6d"] Mar 19 18:15:15 crc kubenswrapper[4918]: E0319 18:15:15.599446 4918 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="ed0131de-36b9-4fb3-a409-437a2d263fb4" containerName="collect-profiles" Mar 19 18:15:15 crc kubenswrapper[4918]: I0319 18:15:15.599459 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0131de-36b9-4fb3-a409-437a2d263fb4" containerName="collect-profiles" Mar 19 18:15:15 crc kubenswrapper[4918]: I0319 18:15:15.599687 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed0131de-36b9-4fb3-a409-437a2d263fb4" containerName="collect-profiles" Mar 19 18:15:15 crc kubenswrapper[4918]: I0319 18:15:15.612954 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l2l6d" Mar 19 18:15:15 crc kubenswrapper[4918]: I0319 18:15:15.664693 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l2l6d"] Mar 19 18:15:15 crc kubenswrapper[4918]: I0319 18:15:15.678226 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdgtk\" (UniqueName: \"kubernetes.io/projected/95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4-kube-api-access-bdgtk\") pod \"redhat-operators-l2l6d\" (UID: \"95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4\") " pod="openshift-marketplace/redhat-operators-l2l6d" Mar 19 18:15:15 crc kubenswrapper[4918]: I0319 18:15:15.678422 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4-catalog-content\") pod \"redhat-operators-l2l6d\" (UID: \"95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4\") " pod="openshift-marketplace/redhat-operators-l2l6d" Mar 19 18:15:15 crc kubenswrapper[4918]: I0319 18:15:15.678468 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4-utilities\") pod \"redhat-operators-l2l6d\" (UID: 
\"95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4\") " pod="openshift-marketplace/redhat-operators-l2l6d" Mar 19 18:15:15 crc kubenswrapper[4918]: I0319 18:15:15.781380 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdgtk\" (UniqueName: \"kubernetes.io/projected/95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4-kube-api-access-bdgtk\") pod \"redhat-operators-l2l6d\" (UID: \"95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4\") " pod="openshift-marketplace/redhat-operators-l2l6d" Mar 19 18:15:15 crc kubenswrapper[4918]: I0319 18:15:15.781487 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4-catalog-content\") pod \"redhat-operators-l2l6d\" (UID: \"95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4\") " pod="openshift-marketplace/redhat-operators-l2l6d" Mar 19 18:15:15 crc kubenswrapper[4918]: I0319 18:15:15.781548 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4-utilities\") pod \"redhat-operators-l2l6d\" (UID: \"95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4\") " pod="openshift-marketplace/redhat-operators-l2l6d" Mar 19 18:15:15 crc kubenswrapper[4918]: I0319 18:15:15.782012 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4-utilities\") pod \"redhat-operators-l2l6d\" (UID: \"95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4\") " pod="openshift-marketplace/redhat-operators-l2l6d" Mar 19 18:15:15 crc kubenswrapper[4918]: I0319 18:15:15.782432 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4-catalog-content\") pod \"redhat-operators-l2l6d\" (UID: \"95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4\") " 
pod="openshift-marketplace/redhat-operators-l2l6d" Mar 19 18:15:15 crc kubenswrapper[4918]: I0319 18:15:15.816991 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdgtk\" (UniqueName: \"kubernetes.io/projected/95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4-kube-api-access-bdgtk\") pod \"redhat-operators-l2l6d\" (UID: \"95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4\") " pod="openshift-marketplace/redhat-operators-l2l6d" Mar 19 18:15:15 crc kubenswrapper[4918]: I0319 18:15:15.951442 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l2l6d" Mar 19 18:15:16 crc kubenswrapper[4918]: I0319 18:15:16.721145 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l2l6d"] Mar 19 18:15:17 crc kubenswrapper[4918]: I0319 18:15:17.175411 4918 generic.go:334] "Generic (PLEG): container finished" podID="95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4" containerID="17c93dd97a1025ccc490bfce9aa2169d7dcc5405102db965e0b5d96a925347aa" exitCode=0 Mar 19 18:15:17 crc kubenswrapper[4918]: I0319 18:15:17.175545 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2l6d" event={"ID":"95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4","Type":"ContainerDied","Data":"17c93dd97a1025ccc490bfce9aa2169d7dcc5405102db965e0b5d96a925347aa"} Mar 19 18:15:17 crc kubenswrapper[4918]: I0319 18:15:17.175689 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2l6d" event={"ID":"95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4","Type":"ContainerStarted","Data":"b690b01ae9201ca9326616be16a9ce29546ed94fe4f383c295531b234ec03288"} Mar 19 18:15:17 crc kubenswrapper[4918]: I0319 18:15:17.371590 4918 scope.go:117] "RemoveContainer" containerID="54b39f4c468d2f3d6e22742c2414e446a575146c1d8ce0be316dd3e4ba69a534" Mar 19 18:15:17 crc kubenswrapper[4918]: I0319 18:15:17.393135 4918 scope.go:117] "RemoveContainer" 
containerID="c4bf4f429cb4a5ef6b12bc572841418261932194d696b0272b165b33c915539f" Mar 19 18:15:18 crc kubenswrapper[4918]: I0319 18:15:18.187737 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2l6d" event={"ID":"95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4","Type":"ContainerStarted","Data":"2c6847c80769cd8a43cd37bcf21d76f4d361997f1e919a92306c7710e46d8f31"} Mar 19 18:15:24 crc kubenswrapper[4918]: I0319 18:15:24.255333 4918 generic.go:334] "Generic (PLEG): container finished" podID="95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4" containerID="2c6847c80769cd8a43cd37bcf21d76f4d361997f1e919a92306c7710e46d8f31" exitCode=0 Mar 19 18:15:24 crc kubenswrapper[4918]: I0319 18:15:24.255436 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2l6d" event={"ID":"95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4","Type":"ContainerDied","Data":"2c6847c80769cd8a43cd37bcf21d76f4d361997f1e919a92306c7710e46d8f31"} Mar 19 18:15:25 crc kubenswrapper[4918]: I0319 18:15:25.265930 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2l6d" event={"ID":"95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4","Type":"ContainerStarted","Data":"6343728e43e0d84a37e77e4d3f55e3106ba8abded720be2060f4d0f591f428ee"} Mar 19 18:15:25 crc kubenswrapper[4918]: I0319 18:15:25.287253 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l2l6d" podStartSLOduration=2.840523176 podStartE2EDuration="10.287234514s" podCreationTimestamp="2026-03-19 18:15:15 +0000 UTC" firstStartedPulling="2026-03-19 18:15:17.177418592 +0000 UTC m=+5729.299617840" lastFinishedPulling="2026-03-19 18:15:24.62412993 +0000 UTC m=+5736.746329178" observedRunningTime="2026-03-19 18:15:25.283949325 +0000 UTC m=+5737.406148573" watchObservedRunningTime="2026-03-19 18:15:25.287234514 +0000 UTC m=+5737.409433762" Mar 19 18:15:25 crc kubenswrapper[4918]: I0319 18:15:25.952643 4918 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l2l6d" Mar 19 18:15:25 crc kubenswrapper[4918]: I0319 18:15:25.953022 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l2l6d" Mar 19 18:15:27 crc kubenswrapper[4918]: I0319 18:15:27.009799 4918 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l2l6d" podUID="95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4" containerName="registry-server" probeResult="failure" output=< Mar 19 18:15:27 crc kubenswrapper[4918]: timeout: failed to connect service ":50051" within 1s Mar 19 18:15:27 crc kubenswrapper[4918]: > Mar 19 18:15:28 crc kubenswrapper[4918]: I0319 18:15:28.211416 4918 patch_prober.go:28] interesting pod/machine-config-daemon-d4bjv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 18:15:28 crc kubenswrapper[4918]: I0319 18:15:28.211481 4918 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 18:15:28 crc kubenswrapper[4918]: I0319 18:15:28.211545 4918 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" Mar 19 18:15:28 crc kubenswrapper[4918]: I0319 18:15:28.212321 4918 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0419f168b1b1c21841f70ce6ec71f90116f8855c4603a3f311bfc5e81455bf33"} 
pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 18:15:28 crc kubenswrapper[4918]: I0319 18:15:28.212376 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerName="machine-config-daemon" containerID="cri-o://0419f168b1b1c21841f70ce6ec71f90116f8855c4603a3f311bfc5e81455bf33" gracePeriod=600 Mar 19 18:15:28 crc kubenswrapper[4918]: E0319 18:15:28.337139 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:15:29 crc kubenswrapper[4918]: I0319 18:15:29.304368 4918 generic.go:334] "Generic (PLEG): container finished" podID="faff5e41-8f94-4bfd-9730-38955ab099d9" containerID="0419f168b1b1c21841f70ce6ec71f90116f8855c4603a3f311bfc5e81455bf33" exitCode=0 Mar 19 18:15:29 crc kubenswrapper[4918]: I0319 18:15:29.304413 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" event={"ID":"faff5e41-8f94-4bfd-9730-38955ab099d9","Type":"ContainerDied","Data":"0419f168b1b1c21841f70ce6ec71f90116f8855c4603a3f311bfc5e81455bf33"} Mar 19 18:15:29 crc kubenswrapper[4918]: I0319 18:15:29.304447 4918 scope.go:117] "RemoveContainer" containerID="3538084f1003e436351b0534c29b6fa462ec81baa17af0111e7937ea959e446c" Mar 19 18:15:29 crc kubenswrapper[4918]: I0319 18:15:29.305125 4918 scope.go:117] "RemoveContainer" containerID="0419f168b1b1c21841f70ce6ec71f90116f8855c4603a3f311bfc5e81455bf33" Mar 
19 18:15:29 crc kubenswrapper[4918]: E0319 18:15:29.305441 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:15:36 crc kubenswrapper[4918]: I0319 18:15:36.013685 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l2l6d" Mar 19 18:15:36 crc kubenswrapper[4918]: I0319 18:15:36.062544 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l2l6d" Mar 19 18:15:36 crc kubenswrapper[4918]: I0319 18:15:36.249534 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l2l6d"] Mar 19 18:15:37 crc kubenswrapper[4918]: I0319 18:15:37.379203 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l2l6d" podUID="95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4" containerName="registry-server" containerID="cri-o://6343728e43e0d84a37e77e4d3f55e3106ba8abded720be2060f4d0f591f428ee" gracePeriod=2 Mar 19 18:15:38 crc kubenswrapper[4918]: I0319 18:15:38.389669 4918 generic.go:334] "Generic (PLEG): container finished" podID="95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4" containerID="6343728e43e0d84a37e77e4d3f55e3106ba8abded720be2060f4d0f591f428ee" exitCode=0 Mar 19 18:15:38 crc kubenswrapper[4918]: I0319 18:15:38.389750 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2l6d" event={"ID":"95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4","Type":"ContainerDied","Data":"6343728e43e0d84a37e77e4d3f55e3106ba8abded720be2060f4d0f591f428ee"} Mar 19 18:15:38 crc 
kubenswrapper[4918]: I0319 18:15:38.392156 4918 generic.go:334] "Generic (PLEG): container finished" podID="603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0" containerID="e2ef8149401114cc0d7930e8c2dbb5d5333e74c3bb6a72de6c7cd31bcb5912f4" exitCode=0 Mar 19 18:15:38 crc kubenswrapper[4918]: I0319 18:15:38.392182 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-584bk/must-gather-tdc4n" event={"ID":"603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0","Type":"ContainerDied","Data":"e2ef8149401114cc0d7930e8c2dbb5d5333e74c3bb6a72de6c7cd31bcb5912f4"} Mar 19 18:15:38 crc kubenswrapper[4918]: I0319 18:15:38.392851 4918 scope.go:117] "RemoveContainer" containerID="e2ef8149401114cc0d7930e8c2dbb5d5333e74c3bb6a72de6c7cd31bcb5912f4" Mar 19 18:15:38 crc kubenswrapper[4918]: I0319 18:15:38.674179 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l2l6d" Mar 19 18:15:38 crc kubenswrapper[4918]: I0319 18:15:38.755643 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4-catalog-content\") pod \"95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4\" (UID: \"95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4\") " Mar 19 18:15:38 crc kubenswrapper[4918]: I0319 18:15:38.756110 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdgtk\" (UniqueName: \"kubernetes.io/projected/95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4-kube-api-access-bdgtk\") pod \"95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4\" (UID: \"95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4\") " Mar 19 18:15:38 crc kubenswrapper[4918]: I0319 18:15:38.756136 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4-utilities\") pod \"95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4\" (UID: \"95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4\") " 
Mar 19 18:15:38 crc kubenswrapper[4918]: I0319 18:15:38.756886 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4-utilities" (OuterVolumeSpecName: "utilities") pod "95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4" (UID: "95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:15:38 crc kubenswrapper[4918]: I0319 18:15:38.757069 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 18:15:38 crc kubenswrapper[4918]: I0319 18:15:38.779739 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4-kube-api-access-bdgtk" (OuterVolumeSpecName: "kube-api-access-bdgtk") pod "95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4" (UID: "95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4"). InnerVolumeSpecName "kube-api-access-bdgtk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:15:38 crc kubenswrapper[4918]: I0319 18:15:38.858990 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdgtk\" (UniqueName: \"kubernetes.io/projected/95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4-kube-api-access-bdgtk\") on node \"crc\" DevicePath \"\"" Mar 19 18:15:38 crc kubenswrapper[4918]: I0319 18:15:38.904532 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4" (UID: "95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:15:38 crc kubenswrapper[4918]: I0319 18:15:38.960738 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 18:15:39 crc kubenswrapper[4918]: I0319 18:15:39.013938 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-584bk_must-gather-tdc4n_603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0/gather/0.log" Mar 19 18:15:39 crc kubenswrapper[4918]: I0319 18:15:39.407006 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2l6d" event={"ID":"95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4","Type":"ContainerDied","Data":"b690b01ae9201ca9326616be16a9ce29546ed94fe4f383c295531b234ec03288"} Mar 19 18:15:39 crc kubenswrapper[4918]: I0319 18:15:39.407853 4918 scope.go:117] "RemoveContainer" containerID="6343728e43e0d84a37e77e4d3f55e3106ba8abded720be2060f4d0f591f428ee" Mar 19 18:15:39 crc kubenswrapper[4918]: I0319 18:15:39.407063 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l2l6d" Mar 19 18:15:39 crc kubenswrapper[4918]: I0319 18:15:39.450350 4918 scope.go:117] "RemoveContainer" containerID="2c6847c80769cd8a43cd37bcf21d76f4d361997f1e919a92306c7710e46d8f31" Mar 19 18:15:39 crc kubenswrapper[4918]: I0319 18:15:39.463656 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l2l6d"] Mar 19 18:15:39 crc kubenswrapper[4918]: I0319 18:15:39.490388 4918 scope.go:117] "RemoveContainer" containerID="17c93dd97a1025ccc490bfce9aa2169d7dcc5405102db965e0b5d96a925347aa" Mar 19 18:15:39 crc kubenswrapper[4918]: I0319 18:15:39.494245 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l2l6d"] Mar 19 18:15:40 crc kubenswrapper[4918]: I0319 18:15:40.600300 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4" path="/var/lib/kubelet/pods/95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4/volumes" Mar 19 18:15:41 crc kubenswrapper[4918]: I0319 18:15:41.586563 4918 scope.go:117] "RemoveContainer" containerID="0419f168b1b1c21841f70ce6ec71f90116f8855c4603a3f311bfc5e81455bf33" Mar 19 18:15:41 crc kubenswrapper[4918]: E0319 18:15:41.586877 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:15:52 crc kubenswrapper[4918]: I0319 18:15:52.166565 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-584bk/must-gather-tdc4n"] Mar 19 18:15:52 crc kubenswrapper[4918]: I0319 18:15:52.167322 4918 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-must-gather-584bk/must-gather-tdc4n" podUID="603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0" containerName="copy" containerID="cri-o://ba78511a239a8c6dac838187e939394f0e06b389cb5b4b1d5f86f7bffa2bfecf" gracePeriod=2 Mar 19 18:15:52 crc kubenswrapper[4918]: I0319 18:15:52.179653 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-584bk/must-gather-tdc4n"] Mar 19 18:15:52 crc kubenswrapper[4918]: I0319 18:15:52.543579 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-584bk_must-gather-tdc4n_603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0/copy/0.log" Mar 19 18:15:52 crc kubenswrapper[4918]: I0319 18:15:52.546820 4918 generic.go:334] "Generic (PLEG): container finished" podID="603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0" containerID="ba78511a239a8c6dac838187e939394f0e06b389cb5b4b1d5f86f7bffa2bfecf" exitCode=143 Mar 19 18:15:53 crc kubenswrapper[4918]: I0319 18:15:53.339555 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-584bk_must-gather-tdc4n_603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0/copy/0.log" Mar 19 18:15:53 crc kubenswrapper[4918]: I0319 18:15:53.341045 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-584bk/must-gather-tdc4n" Mar 19 18:15:53 crc kubenswrapper[4918]: I0319 18:15:53.450096 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jv5h\" (UniqueName: \"kubernetes.io/projected/603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0-kube-api-access-9jv5h\") pod \"603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0\" (UID: \"603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0\") " Mar 19 18:15:53 crc kubenswrapper[4918]: I0319 18:15:53.450167 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0-must-gather-output\") pod \"603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0\" (UID: \"603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0\") " Mar 19 18:15:53 crc kubenswrapper[4918]: I0319 18:15:53.458732 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0-kube-api-access-9jv5h" (OuterVolumeSpecName: "kube-api-access-9jv5h") pod "603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0" (UID: "603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0"). InnerVolumeSpecName "kube-api-access-9jv5h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:15:53 crc kubenswrapper[4918]: I0319 18:15:53.552777 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jv5h\" (UniqueName: \"kubernetes.io/projected/603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0-kube-api-access-9jv5h\") on node \"crc\" DevicePath \"\"" Mar 19 18:15:53 crc kubenswrapper[4918]: I0319 18:15:53.555334 4918 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-584bk_must-gather-tdc4n_603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0/copy/0.log" Mar 19 18:15:53 crc kubenswrapper[4918]: I0319 18:15:53.555736 4918 scope.go:117] "RemoveContainer" containerID="ba78511a239a8c6dac838187e939394f0e06b389cb5b4b1d5f86f7bffa2bfecf" Mar 19 18:15:53 crc kubenswrapper[4918]: I0319 18:15:53.555834 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-584bk/must-gather-tdc4n" Mar 19 18:15:53 crc kubenswrapper[4918]: I0319 18:15:53.587683 4918 scope.go:117] "RemoveContainer" containerID="0419f168b1b1c21841f70ce6ec71f90116f8855c4603a3f311bfc5e81455bf33" Mar 19 18:15:53 crc kubenswrapper[4918]: E0319 18:15:53.588069 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:15:53 crc kubenswrapper[4918]: I0319 18:15:53.595663 4918 scope.go:117] "RemoveContainer" containerID="e2ef8149401114cc0d7930e8c2dbb5d5333e74c3bb6a72de6c7cd31bcb5912f4" Mar 19 18:15:53 crc kubenswrapper[4918]: I0319 18:15:53.680391 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0" (UID: "603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:15:53 crc kubenswrapper[4918]: I0319 18:15:53.761671 4918 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 19 18:15:54 crc kubenswrapper[4918]: I0319 18:15:54.619968 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0" path="/var/lib/kubelet/pods/603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0/volumes" Mar 19 18:16:00 crc kubenswrapper[4918]: I0319 18:16:00.147905 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565736-tl6m2"] Mar 19 18:16:00 crc kubenswrapper[4918]: E0319 18:16:00.148873 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4" containerName="extract-content" Mar 19 18:16:00 crc kubenswrapper[4918]: I0319 18:16:00.148887 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4" containerName="extract-content" Mar 19 18:16:00 crc kubenswrapper[4918]: E0319 18:16:00.148899 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0" containerName="copy" Mar 19 18:16:00 crc kubenswrapper[4918]: I0319 18:16:00.148905 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0" containerName="copy" Mar 19 18:16:00 crc kubenswrapper[4918]: E0319 18:16:00.148925 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4" containerName="registry-server" Mar 19 
18:16:00 crc kubenswrapper[4918]: I0319 18:16:00.148931 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4" containerName="registry-server" Mar 19 18:16:00 crc kubenswrapper[4918]: E0319 18:16:00.148953 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0" containerName="gather" Mar 19 18:16:00 crc kubenswrapper[4918]: I0319 18:16:00.148958 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0" containerName="gather" Mar 19 18:16:00 crc kubenswrapper[4918]: E0319 18:16:00.148974 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4" containerName="extract-utilities" Mar 19 18:16:00 crc kubenswrapper[4918]: I0319 18:16:00.148980 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4" containerName="extract-utilities" Mar 19 18:16:00 crc kubenswrapper[4918]: I0319 18:16:00.149167 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0" containerName="copy" Mar 19 18:16:00 crc kubenswrapper[4918]: I0319 18:16:00.149186 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="95eb6c72-d7b5-4489-b2e4-f0abd9ee75f4" containerName="registry-server" Mar 19 18:16:00 crc kubenswrapper[4918]: I0319 18:16:00.149193 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="603f0a4b-84e8-4b51-8a6d-34ec8ef73ab0" containerName="gather" Mar 19 18:16:00 crc kubenswrapper[4918]: I0319 18:16:00.150024 4918 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565736-tl6m2" Mar 19 18:16:00 crc kubenswrapper[4918]: I0319 18:16:00.152014 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 18:16:00 crc kubenswrapper[4918]: I0319 18:16:00.152246 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n" Mar 19 18:16:00 crc kubenswrapper[4918]: I0319 18:16:00.155213 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 18:16:00 crc kubenswrapper[4918]: I0319 18:16:00.156503 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565736-tl6m2"] Mar 19 18:16:00 crc kubenswrapper[4918]: I0319 18:16:00.265515 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qknhd\" (UniqueName: \"kubernetes.io/projected/543f2d7d-b515-4523-bb6e-fd31cdae4302-kube-api-access-qknhd\") pod \"auto-csr-approver-29565736-tl6m2\" (UID: \"543f2d7d-b515-4523-bb6e-fd31cdae4302\") " pod="openshift-infra/auto-csr-approver-29565736-tl6m2" Mar 19 18:16:00 crc kubenswrapper[4918]: I0319 18:16:00.367959 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qknhd\" (UniqueName: \"kubernetes.io/projected/543f2d7d-b515-4523-bb6e-fd31cdae4302-kube-api-access-qknhd\") pod \"auto-csr-approver-29565736-tl6m2\" (UID: \"543f2d7d-b515-4523-bb6e-fd31cdae4302\") " pod="openshift-infra/auto-csr-approver-29565736-tl6m2" Mar 19 18:16:00 crc kubenswrapper[4918]: I0319 18:16:00.386716 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qknhd\" (UniqueName: \"kubernetes.io/projected/543f2d7d-b515-4523-bb6e-fd31cdae4302-kube-api-access-qknhd\") pod \"auto-csr-approver-29565736-tl6m2\" (UID: \"543f2d7d-b515-4523-bb6e-fd31cdae4302\") " 
pod="openshift-infra/auto-csr-approver-29565736-tl6m2" Mar 19 18:16:00 crc kubenswrapper[4918]: I0319 18:16:00.466386 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565736-tl6m2" Mar 19 18:16:01 crc kubenswrapper[4918]: I0319 18:16:01.262040 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565736-tl6m2"] Mar 19 18:16:01 crc kubenswrapper[4918]: I0319 18:16:01.660877 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565736-tl6m2" event={"ID":"543f2d7d-b515-4523-bb6e-fd31cdae4302","Type":"ContainerStarted","Data":"d01654e5dce6f987c2e1c6b6b9d9b22bc48959cd5bd93f08d7dd724c4340b68e"} Mar 19 18:16:03 crc kubenswrapper[4918]: I0319 18:16:03.689734 4918 generic.go:334] "Generic (PLEG): container finished" podID="543f2d7d-b515-4523-bb6e-fd31cdae4302" containerID="0b151869109d4ba5436115937d6891669553489560a0bcb0789f023f28be1241" exitCode=0 Mar 19 18:16:03 crc kubenswrapper[4918]: I0319 18:16:03.689833 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565736-tl6m2" event={"ID":"543f2d7d-b515-4523-bb6e-fd31cdae4302","Type":"ContainerDied","Data":"0b151869109d4ba5436115937d6891669553489560a0bcb0789f023f28be1241"} Mar 19 18:16:05 crc kubenswrapper[4918]: I0319 18:16:05.875841 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565736-tl6m2" Mar 19 18:16:05 crc kubenswrapper[4918]: I0319 18:16:05.987700 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qknhd\" (UniqueName: \"kubernetes.io/projected/543f2d7d-b515-4523-bb6e-fd31cdae4302-kube-api-access-qknhd\") pod \"543f2d7d-b515-4523-bb6e-fd31cdae4302\" (UID: \"543f2d7d-b515-4523-bb6e-fd31cdae4302\") " Mar 19 18:16:05 crc kubenswrapper[4918]: I0319 18:16:05.992651 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/543f2d7d-b515-4523-bb6e-fd31cdae4302-kube-api-access-qknhd" (OuterVolumeSpecName: "kube-api-access-qknhd") pod "543f2d7d-b515-4523-bb6e-fd31cdae4302" (UID: "543f2d7d-b515-4523-bb6e-fd31cdae4302"). InnerVolumeSpecName "kube-api-access-qknhd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:16:06 crc kubenswrapper[4918]: I0319 18:16:06.089883 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qknhd\" (UniqueName: \"kubernetes.io/projected/543f2d7d-b515-4523-bb6e-fd31cdae4302-kube-api-access-qknhd\") on node \"crc\" DevicePath \"\"" Mar 19 18:16:06 crc kubenswrapper[4918]: I0319 18:16:06.714249 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565736-tl6m2" event={"ID":"543f2d7d-b515-4523-bb6e-fd31cdae4302","Type":"ContainerDied","Data":"d01654e5dce6f987c2e1c6b6b9d9b22bc48959cd5bd93f08d7dd724c4340b68e"} Mar 19 18:16:06 crc kubenswrapper[4918]: I0319 18:16:06.714565 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d01654e5dce6f987c2e1c6b6b9d9b22bc48959cd5bd93f08d7dd724c4340b68e" Mar 19 18:16:06 crc kubenswrapper[4918]: I0319 18:16:06.714527 4918 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565736-tl6m2" Mar 19 18:16:06 crc kubenswrapper[4918]: I0319 18:16:06.950904 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565730-7ds7j"] Mar 19 18:16:06 crc kubenswrapper[4918]: I0319 18:16:06.982676 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565730-7ds7j"] Mar 19 18:16:08 crc kubenswrapper[4918]: I0319 18:16:08.592399 4918 scope.go:117] "RemoveContainer" containerID="0419f168b1b1c21841f70ce6ec71f90116f8855c4603a3f311bfc5e81455bf33" Mar 19 18:16:08 crc kubenswrapper[4918]: E0319 18:16:08.592952 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:16:08 crc kubenswrapper[4918]: I0319 18:16:08.599705 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c006477-ed74-4a21-b0c3-8ebebdb298bf" path="/var/lib/kubelet/pods/8c006477-ed74-4a21-b0c3-8ebebdb298bf/volumes" Mar 19 18:16:17 crc kubenswrapper[4918]: I0319 18:16:17.545195 4918 scope.go:117] "RemoveContainer" containerID="9aec64b73fdfe8189abe6c8aff009369bfe2a637c7112e669d7307490f43ad1a" Mar 19 18:16:19 crc kubenswrapper[4918]: I0319 18:16:19.586563 4918 scope.go:117] "RemoveContainer" containerID="0419f168b1b1c21841f70ce6ec71f90116f8855c4603a3f311bfc5e81455bf33" Mar 19 18:16:19 crc kubenswrapper[4918]: E0319 18:16:19.587342 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:16:34 crc kubenswrapper[4918]: I0319 18:16:34.586415 4918 scope.go:117] "RemoveContainer" containerID="0419f168b1b1c21841f70ce6ec71f90116f8855c4603a3f311bfc5e81455bf33" Mar 19 18:16:34 crc kubenswrapper[4918]: E0319 18:16:34.587089 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:16:49 crc kubenswrapper[4918]: I0319 18:16:49.586636 4918 scope.go:117] "RemoveContainer" containerID="0419f168b1b1c21841f70ce6ec71f90116f8855c4603a3f311bfc5e81455bf33" Mar 19 18:16:49 crc kubenswrapper[4918]: E0319 18:16:49.587394 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:17:03 crc kubenswrapper[4918]: I0319 18:17:03.586965 4918 scope.go:117] "RemoveContainer" containerID="0419f168b1b1c21841f70ce6ec71f90116f8855c4603a3f311bfc5e81455bf33" Mar 19 18:17:03 crc kubenswrapper[4918]: E0319 18:17:03.587568 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:17:18 crc kubenswrapper[4918]: I0319 18:17:18.593062 4918 scope.go:117] "RemoveContainer" containerID="0419f168b1b1c21841f70ce6ec71f90116f8855c4603a3f311bfc5e81455bf33" Mar 19 18:17:18 crc kubenswrapper[4918]: E0319 18:17:18.593783 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:17:30 crc kubenswrapper[4918]: I0319 18:17:30.586773 4918 scope.go:117] "RemoveContainer" containerID="0419f168b1b1c21841f70ce6ec71f90116f8855c4603a3f311bfc5e81455bf33" Mar 19 18:17:30 crc kubenswrapper[4918]: E0319 18:17:30.587506 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9" Mar 19 18:17:35 crc kubenswrapper[4918]: I0319 18:17:35.625922 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-62rhc"] Mar 19 18:17:35 crc kubenswrapper[4918]: E0319 18:17:35.626732 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="543f2d7d-b515-4523-bb6e-fd31cdae4302" containerName="oc" Mar 19 18:17:35 crc kubenswrapper[4918]: 
I0319 18:17:35.626744 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="543f2d7d-b515-4523-bb6e-fd31cdae4302" containerName="oc" Mar 19 18:17:35 crc kubenswrapper[4918]: I0319 18:17:35.626923 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="543f2d7d-b515-4523-bb6e-fd31cdae4302" containerName="oc" Mar 19 18:17:35 crc kubenswrapper[4918]: I0319 18:17:35.628565 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-62rhc" Mar 19 18:17:35 crc kubenswrapper[4918]: I0319 18:17:35.653879 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-62rhc"] Mar 19 18:17:35 crc kubenswrapper[4918]: I0319 18:17:35.688966 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf504f39-8161-45e1-a636-62f63d933794-utilities\") pod \"redhat-marketplace-62rhc\" (UID: \"cf504f39-8161-45e1-a636-62f63d933794\") " pod="openshift-marketplace/redhat-marketplace-62rhc" Mar 19 18:17:35 crc kubenswrapper[4918]: I0319 18:17:35.689141 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf504f39-8161-45e1-a636-62f63d933794-catalog-content\") pod \"redhat-marketplace-62rhc\" (UID: \"cf504f39-8161-45e1-a636-62f63d933794\") " pod="openshift-marketplace/redhat-marketplace-62rhc" Mar 19 18:17:35 crc kubenswrapper[4918]: I0319 18:17:35.689191 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn6db\" (UniqueName: \"kubernetes.io/projected/cf504f39-8161-45e1-a636-62f63d933794-kube-api-access-sn6db\") pod \"redhat-marketplace-62rhc\" (UID: \"cf504f39-8161-45e1-a636-62f63d933794\") " pod="openshift-marketplace/redhat-marketplace-62rhc" Mar 19 18:17:35 crc kubenswrapper[4918]: I0319 18:17:35.790585 
4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf504f39-8161-45e1-a636-62f63d933794-utilities\") pod \"redhat-marketplace-62rhc\" (UID: \"cf504f39-8161-45e1-a636-62f63d933794\") " pod="openshift-marketplace/redhat-marketplace-62rhc" Mar 19 18:17:35 crc kubenswrapper[4918]: I0319 18:17:35.791075 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf504f39-8161-45e1-a636-62f63d933794-utilities\") pod \"redhat-marketplace-62rhc\" (UID: \"cf504f39-8161-45e1-a636-62f63d933794\") " pod="openshift-marketplace/redhat-marketplace-62rhc" Mar 19 18:17:35 crc kubenswrapper[4918]: I0319 18:17:35.791317 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf504f39-8161-45e1-a636-62f63d933794-catalog-content\") pod \"redhat-marketplace-62rhc\" (UID: \"cf504f39-8161-45e1-a636-62f63d933794\") " pod="openshift-marketplace/redhat-marketplace-62rhc" Mar 19 18:17:35 crc kubenswrapper[4918]: I0319 18:17:35.791389 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn6db\" (UniqueName: \"kubernetes.io/projected/cf504f39-8161-45e1-a636-62f63d933794-kube-api-access-sn6db\") pod \"redhat-marketplace-62rhc\" (UID: \"cf504f39-8161-45e1-a636-62f63d933794\") " pod="openshift-marketplace/redhat-marketplace-62rhc" Mar 19 18:17:35 crc kubenswrapper[4918]: I0319 18:17:35.791855 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf504f39-8161-45e1-a636-62f63d933794-catalog-content\") pod \"redhat-marketplace-62rhc\" (UID: \"cf504f39-8161-45e1-a636-62f63d933794\") " pod="openshift-marketplace/redhat-marketplace-62rhc" Mar 19 18:17:35 crc kubenswrapper[4918]: I0319 18:17:35.809500 4918 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-sn6db\" (UniqueName: \"kubernetes.io/projected/cf504f39-8161-45e1-a636-62f63d933794-kube-api-access-sn6db\") pod \"redhat-marketplace-62rhc\" (UID: \"cf504f39-8161-45e1-a636-62f63d933794\") " pod="openshift-marketplace/redhat-marketplace-62rhc"
Mar 19 18:17:35 crc kubenswrapper[4918]: I0319 18:17:35.945960 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-62rhc"
Mar 19 18:17:36 crc kubenswrapper[4918]: I0319 18:17:36.681497 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-62rhc"]
Mar 19 18:17:37 crc kubenswrapper[4918]: I0319 18:17:37.534103 4918 generic.go:334] "Generic (PLEG): container finished" podID="cf504f39-8161-45e1-a636-62f63d933794" containerID="da49c35a0cce85bd41d307b341d220bb7cf7434feedf91c7ea8c1fc1606e8446" exitCode=0
Mar 19 18:17:37 crc kubenswrapper[4918]: I0319 18:17:37.534200 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62rhc" event={"ID":"cf504f39-8161-45e1-a636-62f63d933794","Type":"ContainerDied","Data":"da49c35a0cce85bd41d307b341d220bb7cf7434feedf91c7ea8c1fc1606e8446"}
Mar 19 18:17:37 crc kubenswrapper[4918]: I0319 18:17:37.534662 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62rhc" event={"ID":"cf504f39-8161-45e1-a636-62f63d933794","Type":"ContainerStarted","Data":"48d75fd2f0fed2b1bbb03cbc9b4ade439c42c4b8016d616c7753d8bac91332dc"}
Mar 19 18:17:37 crc kubenswrapper[4918]: I0319 18:17:37.535565 4918 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 19 18:17:38 crc kubenswrapper[4918]: I0319 18:17:38.549596 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62rhc" event={"ID":"cf504f39-8161-45e1-a636-62f63d933794","Type":"ContainerStarted","Data":"a113a17cba86d3643af8699bad0fa2d4b6df311880c0866dd945128e51514e6a"}
Mar 19 18:17:39 crc kubenswrapper[4918]: I0319 18:17:39.575988 4918 generic.go:334] "Generic (PLEG): container finished" podID="cf504f39-8161-45e1-a636-62f63d933794" containerID="a113a17cba86d3643af8699bad0fa2d4b6df311880c0866dd945128e51514e6a" exitCode=0
Mar 19 18:17:39 crc kubenswrapper[4918]: I0319 18:17:39.576091 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62rhc" event={"ID":"cf504f39-8161-45e1-a636-62f63d933794","Type":"ContainerDied","Data":"a113a17cba86d3643af8699bad0fa2d4b6df311880c0866dd945128e51514e6a"}
Mar 19 18:17:40 crc kubenswrapper[4918]: I0319 18:17:40.604828 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62rhc" event={"ID":"cf504f39-8161-45e1-a636-62f63d933794","Type":"ContainerStarted","Data":"74f975f53351ea41d0b36c4abadc25653533e7dae83df0957ba7cd9a29504b8f"}
Mar 19 18:17:40 crc kubenswrapper[4918]: I0319 18:17:40.609680 4918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-62rhc" podStartSLOduration=3.19029623 podStartE2EDuration="5.609661085s" podCreationTimestamp="2026-03-19 18:17:35 +0000 UTC" firstStartedPulling="2026-03-19 18:17:37.535337477 +0000 UTC m=+5869.657536725" lastFinishedPulling="2026-03-19 18:17:39.954702332 +0000 UTC m=+5872.076901580" observedRunningTime="2026-03-19 18:17:40.606224941 +0000 UTC m=+5872.728424189" watchObservedRunningTime="2026-03-19 18:17:40.609661085 +0000 UTC m=+5872.731860333"
Mar 19 18:17:41 crc kubenswrapper[4918]: I0319 18:17:41.586671 4918 scope.go:117] "RemoveContainer" containerID="0419f168b1b1c21841f70ce6ec71f90116f8855c4603a3f311bfc5e81455bf33"
Mar 19 18:17:41 crc kubenswrapper[4918]: E0319 18:17:41.587546 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9"
Mar 19 18:17:45 crc kubenswrapper[4918]: I0319 18:17:45.946487 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-62rhc"
Mar 19 18:17:45 crc kubenswrapper[4918]: I0319 18:17:45.947124 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-62rhc"
Mar 19 18:17:46 crc kubenswrapper[4918]: I0319 18:17:46.014907 4918 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-62rhc"
Mar 19 18:17:46 crc kubenswrapper[4918]: I0319 18:17:46.691070 4918 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-62rhc"
Mar 19 18:17:46 crc kubenswrapper[4918]: I0319 18:17:46.752448 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-62rhc"]
Mar 19 18:17:48 crc kubenswrapper[4918]: I0319 18:17:48.656841 4918 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-62rhc" podUID="cf504f39-8161-45e1-a636-62f63d933794" containerName="registry-server" containerID="cri-o://74f975f53351ea41d0b36c4abadc25653533e7dae83df0957ba7cd9a29504b8f" gracePeriod=2
Mar 19 18:17:49 crc kubenswrapper[4918]: I0319 18:17:49.680393 4918 generic.go:334] "Generic (PLEG): container finished" podID="cf504f39-8161-45e1-a636-62f63d933794" containerID="74f975f53351ea41d0b36c4abadc25653533e7dae83df0957ba7cd9a29504b8f" exitCode=0
Mar 19 18:17:49 crc kubenswrapper[4918]: I0319 18:17:49.680472 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62rhc" event={"ID":"cf504f39-8161-45e1-a636-62f63d933794","Type":"ContainerDied","Data":"74f975f53351ea41d0b36c4abadc25653533e7dae83df0957ba7cd9a29504b8f"}
Mar 19 18:17:50 crc kubenswrapper[4918]: I0319 18:17:50.217585 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-62rhc"
Mar 19 18:17:50 crc kubenswrapper[4918]: I0319 18:17:50.298094 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf504f39-8161-45e1-a636-62f63d933794-utilities\") pod \"cf504f39-8161-45e1-a636-62f63d933794\" (UID: \"cf504f39-8161-45e1-a636-62f63d933794\") "
Mar 19 18:17:50 crc kubenswrapper[4918]: I0319 18:17:50.298510 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf504f39-8161-45e1-a636-62f63d933794-catalog-content\") pod \"cf504f39-8161-45e1-a636-62f63d933794\" (UID: \"cf504f39-8161-45e1-a636-62f63d933794\") "
Mar 19 18:17:50 crc kubenswrapper[4918]: I0319 18:17:50.298683 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn6db\" (UniqueName: \"kubernetes.io/projected/cf504f39-8161-45e1-a636-62f63d933794-kube-api-access-sn6db\") pod \"cf504f39-8161-45e1-a636-62f63d933794\" (UID: \"cf504f39-8161-45e1-a636-62f63d933794\") "
Mar 19 18:17:50 crc kubenswrapper[4918]: I0319 18:17:50.299107 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf504f39-8161-45e1-a636-62f63d933794-utilities" (OuterVolumeSpecName: "utilities") pod "cf504f39-8161-45e1-a636-62f63d933794" (UID: "cf504f39-8161-45e1-a636-62f63d933794"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 18:17:50 crc kubenswrapper[4918]: I0319 18:17:50.305886 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf504f39-8161-45e1-a636-62f63d933794-kube-api-access-sn6db" (OuterVolumeSpecName: "kube-api-access-sn6db") pod "cf504f39-8161-45e1-a636-62f63d933794" (UID: "cf504f39-8161-45e1-a636-62f63d933794"). InnerVolumeSpecName "kube-api-access-sn6db". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 18:17:50 crc kubenswrapper[4918]: I0319 18:17:50.322965 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf504f39-8161-45e1-a636-62f63d933794-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf504f39-8161-45e1-a636-62f63d933794" (UID: "cf504f39-8161-45e1-a636-62f63d933794"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 18:17:50 crc kubenswrapper[4918]: I0319 18:17:50.400842 4918 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf504f39-8161-45e1-a636-62f63d933794-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 18:17:50 crc kubenswrapper[4918]: I0319 18:17:50.400889 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn6db\" (UniqueName: \"kubernetes.io/projected/cf504f39-8161-45e1-a636-62f63d933794-kube-api-access-sn6db\") on node \"crc\" DevicePath \"\""
Mar 19 18:17:50 crc kubenswrapper[4918]: I0319 18:17:50.400902 4918 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf504f39-8161-45e1-a636-62f63d933794-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 18:17:50 crc kubenswrapper[4918]: I0319 18:17:50.695597 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62rhc" event={"ID":"cf504f39-8161-45e1-a636-62f63d933794","Type":"ContainerDied","Data":"48d75fd2f0fed2b1bbb03cbc9b4ade439c42c4b8016d616c7753d8bac91332dc"}
Mar 19 18:17:50 crc kubenswrapper[4918]: I0319 18:17:50.695676 4918 scope.go:117] "RemoveContainer" containerID="74f975f53351ea41d0b36c4abadc25653533e7dae83df0957ba7cd9a29504b8f"
Mar 19 18:17:50 crc kubenswrapper[4918]: I0319 18:17:50.695679 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-62rhc"
Mar 19 18:17:50 crc kubenswrapper[4918]: I0319 18:17:50.721007 4918 scope.go:117] "RemoveContainer" containerID="a113a17cba86d3643af8699bad0fa2d4b6df311880c0866dd945128e51514e6a"
Mar 19 18:17:50 crc kubenswrapper[4918]: I0319 18:17:50.726578 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-62rhc"]
Mar 19 18:17:50 crc kubenswrapper[4918]: I0319 18:17:50.741064 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-62rhc"]
Mar 19 18:17:50 crc kubenswrapper[4918]: I0319 18:17:50.749725 4918 scope.go:117] "RemoveContainer" containerID="da49c35a0cce85bd41d307b341d220bb7cf7434feedf91c7ea8c1fc1606e8446"
Mar 19 18:17:52 crc kubenswrapper[4918]: I0319 18:17:52.611025 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf504f39-8161-45e1-a636-62f63d933794" path="/var/lib/kubelet/pods/cf504f39-8161-45e1-a636-62f63d933794/volumes"
Mar 19 18:17:55 crc kubenswrapper[4918]: I0319 18:17:55.587108 4918 scope.go:117] "RemoveContainer" containerID="0419f168b1b1c21841f70ce6ec71f90116f8855c4603a3f311bfc5e81455bf33"
Mar 19 18:17:55 crc kubenswrapper[4918]: E0319 18:17:55.588132 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9"
Mar 19 18:18:00 crc kubenswrapper[4918]: I0319 18:18:00.156630 4918 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565738-xwfxr"]
Mar 19 18:18:00 crc kubenswrapper[4918]: E0319 18:18:00.157709 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf504f39-8161-45e1-a636-62f63d933794" containerName="extract-content"
Mar 19 18:18:00 crc kubenswrapper[4918]: I0319 18:18:00.157731 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf504f39-8161-45e1-a636-62f63d933794" containerName="extract-content"
Mar 19 18:18:00 crc kubenswrapper[4918]: E0319 18:18:00.157773 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf504f39-8161-45e1-a636-62f63d933794" containerName="extract-utilities"
Mar 19 18:18:00 crc kubenswrapper[4918]: I0319 18:18:00.157786 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf504f39-8161-45e1-a636-62f63d933794" containerName="extract-utilities"
Mar 19 18:18:00 crc kubenswrapper[4918]: E0319 18:18:00.157811 4918 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf504f39-8161-45e1-a636-62f63d933794" containerName="registry-server"
Mar 19 18:18:00 crc kubenswrapper[4918]: I0319 18:18:00.157823 4918 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf504f39-8161-45e1-a636-62f63d933794" containerName="registry-server"
Mar 19 18:18:00 crc kubenswrapper[4918]: I0319 18:18:00.158099 4918 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf504f39-8161-45e1-a636-62f63d933794" containerName="registry-server"
Mar 19 18:18:00 crc kubenswrapper[4918]: I0319 18:18:00.159061 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565738-xwfxr"
Mar 19 18:18:00 crc kubenswrapper[4918]: I0319 18:18:00.162213 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 18:18:00 crc kubenswrapper[4918]: I0319 18:18:00.162567 4918 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f46n"
Mar 19 18:18:00 crc kubenswrapper[4918]: I0319 18:18:00.163070 4918 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 18:18:00 crc kubenswrapper[4918]: I0319 18:18:00.188681 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565738-xwfxr"]
Mar 19 18:18:00 crc kubenswrapper[4918]: I0319 18:18:00.318165 4918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2h5q\" (UniqueName: \"kubernetes.io/projected/bd19d510-b6a9-4da6-87e2-3208c0aab945-kube-api-access-w2h5q\") pod \"auto-csr-approver-29565738-xwfxr\" (UID: \"bd19d510-b6a9-4da6-87e2-3208c0aab945\") " pod="openshift-infra/auto-csr-approver-29565738-xwfxr"
Mar 19 18:18:00 crc kubenswrapper[4918]: I0319 18:18:00.419313 4918 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2h5q\" (UniqueName: \"kubernetes.io/projected/bd19d510-b6a9-4da6-87e2-3208c0aab945-kube-api-access-w2h5q\") pod \"auto-csr-approver-29565738-xwfxr\" (UID: \"bd19d510-b6a9-4da6-87e2-3208c0aab945\") " pod="openshift-infra/auto-csr-approver-29565738-xwfxr"
Mar 19 18:18:00 crc kubenswrapper[4918]: I0319 18:18:00.438799 4918 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2h5q\" (UniqueName: \"kubernetes.io/projected/bd19d510-b6a9-4da6-87e2-3208c0aab945-kube-api-access-w2h5q\") pod \"auto-csr-approver-29565738-xwfxr\" (UID: \"bd19d510-b6a9-4da6-87e2-3208c0aab945\") " pod="openshift-infra/auto-csr-approver-29565738-xwfxr"
Mar 19 18:18:00 crc kubenswrapper[4918]: I0319 18:18:00.486840 4918 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565738-xwfxr"
Mar 19 18:18:01 crc kubenswrapper[4918]: I0319 18:18:01.024170 4918 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565738-xwfxr"]
Mar 19 18:18:01 crc kubenswrapper[4918]: I0319 18:18:01.836771 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565738-xwfxr" event={"ID":"bd19d510-b6a9-4da6-87e2-3208c0aab945","Type":"ContainerStarted","Data":"b2564f1d745a6445ff42cb8fbe09296186d7ada1d9b67c5349aeb3df51b91afe"}
Mar 19 18:18:02 crc kubenswrapper[4918]: I0319 18:18:02.848932 4918 generic.go:334] "Generic (PLEG): container finished" podID="bd19d510-b6a9-4da6-87e2-3208c0aab945" containerID="6b8e10c65fc28e57d29113e246ea8d439e8382af2d5e9d5fb3467254c51acfc7" exitCode=0
Mar 19 18:18:02 crc kubenswrapper[4918]: I0319 18:18:02.849023 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565738-xwfxr" event={"ID":"bd19d510-b6a9-4da6-87e2-3208c0aab945","Type":"ContainerDied","Data":"6b8e10c65fc28e57d29113e246ea8d439e8382af2d5e9d5fb3467254c51acfc7"}
Mar 19 18:18:04 crc kubenswrapper[4918]: I0319 18:18:04.801504 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565738-xwfxr"
Mar 19 18:18:04 crc kubenswrapper[4918]: I0319 18:18:04.873492 4918 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565738-xwfxr" event={"ID":"bd19d510-b6a9-4da6-87e2-3208c0aab945","Type":"ContainerDied","Data":"b2564f1d745a6445ff42cb8fbe09296186d7ada1d9b67c5349aeb3df51b91afe"}
Mar 19 18:18:04 crc kubenswrapper[4918]: I0319 18:18:04.873860 4918 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2564f1d745a6445ff42cb8fbe09296186d7ada1d9b67c5349aeb3df51b91afe"
Mar 19 18:18:04 crc kubenswrapper[4918]: I0319 18:18:04.873619 4918 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565738-xwfxr"
Mar 19 18:18:04 crc kubenswrapper[4918]: I0319 18:18:04.911170 4918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2h5q\" (UniqueName: \"kubernetes.io/projected/bd19d510-b6a9-4da6-87e2-3208c0aab945-kube-api-access-w2h5q\") pod \"bd19d510-b6a9-4da6-87e2-3208c0aab945\" (UID: \"bd19d510-b6a9-4da6-87e2-3208c0aab945\") "
Mar 19 18:18:04 crc kubenswrapper[4918]: I0319 18:18:04.919885 4918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd19d510-b6a9-4da6-87e2-3208c0aab945-kube-api-access-w2h5q" (OuterVolumeSpecName: "kube-api-access-w2h5q") pod "bd19d510-b6a9-4da6-87e2-3208c0aab945" (UID: "bd19d510-b6a9-4da6-87e2-3208c0aab945"). InnerVolumeSpecName "kube-api-access-w2h5q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 18:18:05 crc kubenswrapper[4918]: I0319 18:18:05.014024 4918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2h5q\" (UniqueName: \"kubernetes.io/projected/bd19d510-b6a9-4da6-87e2-3208c0aab945-kube-api-access-w2h5q\") on node \"crc\" DevicePath \"\""
Mar 19 18:18:05 crc kubenswrapper[4918]: I0319 18:18:05.888068 4918 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565732-np728"]
Mar 19 18:18:05 crc kubenswrapper[4918]: I0319 18:18:05.901688 4918 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565732-np728"]
Mar 19 18:18:06 crc kubenswrapper[4918]: I0319 18:18:06.588728 4918 scope.go:117] "RemoveContainer" containerID="0419f168b1b1c21841f70ce6ec71f90116f8855c4603a3f311bfc5e81455bf33"
Mar 19 18:18:06 crc kubenswrapper[4918]: E0319 18:18:06.589035 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9"
Mar 19 18:18:06 crc kubenswrapper[4918]: I0319 18:18:06.598682 4918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0b50232-b554-4dd1-b6a5-6fd9362afeb8" path="/var/lib/kubelet/pods/d0b50232-b554-4dd1-b6a5-6fd9362afeb8/volumes"
Mar 19 18:18:17 crc kubenswrapper[4918]: I0319 18:18:17.723240 4918 scope.go:117] "RemoveContainer" containerID="ca5c3004c3bf24e2b95fabe7ff0ac085df33e943254d23b52a29c311ac87380c"
Mar 19 18:18:18 crc kubenswrapper[4918]: I0319 18:18:18.593435 4918 scope.go:117] "RemoveContainer" containerID="0419f168b1b1c21841f70ce6ec71f90116f8855c4603a3f311bfc5e81455bf33"
Mar 19 18:18:18 crc kubenswrapper[4918]: E0319 18:18:18.593884 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9"
Mar 19 18:18:31 crc kubenswrapper[4918]: I0319 18:18:31.586772 4918 scope.go:117] "RemoveContainer" containerID="0419f168b1b1c21841f70ce6ec71f90116f8855c4603a3f311bfc5e81455bf33"
Mar 19 18:18:31 crc kubenswrapper[4918]: E0319 18:18:31.587642 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9"
Mar 19 18:18:43 crc kubenswrapper[4918]: I0319 18:18:43.586561 4918 scope.go:117] "RemoveContainer" containerID="0419f168b1b1c21841f70ce6ec71f90116f8855c4603a3f311bfc5e81455bf33"
Mar 19 18:18:43 crc kubenswrapper[4918]: E0319 18:18:43.587278 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9"
Mar 19 18:18:57 crc kubenswrapper[4918]: I0319 18:18:57.587133 4918 scope.go:117] "RemoveContainer" containerID="0419f168b1b1c21841f70ce6ec71f90116f8855c4603a3f311bfc5e81455bf33"
Mar 19 18:18:57 crc kubenswrapper[4918]: E0319 18:18:57.588034 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9"
Mar 19 18:19:10 crc kubenswrapper[4918]: I0319 18:19:10.586811 4918 scope.go:117] "RemoveContainer" containerID="0419f168b1b1c21841f70ce6ec71f90116f8855c4603a3f311bfc5e81455bf33"
Mar 19 18:19:10 crc kubenswrapper[4918]: E0319 18:19:10.587557 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9"
Mar 19 18:19:22 crc kubenswrapper[4918]: I0319 18:19:22.587574 4918 scope.go:117] "RemoveContainer" containerID="0419f168b1b1c21841f70ce6ec71f90116f8855c4603a3f311bfc5e81455bf33"
Mar 19 18:19:22 crc kubenswrapper[4918]: E0319 18:19:22.588273 4918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-d4bjv_openshift-machine-config-operator(faff5e41-8f94-4bfd-9730-38955ab099d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-d4bjv" podUID="faff5e41-8f94-4bfd-9730-38955ab099d9"